Overview

Resources are one of the key building blocks of all pipelines. They are entities that store and exchange information across steps and pipelines.

Resources are versioned and each version is immutable. They are also global: depending on the scope defined for the pipeline source, they can be available across pipelines, which enables you to connect multiple pipelines together to create a pipeline of pipelines.


Resources are pointers and they can be used to reference:

  • A repository in your source code control system, such as GitHub
  • A file on a remote file server
  • A Docker image
  • A release bundle for JFrog Distribution
  • A cluster for container orchestration


Using resources in your pipeline involves two main steps (see the sketch after this list):

  1. In the pipeline's YAML, in the resources section, define all the resources required for running the pipeline.
    After a resource is defined, it is available for use in the pipeline, based on the scope defined for the pipeline source.
  2. In the steps section, as per your workflow, add these resources as input and/or output.
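
A minimal sketch of both steps is shown below. The resource name (app_repo), the Git integration name (my_github), and the repository path are hypothetical placeholders, and the step type and configuration will vary with your workflow.

resources:
  - name: app_repo                  # hypothetical GitRepo resource
    type: GitRepo
    configuration:
      gitProvider: my_github        # assumed Git integration name
      path: myorg/my-app            # assumed repository path
      branches:
        include: master

pipelines:
  - name: my_pipeline
    steps:
      - name: build
        type: Bash
        configuration:
          inputResources:
            - name: app_repo        # the resource defined above, used as a step input
        execution:
          onExecute:
            - echo "Building from $res_app_repo_path"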


Resource Scope and Visibility

Pipeline resources have the same scope as the pipeline source where they are defined.  

When defining resources, the recommended approach is to define them in a pipeline source that is shared across environments in the same Project. This ensures that the resources are available across environments in a project. For more information, see Creating a Project - Pipelines Resources.

Currently, pipeline resources cannot be shared across Projects.


Resource Types

Resources play different roles based on where and how they are used:

  • Triggering Resource: Whenever there is a change in this type of resource, it triggers the dependent step. Examples: CronTrigger, GitRepoIncomingWebhook.
  • Generated Resource: These are resources that are generated by a step and can trigger successive downstream steps. Examples: Aql, BuildInfo, FileSpec, Image.
  • Webhook Resource: IncomingWebhook and OutgoingWebhook are resources that can be used to integrate a pipeline with third-party services.

Pipelines supports several types of resources, each serving a specific purpose:

  • Aql: An Aql resource specifies an Artifactory query using Artifactory Query Language.
  • BuildInfo: A BuildInfo resource is the metadata associated with a build in Artifactory.
  • CronTrigger: A CronTrigger resource is used as an input to a step to trigger execution of the step at a scheduled time.
  • DistributionRule: A DistributionRule resource is the set of destination rules that can be applied when distributing a release bundle using JFrog Distribution.
  • FileSpec: A FileSpec resource specifies a File Spec, which provides the details of files to upload to or download from Artifactory.
  • GitRepo: A GitRepo resource is used to connect JFrog Pipelines to a source control repository. It creates a webhook to the repository so that future commits automatically create a new version with the webhook payload.
  • HelmChart: A HelmChart resource maps to a specific chart in an Artifactory Helm repository.
  • Image: An Image resource is used to add a reference to a Docker image to your pipeline.
  • IncomingWebhook: An IncomingWebhook resource can trigger one or more jobs in your pipeline whenever the associated URL is called using the HTTP POST method.
  • OutgoingWebhook: An OutgoingWebhook resource uses HTTP to send information from a step to an external API endpoint through an Outgoing Webhook integration.
  • PropertyBag: A PropertyBag resource is used to pass information from one pipeline to another and to provide environment variables to a step in the form of a resource.
  • ReleaseBundle: A ReleaseBundle resource specifies a set of artifacts in Artifactory that are distributed to Artifactory Edge nodes as a JFrog Distribution release bundle.
  • RemoteFile: A RemoteFile resource enables using a file on a remote file server.
  • VmCluster: A VmCluster resource represents a set of virtual machines. It is mainly used to deploy services/apps to the specified clusters and, in some cases, to run maintenance activities on the clusters as a whole.


Resource Versions

One of the key features of Pipelines resources is versioning. Resource versions are useful for controlling the flow of pipelines and tracking the changes that a resource undergoes over time. You can trigger runs using specific versions or skip steps in the same run when input resources are not updated.

Every resource starts with an initial version, which is updated every time the resource changes. For example, the version of an Image resource used in a pipeline is updated whenever a new tag is pushed to a Docker image. Pipelines tracks these changes by updating the resource version, which is based on the metadata received for that particular resource. 

A new resource version is created when:

  • The resource definition is updated in the pipelines YAML file.
  • An output resource is updated during a run.
  • There is an external event, such as pushing a commit to a git repository.

Each version of a resource is immutable and returns the same result every time a specific version is used for a run. By default, steps always run using the latest version of an input resource. However, because a resource is versioned and holds the entire history of all the available versions, a run can be customized to use a specific version of a resource. For more information, see Triggering a Run with Custom Parameters.

Resource versions have the following default behavior:

  • Latest version: When a run is triggered, the latest version of the input resource is used in that run.
  • Dependent step in the same pipeline: During a run, even when an input resource is not updated (no new version), the dependent step is triggered. While this is the default behavior, it can be changed by setting the newVersionOnly tag as true.

    This is applicable only for generated resources, which are resources that connect two steps of a pipeline. 

  • Dependent step in another pipeline: During a run, the dependent pipeline is triggered only when there is a new version of the resource that connects the pipelines.

Example - Resource Versions

This is a simple, single-step pipeline, with GitRepo as the input resource for the step.

In this example:

  • First version: When the pipeline syncs and loads for the first time, the latest commit on the Git repository is seeded as the very first version for the GitRepo resource.
    When the first run is triggered, this resource version is used for the run.
  • Second version: When a new commit is pushed to the Git repository, the GitRepo resource is updated with the new version, which becomes the latest version of the resource.
    The Git commit automatically triggers the run and the second version of the resource, which is now the latest, is used for the run.

Creating Resources

All resources are defined in a pipeline YAML under the resources tag, as shown below. After a resource is defined and committed to source control, it can be consumed within a pipeline, based on the scope defined for the pipeline source.

When defining resources, the recommended approach is to define them in a pipeline source that is shared across environments in the same Project. This ensures that the resources are available across environments in a project. For more information, see Creating a Project - Pipelines Resources.

While each resource has its own specific configuration, they all require a name and a type.

resources:
  - name:               <string>
    type:               <resource type name>
    configuration:
      <as required by type>

  • name (Required): An alphanumeric string (underscores are permitted) that makes it easy to infer what the resource represents. This name is used to refer to the resource in steps, and must be unique across all repositories in your JFrog Pipelines Project.
    Example: aws_creds to represent AWS keys.
    • After the pipeline performs a sync, it is recommended not to change the name of the resource. If you change the name of the resource, it is treated as a new resource and all its version history is lost. In addition, the name of the resource will need to be updated in all the places it appears in the pipeline.
    • Within the scope of a pipeline (across a Project or environment), no two resources can share the same name.
  • type (Required): Name of the resource type that this resource is an instance of.
    After the pipeline performs a sync, its type cannot be modified.
  • configuration (Required): Specifies configuration settings, which vary for each type of resource.
    Commonly included in this block is a setting that assigns an integration through which the resource will be authenticated and accessed. The integration must be compatible with the type of the resource. The name of the integration field varies by resource.

Examples - Resource Definition

These examples show the YAML definition for GitRepo and Image resources:

  • Example 1 - GitRepo Resource

    resources:
      - name: gitrepo_trigger
        type: GitRepo
        configuration:
          gitProvider: my_github
          path: myuser/repo-name
          branches:
            include: master
  • Example 2 - Image Resource

    resources: 
      - name: Image_1
        type: Image
        configuration:    
          registry: PSSG_DockerRegistry      
          imageName: docker/jfreq_win        
          imageTag: latest 
          autoPull: true

Modifying Resources

In Pipelines, resources and their versions are tightly coupled. Therefore, when a resource is deleted, its historical data is permanently affected. This can disrupt your DevOps Assembly Lines, since they form a connected, interdependent workflow.

The following rules apply when editing resources:

  • If you modify a resource's name, it is treated as a new resource.
  • A resource's type and certain configurations (such as the path tag in a GitRepo resource) cannot be modified.

If your pipeline is failing because of a modified resource tag, the only option to recover your pipeline is to delete the resource definition. Deletion of a resource is a two-step process:

  1. Soft delete: After deleting the resource definition from the YAML and committing the change to the repo, when Pipelines automatically syncs, it marks the removed resource as a soft delete. All information about the resource is stored in the Pipelines database for one hour after deletion. 
    • Within this one hour, if you create another pipeline source and try to add another resource with the same name as the deleted resource, the sync fails and an error message is generated about the duplicate resource name.
    • In this scenario, you can either define the new resource with another name or wait for an hour and then define a new resource with the same name as the deleted resource.
  2. Hard delete: After one hour, the resource and all its historical data is deleted permanently from the Pipelines database.

Using Resources

In a pipeline, steps can use resources as:

  • Inputs: When a resource is an input for a step, it is called an input resource. The input resource of one step can be the output resource of other steps.
  • Outputs: When a resource is an output of a step, it is called an output resource. The output resource of one step can be the input resource of other steps.

Input and output resources can originate from the same pipeline source as the pipeline or another pipeline source in the same project and environment. 
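
The following minimal sketch, with hypothetical step and resource names, shows one step producing a PropertyBag resource as an output and a second step consuming the same resource as an input. The producer step uses the write_output utility (described under Using Stateful Resources below) to update the resource.

resources:
  - name: shared_props              # hypothetical PropertyBag resource
    type: PropertyBag
    configuration:
      buildNumber: 0

pipelines:
  - name: chained_pipeline
    steps:
      - name: producer
        type: Bash
        configuration:
          outputResources:
            - name: shared_props    # this step writes to the resource
        execution:
          onExecute:
            - write_output shared_props buildNumber=${run_number}

      - name: consumer
        type: Bash
        configuration:
          inputResources:
            - name: shared_props    # this step is triggered by and reads the resource
        execution:
          onExecute:
            - echo "Consumed build number $res_shared_props_buildNumber"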

Input Resources

Input resources enable you to create dependencies between steps and pipelines. Steps that have input resources from other pipelines trigger a run of the pipeline when the resource is updated. By default, a step whose input resource is the output resource of another step in the same run will run whether or not the resource is updated. A resource can also be referred to by its name as an argument in the shell commands that the step executes.

Input Resource Definition

A resource can be specified as an input for a step by adding it to the inputResources section of a step. 

    steps:
      - name: <step_name>
        type: <step_type>
        configuration:
          inputResources:
            - name:           <resource name>
              trigger:        <true/false>    # default true
              newVersionOnly: <true/false>    # default false
              branch:         <string>        # see description of defaults below

  • name (Required): Name of the declared resource that is to be used as an input for the step.
  • trigger (Optional): trigger is set as true by default, and any change to a named resource triggers the execution of the step. This can be changed by declaring trigger as false.
  • newVersionOnly (Optional): Setting newVersionOnly as true for an input resource causes the step to be skipped if the input resource is not updated in the current run.
    If there are multiple inputResources with newVersionOnly: true, the step is skipped only when none of those resources are updated. If at least one of the resources is updated, the step is not skipped.
  • branch (Optional): branch is required only when using a resource from another branch of a multi-branch source. By default, the resource from a single-branch source or the same branch of a multi-branch pipeline is used. To use a multi-branch resource in a single-branch pipeline or another branch in a multi-branch pipeline, branch should be used to specify the branch to which the resource belongs.

    steps:
      - name: step_1
        type: Bash
        configuration:
          inputResources:
            - name: my_app_repo
              trigger: false        # optional; default true
              newVersionOnly: true  # optional; default false
              branch: master        # optional

Using Input Resources

This section provides information about the various ways in which input resources can be used to manipulate pipeline runs.

Skip Automatic Trigger for All Commits

By default, changes to an input resource trigger the execution of the dependent steps. For example, when a step specifies a GitRepo resource, any new code committed to that Git repository automatically causes that step to execute. However, this behavior can be changed by declaring trigger as false (see below). Then, even when the resource is updated, the dependent step is not triggered. This is especially useful for a production pipeline, when you do not want to deploy every new build.

For a step to not be triggered automatically, trigger: false must be set for all input resources in the step.

Even if trigger is set as false, the step still receives webhook updates. This ensures that when it is manually triggered, it uses the latest commit.

pipelines:
  - name: java_pipeline
    steps:
      - name: step_1
        type: Bash
        configuration:
          inputResources:
            - name: my_app_repo
              trigger: false
            - name: cron_trigger
              trigger: false
        execution:
          onExecute:
            - pushd $res_my_app_repo_resourcePath
            - ./execute.sh
            - popd

When trigger is set as false, the line linking the input resource and the step appears as a dashed line.


Trigger Automatically on New Version Only

Whenever a resource undergoes a change, its version is updated and the dependent step is triggered. This is the default behavior for all input resources. To skip steps in a run when input resources are not updated, add the newVersionOnly tag and set it as true. During a run, the step is triggered only when the resource is updated. If the resource is not updated, the step is skipped, and all downstream steps are skipped as well.

pipelines:
  - name: java_pipeline
    steps:
      - name: step_1
        type: Bash
        configuration:
          inputResources:
            - name: my_app_repo
              newVersionOnly: true
        execution:
          onExecute:
            - pushd $res_my_app_repo_resourcePath
            - ./execute.sh
            - popd

The following extended example shows newVersionOnly in a multi-step pipeline: step1 declares newVersionOnly: true for its S_WF_012_resource input, while step2 uses the default behavior.

resources:
  - name: S_WF_012_resource
    type: PropertyBag
    configuration:
      runNumber: 0

pipelines:
  - name: pipeline_S_WF_012_001
    steps:
      - name: S_WF_012_input1
        type: Bash
        configuration:
          outputResources:
            - name: S_WF_012_resource
        execution:
          onExecute:
            #- write_output S_WF_012_resource runNumber=${run_number}
            - echo "test"

      - name: step1      
        type: Bash
        configuration:
          inputResources:
            - name: S_WF_012_resource
              newVersionOnly: true
        execution:
          onExecute:
            - echo "test"

      - name: step2
        type: Bash
        configuration:
          inputResources:
            - name: S_WF_012_resource
        execution:
          onExecute:
            - echo "test"


Trigger Manually using Specific Versions

A run can be customized by selecting a specific version for an input resource. For more information, see Triggering a Run with Custom Parameters.

Pinning Resource Versions

By default, Pipelines uses the most recent or latest version of an input resource when running a job. However, there could be cases where you want to use a specific version of an input resource for a run. This is called pinning and input versions can be pinned using the YAML configuration. When a resource version is unpinned, it switches to using the latest version for all subsequent runs.

Resource version IDs have a global sequence, which can be found on the Resource tab. For more information, see Viewing Resources.

Pinning Resource Versions in YAML

You can use the pin tag to pin a specific input version as shown below:

resources:
  - name:           <string>
    type:           DistributionRule
    configuration:
      pin:
        versionId:  <number>
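
For example, a DistributionRule resource could be pinned to a hypothetical version ID taken from the Resource tab; the other configuration fields required by the resource type are omitted here:

resources:
  - name: my_distribution_rules
    type: DistributionRule
    configuration:
      # ...other configuration required by the DistributionRule type...
      pin:
        versionId: 42    # hypothetical version ID from the Resource tab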

The following resources support version pinning:

Output Resources

Output resources are resources that are either generated or changed by a step. When specified as the output of a step, the resource receives the output of the step. If required, this output resource can then be used as an input resource in a subsequent step in the same pipeline or in another pipeline. The output resource can also be referred to by its name as an argument in the shell commands that the step executes.

Output Resource Definition

A resource can be specified as an output for a step by adding it to the outputResources section of a step.

    steps:
      - name: <step_name>
        type: <step_type>
        configuration:
          outputResources:
            - name:   <resource name>
              branch: <string>        # see description of defaults below

  • name (Required): Name of the declared resource that is to be used as an output for the step.
  • branch (Optional): branch is required only for resources from another branch of a multi-branch source. By default, the resource from a single-branch source or the same branch of a multi-branch pipeline is assumed. To update a multi-branch resource in a single-branch pipeline or another branch in a multi-branch pipeline, branch should be used to specify the branch to which the resource belongs.

    steps:
      - name: step_2
        type: Bash
        configuration:
          outputResources:
            - name: my_repo
              branch: master

Viewing Resources in the UI

After a pipeline's YAML file is committed to a repository, add the repository to Pipelines through the UI. The Pipelines platform then watches for changes (job additions, edits or deletes) through source control webhooks. YAML changes are automatically synced and are reflected in the UI immediately.

After the pipeline source successfully syncs the YAML file, select Applications | My Pipelines to view the pipeline.

In the Pipelines view:

  • Each resource is shown as a circular icon
  • Clicking a resource displays information specific to that resource
  • Clicking the YAML icon → Resources tab displays the resource definitions for the pipeline

A resource that automatically triggers a step is represented by a solid line (see below). This changes to a dashed line when the trigger tag for an input resource is set as false. For more information, see Automatic Trigger.

Resetting Resource Versions

If a resource is in an invalid state, you have the option to reset it, which deletes all previous versions of the resource. This may be necessary, for example, when a source repository invalidates the SHA that is used by a GitRepo resource.

To reset a resource:

  1. In the Pipelines view, click the resource and then click the Reset button.
  2. Click Confirm.


Advanced Usage of Resources

In a pipeline, the role of a resource goes beyond inputs and outputs. Here are several other ways in which you can use resources in your pipeline.

Using Resources Values in Environment Variables

A step that specifies a resource can access the resource and its attributes through environment variables.

  • These environment variables have a standard naming convention: res_<resource name>_<tag>
  • This can be extended further to access an integration that is specified in the resource: res_<resource_name>_<integration tag name>_<tag>

Example

  • If the definition of a GitRepo resource named app_gitRepo is as follows:

    resources:
      - name: app_gitRepo
        type: GitRepo
        configuration:
          path: user1/repo1
          gitProvider: myGitProvider
  • The resulting environment variables would be as follows:

    • res_app_gitRepo_path: returns the path attribute of the app_gitRepo resource (user1/repo1 in this example)
    • res_app_gitRepo_gitProvider_url: returns the URL of the Git provider of the app_gitRepo resource
  • Then the following environment variables will be available to a step that uses the app_gitRepo resource as an input:

    steps:
      - name: build_app
        type: MvnBuild
        configuration:
          sourceLocation: .
          mvnCommand: clean install
          configFileLocation: .
          inputResources:
            - name: app_gitRepo             # Use the app_gitRepo resource
        execution:
          onSuccess:
            - send_notification notifySlack --text "Maven build completed for $res_app_gitRepo_path at $res_app_gitRepo_gitProvider_url"
          onFailure:
            - send_notification notifySlack --text "Maven build FAILED for $res_app_gitRepo_path at $res_app_gitRepo_gitProvider_url"

Using Arrays

Certain resources, such as Aql, DistributionRule, and VmCluster, have arrays in their configuration. For an array of simple values, the entries are available as:

res_<resource name>_<array heading>_len     # the number of entries in the array
res_<resource name>_<array heading>_0
res_<resource name>_<array heading>_1
res_<resource name>_<array heading>_2

For an array of objects, each tag within an entry can be accessed as:

res_<resource name>_<array heading>_len     # the number of entries in the array
res_<resource name>_<array heading>_0_<tag>
res_<resource name>_<array heading>_1_<tag>
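
As a hypothetical illustration, assume a resource named my_cluster whose configuration contains an array under a nodes heading with two entries. A step that declares it as an input could then read values such as:

$ printenv res_my_cluster_nodes_len    # number of entries in the assumed "nodes" array
2
$ printenv res_my_cluster_nodes_0      # first entry
vm-node-01
$ printenv res_my_cluster_nodes_1      # second entry
vm-node-02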

Using Stateful Resources

Resources are stateful entities and persist across pipelines, enabling passing of information between pipelines. This is especially useful when creating a pipeline of pipelines.

Steps can store any key-value pair data in a resource using the write_output utility function. These values can then be referenced as environment variables by any subsequent step that uses that resource as an input. Therefore, a step can pass information to another step in the run of the pipeline.

The environment variable for the stored value is of the form res_<resource name>_<key name>.

Example

The following example creates three properties in the resource myImage.

write_output myImage sport="baseball" equipment="bat" field="diamond"

When the resource is specified in a step's inputResources, these properties can be accessed as the following environment variables:

$ printenv res_myImage_sport
baseball
$ printenv res_myImage_equipment
bat
$ printenv res_myImage_field
diamond

For more information, see Creating Stateful Pipelines.

Using Extension Resources 

Extension resources enable Pipelines users to extend the Pipelines DSL by specifying their own resource types. After an extension resource is loaded, it can be used by any step in a pipeline. For more information, see Pipelines Extension Resource Model.