Overview

This example demonstrates how simple pipelines can be defined and executed with JFrog Pipelines. An example Pipelines DSL is used to show how to use integrations, resources, and steps to construct a simple, automated workflow. This pipeline example demonstrates the following:

  • Creating a Git integration.
  • Adding a Pipeline Source.
  • Creating a GitRepo trigger, which will trigger a step when the contents of the source control repository change.
  • Using inputResources and inputSteps to set up dependencies between steps and resources.
  • Using environment variables (e.g. $res_myFirstRepo_commitSha) to extract information from inputResources.
  • Using run state to pass information to downstream steps of a run.
  • Using pipeline state to pass information to subsequent runs.
  • Connecting dependent pipelines through resources.

Successful runs of the pipeline in this quickstart look like this:



Before you Begin

Before trying this quickstart, ensure that you have:

  • A GitHub account. This is required for forking the example repository.
  • A JFrog Platform Cloud account, or self-hosted JFrog Pipelines.
  • At least one node pool. This is the set of nodes that all pipeline steps will execute in. For more information, see Managing Pipelines Node Pools.
  • A user account in Artifactory with deploy permissions to at least one binary repository.

Running this pipeline

Perform the following steps to run this pipeline:

  1. Fork repository

    The Pipelines DSL for this example is available in the jfrog-pipelines-hello-world repository in the JFrog GitHub account.

    The DSL file is a YAML file that contains the pipeline definitions. This example uses two YAML files:
    • jfrog-pipelines-hello-world.yml, which contains the declarations for the pipelines in this example 
    • values.yml, which contains the values required for the jfrog-pipelines-hello-world.yml file.

    For a full breakdown of all the resources, pipelines, and steps used in the YAML file, see the jfrog-pipelines-hello-world.yml section below.

    Fork this repository to your account or organization. This is important since you need admin access to repositories that are used as Pipeline Sources or GitRepo resources, in order to add webhooks to these repositories and listen for change events.

  2. Sign in


    Sign in to JFrog Platform with your Artifactory credentials.

  3. Add integration


    Go to Administration | Pipelines | Integrations to add one integration:
    GitHub Integration: This integration is used to add the Pipeline Source, as well as the GitRepo resource defined in values.yml, to connect GitHub to Pipelines. Make a note of the GitHub integration name.

  4. Update values.yml


    The pipeline configuration is available in the values.yml file. Edit this file in your fork of this repo and replace the following:

    • gitProvider -- Provide the name of the GitHub integration you added in the previous step. Example: gitProvider: my_github
    • path -- Provide the path to your fork of this repository. Example: path: myuser/jfrog-pipelines-hello-world

    All pipeline and resource names are global across your JFrog Pipelines project, so the names of your pipelines and resources must be unique within the project.
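    For reference, the following is a minimal sketch of a values.yml that matches the {{ .Values.myRepo.* }} references used by the GitRepo resource shown later in this page. The integration name and repository path are placeholders; use the values from your own setup and fork:

    myRepo:
      gitProvider: my_github                        # GitHub integration added in step 3
      path: myuser/jfrog-pipelines-hello-world      # your fork, as org-or-user-name/repo-name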

  5. Add pipeline source

    The Pipeline Source represents the git repository where our pipeline definition files are stored. A pipeline source connects to the repository through an integration, which we added in step 3.

    In your left navigation bar, go to Administration | Pipelines | Pipeline Sources. Click Add a Pipeline Source, then choose From YAML, and follow the instructions to add a Pipeline Source. This automatically adds your configuration to the platform, and pipelines are created based on your YAML.

    An example of adding a pipeline source is shown below. Please ensure that the Repository Full Name points to your forked repository and the Pipeline Config File Filter is entered correctly as (jfrog-pipelines-hello-world|values).yml so that both your config files are included. 


    After your pipeline source syncs successfully, navigate to Pipelines | My Pipelines in the left navbar to view the newly added pipelines. In this example, my_first_pipeline and my_second_pipeline are the names of our pipelines.



    Click the name of the pipeline. This renders a real-time, interactive diagram of the pipeline and the results of its most recent run.

  6. Execute the pipeline


    You can now commit to the repo to trigger your pipeline, or trigger it manually through the UI. The steps in the pipeline execute in sequence.


    Once the pipeline has completed, a new run is listed. 


    A successful run of the first pipeline triggers the execution of the second pipeline:



jfrog-pipelines-hello-world.yml

The jfrog-pipelines-hello-world.yml file is made up of resources, pipelines and steps, as shown below:

Resources

This example uses the following types of resources:

GitRepo

The GitRepo resource is used to connect JFrog Pipelines to a source control repository. Adding it creates a webhook on the repository, so that future commits automatically create a new version of the resource with the webhook payload.

Resources
resources:
  - name: myFirstRepo
    type: GitRepo
    configuration:
      # SCM integration where the repository is located
      gitProvider: {{ .Values.myRepo.gitProvider }}
      # Repository path, including org name/repo name
      path: {{ .Values.myRepo.path }}    # e.g. manishas-jfrog/jfrog-pipelines-hello-world -- replace with your repository path
      branches:
        # Specifies which branches will trigger dependent steps
        include: master

  • name (Required) -- myFirstRepo is the name of the GitRepo resource pointing to the repository that contains the YAML files and other source code for this example. This name is used to refer to the resource in steps, and must be unique across all repositories in your JFrog Pipelines environment.
  • gitProvider (Required) -- The name of the GitHub integration. Its value is retrieved from the values.yml file.
  • path (Required) -- The path of the repository from the integration root. Its value is retrieved from the values.yml file.
  • branches (Optional) -- Specifies which branches trigger dependent steps:
    • include -- (optional) Regular expression to include branches from the repo
    • exclude -- (optional) Regular expression to exclude branches from the repo
    The include: master tag indicates that the GitRepo resource is listening to the master branch.
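The include and exclude tags accept regular expressions. As a hypothetical sketch of that form (this example's resource simply uses include: master):

branches:
  include: ^(master|release-.*)$    # hypothetical: trigger on master and any release-* branch
  exclude: ^experimental-.*$        # hypothetical: never trigger on experimental-* branches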

Defining a GitRepo resource acts as the trigger for the pipeline.

PropertyBag

PropertyBag resource is used to pass information from one pipeline to another, and to provide environment variables to a step in the format of a resource.

A PropertyBag resource can have any strings as properties, which are then available as environment variables when the resource is an input to a step. When it is an output, steps can change the values of properties or add new ones.

Resources
resources:
  - name: myPropertyBag
    type: PropertyBag
    configuration:
      commitSha: 1
      runID: 1

  • name (Required) -- myPropertyBag is the name of the PropertyBag resource used to pass information from the first pipeline to the second.
  • <string> (Required) -- A property of the PropertyBag resource. The tag should be a valid variable name (Bash or PowerShell) for the steps where it is used as an input or output, and its value must be a string. At least one property is required; multiple properties are allowed.
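To make the property-to-environment-variable mapping concrete, here is a minimal sketch that reuses this example's resource and property names in a hypothetical step. When myPropertyBag is listed as an inputResource, each of its properties is exposed to the step as $res_<resourceName>_<propertyName>:

  - name: show_bag_values        # hypothetical step, not part of this example
    type: Bash
    configuration:
      inputResources:
        - name: myPropertyBag
    execution:
      onExecute:
        # Properties of the input PropertyBag surface as environment variables
        - echo "commitSha is $res_myPropertyBag_commitSha"
        - echo "runID is $res_myPropertyBag_runID"

A step that instead lists myPropertyBag as an outputResource can update or add properties, as p1_s3 does with write_output in the pipeline definition below.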

Pipelines

This example uses two pipelines:

  • my_first_pipeline is the name of the first pipeline, consisting of three linear steps. The last step outputs a resource of type PropertyBag.
  • my_second_pipeline is the name of the second pipeline, which contains a single step triggered by the PropertyBag resource updated by the first pipeline.

Steps

Both my_first_pipeline and my_second_pipeline pipelines contain the following step type:

Bash

Bash is a generic step type that enables executing any shell command. This general-purpose step can be used to execute any action that can be scripted, even with tools and services that haven't been integrated with JFrog Pipelines. It is the most versatile of the step types, while still taking full advantage of what the Pipelines lifecycle offers.

In our example:

  • p1_s1, p1_s2, p1_s3 are the names of the Bash steps in the my_first_pipeline pipeline.
  • p2_s1 is the name of the Bash step in the my_second_pipeline pipeline.

The steps are defined so that they execute in an interdependent sequence. This means that each step's execution is triggered by the successful completion of a prior, prerequisite step (or steps). In our example, completion of step 1 (p1_s1) triggers the execution of step 2 (p1_s2), completion of step 2 triggers execution of step 3 (p1_s3), and so on until all steps in the pipeline have executed.

Steps
pipelines:
  - name: my_first_pipeline
    steps:
      - name: p1_s1
        type: Bash
        configuration:
          inputResources:
            # Sets up step to be triggered when there are commit events to myFirstRepo
            - name: myFirstRepo
        execution:
          onExecute:
            # Data from input resources is available as env variables in the step
            - echo $res_myFirstRepo_commitSha
            # The next two commands add variables to run state, which is available to all downstream steps in this run
            # Run state documentation: https://www.jfrog.com/confluence/display/JFROG/Creating+Stateful+Pipelines#CreatingStatefulPipelines-RunState
            - add_run_variables current_runid=$run_id
            - add_run_variables commitSha=$res_myFirstRepo_commitSha
            # This variable is written to pipeline state in p1_s3.
            # So this will be empty during first run and will be set to prior run number in subsequent runs
            # Pipeline state documentation: https://www.jfrog.com/confluence/display/JFROG/Creating+Stateful+Pipelines#CreatingStatefulPipelines-PipelineState
            - echo "Previous run ID is $prev_runid"

      - name: p1_s2
        type: Bash
        configuration:
          inputSteps:
            - name: p1_s1
        execution:
          onExecute:
            # Demonstrates the availability of an env variable written to run state during p1_s1
            - echo $current_runid

      - name: p1_s3
        type: Bash
        configuration:
          inputSteps:
            - name: p1_s2
          outputResources:
            - name: myPropertyBag
        execution:
          onExecute:
            - echo $current_runid
            # Writes current run number to pipeline state
            - add_pipeline_variables prev_runid=$run_id
            # Uses a utility function to update the output resource with the commitSha that triggered this run
            # Dependent pipelines can be configured to trigger when this resource is updated
            # Utility functions documentation: https://www.jfrog.com/confluence/display/JFROG/Pipelines+Utility+Functions
            - write_output myPropertyBag commitSha=$commitSha runID=$current_runid

  - name: my_second_pipeline
    steps:
      - name: p2_s1
        type: Bash
        configuration:
          inputResources:
            # Sets up step to be triggered when myPropertyBag is updated
            - name: myPropertyBag
        execution:
          onExecute:
            # Retrieves the commitSha from input resource
            - echo "CommitSha is $res_myPropertyBag_commitSha"         

configuration

Specifies all optional configuration selections for the step's execution environment.

  • inputResources (Optional) -- A collection of named resources that will be used by a step as inputs. In this example:
    • Step p1_s1, in the first pipeline, is triggered when there are commit events to myFirstRepo, which is the name of the GitRepo resource.
    • Step p2_s1, in the second pipeline, is triggered when the myPropertyBag resource is updated.
  • inputSteps (Optional) -- A collection of named steps whose completion will trigger execution of this step. In this example:
    • Completion of step p1_s1 triggers the execution of step p1_s2.
    • Completion of step p1_s2 triggers the execution of step p1_s3.
  • outputResources (Optional) -- A collection of named resources that will be generated or changed by a step.

execution

Declare sets of shell command sequences to perform for different execution phases:

  • onExecute (Optional) -- Main commands to execute for the step.


