Overview

Pipeline jobs simplify building continuous delivery workflows with Jenkins by creating a script that defines the steps of your build. For those not familiar with Jenkins Pipeline, please refer to the Pipeline Tutorial or the Getting Started With Pipeline documentation.

The Jenkins Artifactory Plugin adds pipeline APIs for Artifactory operations, giving you the option of downloading dependencies, uploading artifacts, and publishing build-info to Artifactory from a pipeline script.

This page describes how to use declarative pipeline syntax with Artifactory. Declarative syntax is available from version 3.0.0 of the Jenkins Artifactory Plugin.


Scripted syntax is also supported. Read more about it here.
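
The Artifactory closures described on this page are used as steps inside a declarative pipeline. Here's a minimal sketch of how they fit into the pipeline structure (the server details are placeholders, and the individual steps are described in the sections below):

pipeline {
    agent any
    stages {
        stage ('Artifactory configuration') {
            steps {
                rtServer (
                    id: "Artifactory-1",
                    url: "http://my-artifactory-domain/artifactory",
                    credentialsId: "my-credentials-id"
                )
            }
        }
        stage ('Publish build info') {
            steps {
                rtPublishBuildInfo (
                    serverId: "Artifactory-1"
                )
            }
        }
    }
}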

Examples

The Jenkins Pipeline Examples can help get you started creating your pipeline jobs with Artifactory.


JFrog Artifactory and Jenkins-CI: https://jfrog.com/integration/jenkins-ci/

Creating an Artifactory Server Instance

There are two ways to tell the pipeline script which Artifactory server to use. You can either define the server details as part of the pipeline script, or define the server details in Manage | Configure System.

If you choose to define the Artifactory server in the pipeline, add the following to the script:

rtServer (
    id: "Artifactory-1",
    url: "http://my-artifactory-domain/artifactory",
    // If you're using username and password:
    username: "user",
    password: "password",
    // If you're using Credentials ID:
    credentialsId: 'ccrreeddeennttiiaall',
    // If Jenkins is configured to use an http proxy, you can bypass the proxy when using this Artifactory server:
    bypassProxy: true,
    // Configure the connection timeout (in seconds).
    // The default value (if not configured) is 300 seconds:
    timeout: 300
)


As shown above, you can use a Jenkins Credentials ID instead of a username and password.

The id property (Artifactory-1 in the above example) is a unique identifier for this server, allowing us to reference this server later in the script. If you prefer to define the server in Manage | Configure System, it is referenced using its configured Server ID.
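
For example, here's a minimal sketch that assumes a server with the Server ID pre-configured-server-id (a placeholder) was defined in Manage | Configure System. In that case, no rtServer closure is needed in the script, and any of the steps described on this page can reference the server directly:

rtPublishBuildInfo (
    serverId: "pre-configured-server-id"
)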


Uploading and Downloading Files

To download files, add the following closure to the pipeline script:

rtDownload (
    serverId: "Artifactory-1",
    spec:
        """{
          "files": [
            {
              "pattern": "bazinga-repo/froggy-files/",
              "target": "bazinga/,
            }
         ]
        }"""
)

In the above example, files are downloaded from the Artifactory server referenced by the Artifactory-1 server ID.

The above closure also includes a File Spec, which specifies which files should be downloaded. In this example, all files under the bazinga-repo/froggy-files/ path in Artifactory are downloaded into the bazinga directory on your Jenkins agent's file system.

Uploading files is very similar. The following example uploads all ZIP files which include froggy in their names into the froggy-files folder in the bazinga-repo Artifactory repository.

rtUpload (
    serverId: "Artifactory-1",
    spec:
        """{
          "files": [
            {
              "pattern": "bazinga/*froggy*.zip",
              "target": "bazinga-repo/froggy-files/"
            }
         ]
        }"""
)

You can manage the File Spec in separate files, instead of adding it as part of the rtUpload and rtDownload closures. This allows managing the File Specs in source control, possibly alongside the project sources. Here's how you reference the File Spec in the rtUpload closure (the configuration is similar for rtDownload):

rtUpload (
    serverId: "Artifactory-1",
    specPath: 'path/to/spec/relative/to/workspace/spec.json'
)

You can read about using File Specs for downloading and uploading files here.

If you'd like to fail the build in case no files are uploaded or downloaded, add the failNoOp property to the rtUpload or rtDownload closures as follows:

rtUpload (
    serverId: "Artifactory-1",
    specPath: 'path/to/spec/relative/to/workspace/spec.json',
    failNoOp: true
)

Setting and Deleting Properties on Files in Artifactory

When uploading files to Artifactory using the rtUpload closure, you have the option of setting properties on the files. These properties can be later used to filter and download those files.
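
For example, here's a minimal sketch based on the upload example above, which attaches two illustrative properties, p1 and p2, to the uploaded files by adding a props field to the File Spec:

rtUpload (
    serverId: "Artifactory-1",
    spec:
        """{
          "files": [
            {
              "pattern": "bazinga/*froggy*.zip",
              "target": "bazinga-repo/froggy-files/",
              "props": "p1=v1;p2=v2"
            }
         ]
        }"""
)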

In some cases, you may want to set properties on files that are already in Artifactory. The way to do this is very similar to the way you define which files to download or upload: a File Spec is used to filter the files on which the properties should be set. The properties to be set are sent outside the File Spec. Here's an example:

rtSetProps (
	serverId: "Artifactory-1",
	specPath: 'path/to/spec/relative/to/workspace/spec.json',
	props: 'p1=v1;p2=v2',
	failNoOp: true
)

In the above example:

  1. The serverId property is used to reference a pre-configured Artifactory server instance, as described in the Creating an Artifactory Server Instance section.
  2. The specPath property includes a path to a File Spec, which has a similar structure to the File Spec used for downloading files.
  3. The props property defines the properties we'd like to set. In the above example we're setting two properties - p1 and p2 with the v1 and v2 values respectively.
  4. The failNoOp property is optional. Setting it to true will cause the job to fail if no properties have been set.

You also have the option of specifying the File Spec directly inside the rtSetProps closure as follows.

rtSetProps (
	serverId: "Artifactory-1",
	props: 'p1=v1;p2=v2',      
	spec: """{
	"files": [{
		"pattern": "my-froggy-local-repo",
		"props": "filter-by-this-prop=yes"
	}]}"""
)

The rtDeleteProps closure is used to delete properties from files in Artifactory. The syntax is very similar to that of the rtSetProps closure. The only difference is that in rtDeleteProps, we specify only the names of the properties to delete, separated by commas; the property values should not be specified. Here's an example:

rtDeleteProps (
	serverId: "Artifactory-1",
	specPath: 'path/to/spec/relative/to/workspace/spec.json',
	props: 'p1,p2,p3',
	failNoOp: true
)

Similarly to the rtSetProps closure, the File Spec can be defined directly inside the closure, as shown here:

rtDeleteProps (
	serverId: "Artifactory-1",
	props: 'p1,p2,p3',      
	spec: """{
	"files": [{
		"pattern": "my-froggy-local-repo",
		"props": "filter-by-this-prop=yes"
	}]}"""
)



Publishing Build-Info to Artifactory

Files which are downloaded by the rtDownload closure are automatically registered as the current build's dependencies, while files that are uploaded by the rtUpload closure are registered as the build artifacts. The dependencies and artifacts are recorded locally and can later be published as build-info to Artifactory.

Here's how you publish the build-info to Artifactory:

rtPublishBuildInfo (
    serverId: "Artifactory-1"
)

You have the option of setting a different build name and build number for the published build-info. Here's how you do it:

rtPublishBuildInfo (
    serverId: "Artifactory-1",
    buildName: 'holyFrog',
    buildNumber: '42'
)

If you set a custom build name and number as shown above, please make sure to set the same build name and number in the rtUpload or rtDownload closures as shown below. If you don't, Artifactory will not be able to associate these files with the build, and the files will not be displayed in Artifactory.

rtUpload (
    serverId: "Artifactory-1",
    buildName: 'holyFrog',
    buildNumber: '42',
    specPath: 'path/to/spec/relative/to/workspace/spec.json'
)

Capturing Environment Variables

To set the Build-Info object to automatically capture environment variables while downloading and uploading files, add the following to your script.

    rtBuildInfo (
        captureEnv: true,

        // Optional - Build name and build number. If not set, the Jenkins job's build name and build number are used.
        buildName: 'my-build',
        buildNumber: '20'
    )

By default, environment variable names which include "password", "psw", "secret", or "key" (case insensitive) are excluded and will not be published to Artifactory.

You can add more include/exclude patterns with wildcards as follows:

    rtBuildInfo (
        captureEnv: true,
        includeEnvPatterns: ["*abc*", "*bcd*"],
        excludeEnvPatterns: ["*private*", "internal-*"]
    )



Collecting Build Issues

The build-info can include the issues which were handled as part of the build. The list of issues is automatically collected by Jenkins from the git commit messages. This requires the project developers to use a consistent commit message format, which includes the issue ID and issue summary, for example:
HAP-1364 - Replace tabs with spaces
The list of issues can be then viewed in the Builds UI in Artifactory, along with a link to the issue in the issues tracking system.
The information required for collecting the issues is provided through a JSON configuration. This configuration can be provided as a file or as a JSON string.
Here's an example of an issues collection configuration.

{
    "version": 1,
    "issues": {
        "trackerName": "JIRA",
        "regexp": "(.+-[0-9]+)\\s-\\s(.+)",
        "keyGroupIndex": 1,
        "summaryGroupIndex": 2,
        "trackerUrl": "http://my-jira.com/issues",
        "aggregate": "true",
        "aggregationStatus": "RELEASED"
    }
}

Configuration file properties:

version - The schema version is intended for internal use. Do not change!

trackerName - The name (type) of the issue tracking system. For example, JIRA. This property can take any value.

trackerUrl - The issue tracking URL. This value is used for constructing a direct link to the issues in the Artifactory build UI.

regexp - A regular expression used for matching the git commit messages. The expression should include two capturing groups - one for the issue key (ID) and one for the issue summary. In the example above, the regular expression matches commit messages such as: HAP-1364 - Replace tabs with spaces

keyGroupIndex - The capturing group index in the regular expression used for retrieving the issue key. In the example above, setting the index to 1 retrieves HAP-1364 from this commit message: HAP-1364 - Replace tabs with spaces

summaryGroupIndex - The capturing group index in the regular expression used for retrieving the issue summary. In the example above, setting the index to 2 retrieves "Replace tabs with spaces" from this commit message: HAP-1364 - Replace tabs with spaces

aggregate - Set to true if you wish all builds to include issues from previous builds.

aggregationStatus - If aggregate is set to true, this property indicates how far back issues should be aggregated. In the above example, issues will be aggregated from previous builds until a build with a RELEASED status is found. Build statuses are set when a build is promoted using the jfrog rt build-promote command.

Here's how you set issues collection in the pipeline script.

rtCollectIssues (
    serverId: "Artifactory-1",
    config: """{
        "version": 1,
        "issues": {
            "trackerName": "JIRA",
            "regexp": "(.+-[0-9]+)\\s-\\s(.+)",
            "keyGroupIndex": 1,
            "summaryGroupIndex": 2,
            "trackerUrl": "http://my-jira.com/issues",
            "aggregate": "true",
            "aggregationStatus": "RELEASED"
        }
    }""",
)

In the above example, the issues config is embedded inside the rtCollectIssues closure. You also have the option of providing a file which includes the issues configuration. Here's how you do this:

rtCollectIssues (
    serverId: "Artifactory-1",
    configPath: '/path/to/config.json'
)

If you'd like to add the issues information to a specific build-info, you can also provide a build name and build number as follows:

rtCollectIssues (
    serverId: "Artifactory-1",
    configPath: '/path/to/config',
    buildName: 'my-build',
    buildNumber: '20'
)

To help you get started, we recommend using the GitHub Examples.


Triggering Build Retention

Build retention can be triggered when publishing build-info to Artifactory using the rtPublishBuildInfo closure. Setting the build retention, however, should be done before publishing the build, by using the rtBuildInfo closure, as shown below.

    rtBuildInfo (
        // Optional - Maximum builds to keep in Artifactory.
        maxBuilds: 1,
        // Optional - Maximum days to keep the builds in Artifactory.
        maxDays: 2,
        // Optional - List of build numbers to keep in Artifactory.
        doNotDiscardBuilds: ["3"],
        // Optional (the default is false) - Also delete the build artifacts when deleting a build.
        deleteBuildArtifacts: true,

        // Optional - Build name and build number. If not set, the Jenkins job's build name and build number are used.
        buildName: 'my-build',
        buildNumber: '20'
    )



Promoting Builds in Artifactory

To promote a build between repositories in Artifactory, define the promotion parameters in the rtPromote closure. For example:

    rtPromote (
        // Mandatory parameters

        buildName: 'MK',
        buildNumber: '48',
        // Artifactory server ID from Jenkins configuration, or from configuration in the pipeline script
        serverId: "Artifactory-1",
        // Name of target repository in Artifactory
        targetRepo: 'libs-release-local',

        // Optional parameters

        // Comment and Status to be displayed in the Build History tab in Artifactory
        comment: 'this is the promotion comment',
        status: 'Released',
        // Specifies the source repository for build artifacts.
        sourceRepo: 'libs-snapshot-local',
        // Indicates whether to promote the build dependencies, in addition to the artifacts. False by default.
        includeDependencies: true,
        // Indicates whether to fail the promotion process in case of failing to move or copy one of the files. False by default.
        failFast: true,
        // Indicates whether to copy the files. Move is the default.
        copy: true
    )



Allowing Interactive Promotion for Published Builds

The Promoting Builds in Artifactory section describes how your Pipeline script can promote builds in Artifactory. In some cases, however, you'd like the build promotion to be performed after the build has finished. You can configure your Pipeline job to expose some or all of the builds it publishes to Artifactory, so that they can later be promoted interactively through a promotion window in the Jenkins UI.


When the build finishes, the promotion window will be accessible by clicking the promotion icon next to the build run. To enable interactive promotion for a published build, add the rtAddInteractivePromotion closure as shown below.

    rtAddInteractivePromotion (
        // Mandatory parameters

        // Artifactory server ID from Jenkins configuration, or from configuration in the pipeline script
        serverId: "Artifactory-1",
        buildName: 'MK',
        buildNumber: '48',

        // Optional parameters

        // If set, the promotion window will display this label instead of the build name and number.
        displayName: 'Promote me please',
        // Name of target repository in Artifactory
        targetRepo: 'libs-release-local',
        // Comment and Status to be displayed in the Build History tab in Artifactory
        comment: 'this is the promotion comment',
        status: 'Released',
        // Specifies the source repository for build artifacts.
        sourceRepo: 'libs-snapshot-local',
        // Indicates whether to promote the build dependencies, in addition to the artifacts. False by default.
        includeDependencies: true,
        // Indicates whether to fail the promotion process in case of failing to move or copy one of the files. False by default.
        failFast: true,
        // Indicates whether to copy the files. Move is the default.
        copy: true
    )

You can add multiple rtAddInteractivePromotion closures, to include multiple builds in the promotion window.
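
For example, here's a minimal sketch that exposes two builds (the build names and numbers are placeholders) in the same promotion window:

rtAddInteractivePromotion (
    serverId: "Artifactory-1",
    buildName: 'MK',
    buildNumber: '48'
)

rtAddInteractivePromotion (
    serverId: "Artifactory-1",
    buildName: 'MK',
    buildNumber: '49'
)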


Maven Builds with Artifactory

Maven builds can resolve dependencies, deploy artifacts and publish build-info to Artifactory. To run Maven builds with Artifactory from your Pipeline script, you first need to create an Artifactory server instance, as described in the Creating an Artifactory Server Instance section.

The next step is to define an rtMavenResolver closure, which defines the dependencies resolution details, and an rtMavenDeployer closure, which defines the artifacts deployment details. Here's an example:

rtMavenResolver (
    id: 'resolver-unique-id',
    serverId: 'Artifactory-1',
    releaseRepo: 'libs-release",
    snapshotRepo: 'libs-snapshot"
)   

rtMavenDeployer (
    id: 'deployer-unique-id',
    serverId: 'Artifactory-1',
    releaseRepo: 'libs-release-local',
    snapshotRepo: "libs-snapshot-local"
)

As you can see in the example above, the resolver and deployer should each have a unique ID, so that they can be referenced later in the script. In addition, they include an Artifactory server ID and the names of the release and snapshot Maven repositories.

Now we can run the Maven build, referencing the resolver and deployer we defined:

rtMavenRun (
    // Tool name from Jenkins configuration.
    tool: MAVEN_TOOL,
    pom: 'maven-example/pom.xml',
    goals: 'clean install',
    // Maven options.
    opts: '-Xms1024m -Xmx4096m',
    resolverId: 'resolver-unique-id',
    deployerId: 'deployer-unique-id'
)

Instead of setting the tool in the rtMavenRun closure, you can set the path to the Maven installation directory using the MAVEN_HOME environment variable as follows:

    environment {
        MAVEN_HOME = '/tools/apache-maven-3.3.9'
    }

In case you'd like Maven to use a different JDK than your build agent's default, no problem.
Simply set the JAVA_HOME environment variable to the desired JDK path (the path to the directory above the bin directory, which includes the java executable).

    environment {
        JAVA_HOME = '/full/path/to/JDK'
    }



The last thing you might want to do is publish the build-info for this build. See the Publishing Build-Info to Artifactory section to learn how.


Gradle Builds with Artifactory

Gradle builds can resolve dependencies, deploy artifacts and publish build-info to Artifactory. To run Gradle builds with Artifactory from your Pipeline script, you first need to create an Artifactory server instance, as described in the Creating an Artifactory Server Instance section.

The next step is to define an rtGradleResolver closure, which defines the dependencies resolution details, and an rtGradleDeployer closure, which defines the artifacts deployment details. Here's an example:

rtGradleResolver (
    id: "resolver-unique-id",
    serverId: "Artifactory-1",
    repo: "jcenter-remote"
)
     
rtGradleDeployer (
    id: "deployer-unique-id",
    serverId: "Artifactory-1",
    repo: "libs-snapshot-local",
)

As you can see in the example above, the resolver and deployer should each have a unique ID, so that they can be referenced later in the script. In addition, they include an Artifactory server ID and the names of the repositories to resolve from and deploy to.

Now we can run the Gradle build, referencing the resolver and deployer we defined:

rtGradleRun (
    // Set to true if the Artifactory Plugin is already defined in the build script.
    usesPlugin: true,
    // Tool name from Jenkins configuration.
    tool: GRADLE_TOOL,
    // Set to true if you'd like the build to use the Gradle Wrapper.
    useWrapper: true,
    rootDir: "gradle-examples/gradle-example/",
    buildFile: 'build.gradle',
    tasks: 'clean artifactoryPublish',
    resolverId: "resolver-unique-id",
    deployerId: "deployer-unique-id"
)

In case you'd like Gradle to use a different JDK than your build agent's default, no problem.
Simply set the JAVA_HOME environment variable to the desired JDK path (the path to the directory above the bin directory, which includes the java executable).
Here's how you do it:

    environment {
        JAVA_HOME = '/full/path/to/JDK'
    }


The last thing you might want to do is publish the build-info for this build. See the Publishing Build-Info to Artifactory section to learn how.

You also have the option of defining default values in the Gradle build script. Read more about it here.



NPM Builds with Artifactory

NPM builds can resolve dependencies, deploy artifacts and publish build-info to Artifactory. To run NPM builds with Artifactory from your Pipeline script, you first need to create an Artifactory server instance, as described in the Creating an Artifactory Server Instance section.

The next step is to define an rtNpmResolver closure, which defines the dependencies resolution details, and an rtNpmDeployer closure, which defines the artifacts deployment details. Here's an example:

rtNpmResolver (
    id: "resolver-unique-id",
    serverId: "Artifactory-1",
    repo: "libs-npm"
)
      
rtNpmDeployer (
    id: "deployer-unique-id",
    serverId: "Artifactory-1",
    repo: "libs-npm-local",
)


As you can see in the example above, the resolver and deployer should each have a unique ID, so that they can be referenced later in the script. In addition, they include an Artifactory server ID and the names of the npm repositories to resolve from and deploy to.

Now we can use the rtNpmInstall closure to resolve the npm dependencies. Notice that the closure references the resolver we defined above.

rtNpmInstall (
    // Optional tool name from Jenkins configuration
    tool: NPM_TOOL,
    // Optional path to the project root. If not set, the root of the workspace is assumed as the root project path.
    path: "npm-example",
    // Optional npm flags or arguments.
    args: '--verbose',
    resolverId: "resolver-unique-id"
)

And to pack and publish the npm package our project creates, we use the rtNpmPublish closure with a reference to the deployer we defined.

	rtNpmPublish (
		// Optional tool name from Jenkins configuration
		tool: 'npm-tool-name',
		// Optional path to the project root. If not set, the root of the workspace is assumed as the root project path.
		path: "npm-example",
		deployerId: "deployer-unique-id"
	)

The build uses the npm executable to install (download the dependencies) and also to pack the resulting npm package before publishing it. By default, Jenkins uses the npm executable that is present in the agent's PATH. You can also reference a tool defined in the Jenkins configuration. Here's how:

	environment {
		// Path to the NodeJS home directory (not to the npm executable)
		NODEJS_HOME = 'path/to/the/nodeJS/home'
    }
	// or
	environment {
		// If a tool named 'nodejs-tool-name' is defined in Jenkins configuration.
		NODEJS_HOME = "${tool 'nodejs-tool-name'}"
    }
	// or
	nodejs(nodeJSInstallationName: 'nodejs-tool-name') {
    	// Only in this code scope, the npm defined by 'nodejs-tool-name' is used.
	}

If the npm installation is not set, the npm executable which is found in the agent's PATH is used.



Docker Builds with Artifactory

Since version 3.0.0, the Jenkins Artifactory Plugin supports a Pipeline DSL that allows collecting and publishing build-info to Artifactory for your Docker builds. To set up your Jenkins build agents to collect build-info for your Docker builds, please refer to the setup instructions. Here's an example:

rtServer (
    id: "Artifactory-1",
    url: "http://my-artifactory-domain/artifactory",
    credentialsId: "my-credentials-id"
)

rtDockerPush(
    serverId: "Artifactory-1",
    image: ARTIFACTORY_DOCKER_REGISTRY + '/hello-world:latest',
    // Host:
    // On OSX: "tcp://127.0.0.1:1234"
    // On Linux can be omitted or null
    host: HOST_NAME,
    targetRepo: 'docker-local',
    // Attach custom properties to the published artifacts:
    properties: 'project-name=docker1;status=stable'
)

rtPublishBuildInfo (
    serverId: "Artifactory-1"
)



Scanning Builds with JFrog Xray

The Jenkins Artifactory Plugin is integrated with JFrog Xray through JFrog Artifactory, allowing you to have build artifacts scanned for vulnerabilities and other issues. If issues or vulnerabilities are found, you may choose to fail the build. This integration requires JFrog Artifactory v4.16 and above and JFrog Xray v1.6 and above.

You may scan any build that has been published to Artifactory. It does not matter when the build was published, as long as it was published before triggering the scan by JFrog Xray.

The following instructions show you how to configure your Pipeline script to have a build scanned.

rtServer (
    id: "Artifactory-1",
    url: "http://my-artifactory-domain/artifactory",
    credentialsId: "my-credentials-id"
)

xrayScan (
    serverId: "Artifactory-1",
	// If the build name and build number are not set here, the current job name and number will be used:
	buildName: "my-build-name",
	buildNumber: "17",    
	// If the build is found vulnerable, the job will fail by default. If you do not wish it to fail:
	failBuild: false
)