Before you Begin
Before trying this example, ensure that you have:
- A GitHub account. This is required for forking the sample repository.
- A JFrog Platform account, or self-hosted JFrog Pipelines.
- At least one node pool. This is the set of nodes that all pipeline steps will execute in. For more information, see Managing Pipelines Node Pools. Please note that if you have a Cloud account, a node pool will already be available as part of your subscription.
- It is recommended that you run the go mod tidy command on your local machine to produce the go.sum file and then push it to your Git repo.
Running This Example
Please follow the steps below to build your Go binary:
Fork the repository
This Pipelines sample is available in the jfrog-pipelines-go-sample repository in the JFrog GitHub account. The configuration is included in YAML files at the root of the repository:
pipelines.yml, which contains the declarations for all the resources and steps required to run the pipeline. This configuration is written in template format, so you will not need to change anything in this file.
values.yml, which contains custom values that will be populated into the template to create your pipeline.
Fork this repository to your account or organization. This is important because you need admin access to the repositories used in your pipelines, so that webhooks can be added to these repositories to listen for change events.
The Git repository includes a go.sum file. However, it is recommended that you run the go mod tidy command on your local machine to produce a new go.sum file and then push it to your Git repo.
Create the required Go repositories
You will need to create the following repositories, which will be used in your pipeline configuration:
- go-local: A local Go repository where your binary will be published
- go-remote: A remote Go repository that proxies https://proxy.golang.org/
- go-virtual: A virtual Go repository that aggregates local and remote repositories and is used in your pipeline definition to resolve dependencies. Please ensure that you select your local and remote repositories in the Repositories section while creating this virtual repo.
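If you prefer to script this instead of using the UI, each repository can also be created through Artifactory's repository configuration REST API (one PUT request to /api/repositories/{repoKey} per repository). The payloads below are a minimal sketch: only the key, rclass, packageType, url, and repositories fields are shown, and anything beyond the repository keys listed above is an assumption about your setup.

```json
{"key": "go-local", "rclass": "local", "packageType": "go"}

{"key": "go-remote", "rclass": "remote", "packageType": "go",
 "url": "https://proxy.golang.org/"}

{"key": "go-virtual", "rclass": "virtual", "packageType": "go",
 "repositories": ["go-local", "go-remote"]}
```

Note that the virtual repository must be created last, since it references the local and remote repositories in its repositories list.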
Add Integrations
Go to Administration | Pipelines | Integrations to add two integrations:
- GitHub Integration: This integration is used to add the Pipeline source, as well as the GitRepo resource.
- Artifactory Integration: This integration is used to authenticate with Artifactory to resolve dependencies and publish the built binary.
Update pipeline definitions
The pipelines.yml config file is templatized, so you only need to update values.yml in your forked repository by following the instructions below.
| Tag | Description | Example |
| --- | --- | --- |
|  | Provide the name of the GitHub integration you added in the previous step (4). |  |
|  | Provide the path to your fork of this repository. |  |
|  | Provide the name of the Artifactory integration you added in the previous step (4). |  |
And that's it. Your configuration is ready to go!
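Put together, a filled-in values.yml might look like the following sketch. The key names here are assumptions based on typical Pipelines templates; match them to the tags actually defined in the pipelines.yml template in your fork.

```yaml
# Illustrative values.yml -- key names and values are assumptions;
# use the tag names defined in the template in your fork.
gitProvider: my_github_integration        # GitHub integration name (step 4)
path: your-account/jfrog-pipelines-go-sample   # path to your fork
artifactory: my_artifactory_integration   # Artifactory integration name (step 4)
```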
All pipeline definitions are global across JFrog Pipelines within a Project. The names of your pipelines and resources need to be unique within the Project in JFrog Pipelines.
Add Pipeline Sources
A Pipeline Source represents the Git repository where your pipeline definition files are stored. A pipeline source connects to the repository through an integration, which we added in step 4.
- In your left navigation bar, go to Administration | Pipelines | Pipeline Sources. Click Add a Pipeline Source, then choose From YAML and follow the instructions to add a Pipeline Source. This automatically adds your configuration to the platform, and pipelines are created based on your YAML.
- After your pipeline source syncs successfully, navigate to Pipelines | My Pipelines in the left navbar to view the newly added pipeline. In this example, go_build_pipeline_example is the name of your pipeline.
- Click the name of the pipeline. This renders a real-time, interactive diagram of the pipeline and the results of its most recent run.
Execute the Pipeline
You can trigger the pipeline by committing a change to your repository, or by manually triggering it through the UI.
Success!
You have successfully executed the sample Go application pipeline! You can verify the results by viewing the binary and the build created by the pipeline.
Navigate to Application | Artifactory | Artifacts and you will find your published binary under the go-local repository:
Navigate to Application | Artifactory | Builds to view your published build.
Explanation of pipeline definition
Let us now take a look at the pipeline definition files and what each section means.
The pipelines.yml file contains the templatized definition of your pipeline. This consists of the following:
- Resources are entities that contain information that is consumed or generated by pipeline steps. In our example, we use the following resources:
- A GitRepo resource pointing to the source control repository where your application code is present. You can configure this resource to trigger dependent steps on specific events.
- A BuildInfo resource that is a pointer to the Build on Artifactory. This is automatically created by the PublishBuildInfo step.
- Steps are executable units that form your pipeline. In our example, the pipeline consists of the following steps:
- A GoBuild native step that builds your Go project. This step is a pre-packaged step (i.e. native step) that is available to use with simple configuration and without the need for custom scripting. Detailed information on GoBuild is available here.
- A GoPublishBinary native step that publishes your Go binary to Artifactory. This step also does not require custom scripting. Detailed information on GoPublishBinary is available here.
- A PublishBuildInfo step is a native step that gathers build metadata and pushes it to Artifactory. Artifactory Builds provide a manifest and include metadata about included modules, dependencies and other environment variables. Detailed information on PublishBuildInfo is available here.
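The resources and steps above can be sketched as a pipelines.yml along the following lines. Resource and step names are illustrative, the {{ .Values.* }} tags assume the key names used in your values.yml, and the exact configuration fields should be checked against the linked native-step documentation rather than taken as definitive.

```yaml
resources:
  # Source repository; triggers the pipeline on commits to the included branch
  - name: go_sample_repo                      # illustrative name
    type: GitRepo
    configuration:
      gitProvider: {{ .Values.gitProvider }}  # assumed tag name
      path: {{ .Values.path }}                # e.g. your fork's path
      branches:
        include: main                         # branch name is an assumption

  # Build metadata; populated automatically by the PublishBuildInfo step
  - name: go_sample_build_info
    type: BuildInfo
    configuration:
      sourceArtifactory: {{ .Values.artifactory }}

pipelines:
  - name: go_build_pipeline_example
    steps:
      - name: build_go
        type: GoBuild                    # native step: compiles the Go project
        configuration:
          sourceLocation: .
          repository: go-virtual         # resolves dependencies via the virtual repo
          outputLocation: /tmp/dist      # illustrative output directory
          outputFile: sample-app         # illustrative binary name
          integrations:
            - name: {{ .Values.artifactory }}
          inputResources:
            - name: go_sample_repo

      - name: publish_binary
        type: GoPublishBinary            # native step: uploads the binary
        configuration:
          targetRepository: go-local     # where the binary is published
          integrations:
            - name: {{ .Values.artifactory }}
          inputSteps:
            - name: build_go

      - name: publish_build_info
        type: PublishBuildInfo           # native step: pushes build metadata
        configuration:
          inputSteps:
            - name: publish_binary
          outputResources:
            - name: go_sample_build_info
```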