Continuous Integration (CI) is the software development practice of merging code changes into the main branch as often as possible, then validating by creating a build and running automated tests against the build.
This practice of validating incremental changes as they are made helps assure ongoing functional quality, and avoids the complex challenges that arise from a large merge of accumulated code changes. Continuous Integration workflows help software teams ensure that their changes are built and tested with the latest version of the entire codebase as those changes are made. As a result, most bugs are found almost immediately after the code change is committed, leading to better quality since each bug can be easily isolated to a specific code change and fixed promptly.
The chief objective of a CI workflow is to produce a build that is ready for delivery to the first stage of evaluation. As part of the JFrog Platform, Pipelines helps assure those builds include all the metadata that Artifactory and Xray enable, so they are fully traceable, searchable, and secure.
A Pipelines CI workflow that fully utilizes the JFrog Platform may perform the following sequence for a containerized build:
- A new commit is made to a source code repository, such as in GitHub.
- GitHub notifies Pipelines (through a webhook) of the change.
- Pipelines commences automatic execution of the CI workflow.
- The workflow distributes the steps that create a build across available execution nodes, where each step executes in a runtime container provisioned to meet the needs of the step.
- All intermediate binaries produced by the steps are stored in Artifactory repositories with build information.
- If successful, the build produced by the workflow is pushed to a Docker repository in Artifactory, with build information.
- The completed build is scanned by Xray to identify security vulnerabilities and verify compliance with license policies.
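As a rough sketch, the sequence above might be declared in Pipelines YAML along the following lines. All resource, integration, and repository names here (`app_repo`, `artifactory_integration`, `docker-local`, and so on) are placeholder assumptions, and the exact configuration keys for each native step should be checked against the Pipelines documentation:

```yaml
resources:
  - name: app_repo                  # GitHub source; new commits trigger the run
    type: GitRepo
    configuration:
      gitProvider: github_integration
      path: myorg/myapp

  - name: app_build_info            # build metadata published to Artifactory
    type: BuildInfo
    configuration:
      sourceArtifactory: artifactory_integration

pipelines:
  - name: ci_pipeline
    steps:
      - name: build_image           # build the container from the new commit
        type: DockerBuild
        configuration:
          dockerFileLocation: .
          dockerFileName: Dockerfile
          dockerImageName: myco.jfrog.io/docker-local/myapp
          dockerImageTag: ${run_number}
          inputResources:
            - name: app_repo
          integrations:
            - name: artifactory_integration

      - name: push_image            # push to a Docker repository in Artifactory
        type: DockerPush
        configuration:
          targetRepository: docker-local
          inputSteps:
            - name: build_image
          integrations:
            - name: artifactory_integration

      - name: publish_build         # attach build information to the build
        type: PublishBuildInfo
        configuration:
          inputSteps:
            - name: push_image
          outputResources:
            - name: app_build_info

      - name: scan_build            # Xray scan for vulnerabilities and licenses
        type: XrayScan
        configuration:
          inputResources:
            - name: app_build_info
```

Because each step names its inputs, Pipelines can infer the execution order and trigger the whole chain automatically from the webhook on `app_repo`.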
Continuous Delivery (CD) is the software operations practice of automating the delivery of builds to selected infrastructure environments, such as development, test, and production.
A CD workflow automates the delivery and validation of completed builds through the various stages for each set of stakeholders in the DevOps process. It helps ensure that code released into production is of high quality, and can greatly speed the update process so that the most reliable software version is always running.
As part of the JFrog Platform, Pipelines also helps teams follow best DevOps practices of build promotion enabled by Artifactory, and the accumulation of metadata as builds pass through each stage of evaluation.
A Pipelines CD workflow that fully utilizes the JFrog Platform may perform the following sequence for a containerized build:
- A CI workflow in Pipelines produces a new build.
- Successful completion of the CI workflow triggers execution of the CD workflow.
- The Pipelines CD workflow steps execute to:
  - Provision a development environment, deploy the build, and validate it with development-level tests, adding metadata on success.
  - Promote the build to a test stage repository, provision a test environment, and run QA tests on the build, adding metadata.
  - Promote the build to a production repository.
- If needed, the Pipelines CD workflow can automatically update production environments with the newly validated build.
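A corresponding CD workflow might be sketched as follows. Again, every name is a placeholder, the Bash steps stand in for whatever deployment and test tooling a team actually uses, and `PromoteBuild` is the native Pipelines step for Artifactory build promotion:

```yaml
resources:
  - name: app_build_info            # BuildInfo produced by the CI pipeline
    type: BuildInfo
    configuration:
      sourceArtifactory: artifactory_integration

  - name: promoted_build_info       # the build after promotion to test
    type: BuildInfo
    configuration:
      sourceArtifactory: artifactory_integration

pipelines:
  - name: cd_pipeline
    steps:
      - name: deploy_dev            # deploy and validate in development
        type: Bash
        configuration:
          inputResources:
            - name: app_build_info  # a new build from CI triggers this step
        execution:
          onExecute:
            - ./provision.sh dev && ./deploy.sh dev && ./test.sh dev

      - name: promote_to_test       # promote, then run QA against the build
        type: PromoteBuild
        configuration:
          targetRepository: docker-test-local
          inputSteps:
            - name: deploy_dev
          inputResources:
            - name: app_build_info
          outputResources:
            - name: promoted_build_info

      # promotion to the production repository follows the same
      # PromoteBuild pattern, with docker-prod-local as the target
```

Each promotion records metadata on the build in Artifactory, so the build's history through the stages remains queryable.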
To support DevOps teams, IT Operations needs to automate activities like infrastructure provisioning, image building, and security patching to avoid configuration drift, ensure predictability and repeatability, and make your software delivery process much more reliable. For example, intermediate environments like Test and Staging should closely mirror production so you can catch bugs before they reach production and reproduce them in any environment. Your infrastructure should be provisioned exactly the same way each time to avoid problems caused by configuration errors.
Pipelines helps you automate IT Operations by providing the following functionality:
- Integrations with popular tools like Terraform, Ansible, Chef, and Puppet help automate configuration of your environments. You can store your provisioning scripts in your source control repository; any time your scripts change, the environment is updated, triggering the rest of your DevOps workflow. You can even manage your VPCs and networking configuration as code.
- You can easily create a dependency tree of all applications that are deployed into your environments. If there is an update to an environment, applications can automatically be re-deployed if needed.
- You can easily transfer information like subnet_id and security_group_id to downstream activities, such as EC2 provisioners, as part of your workflow. No more manual copy-paste.
- You can templatize your scripts and inject variables at runtime to make sure the right context is set for each execution.
- Pipelines offers built-in state storage, so if you need to keep information such as Terraform state files, you don't need to maintain it someplace else. It's stored as part of your workflow and available to any job that needs it.
- You have a complete history of your configuration, and reproducing a previous configuration is a one-click rollback action.
- Your infrastructure provisioning can be part of your overall DevOps workflow, so you can implement advanced scenarios like configuring Pipelines to bring up on-demand test environments when a new version of your application is available. You can also trigger your tests automatically and tear down the test environment when all tests pass.
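The runtime variable injection and hand-off described above can be sketched like this. `add_run_variables` is a Pipelines utility function for passing values between steps; the resource names, scripts, and Terraform outputs are illustrative assumptions:

```yaml
pipelines:
  - name: provision_pipeline
    steps:
      - name: provision_infra       # runs Terraform scripts from a GitRepo
        type: Bash
        configuration:
          environmentVariables:
            TF_VAR_environment: test    # injected at runtime per execution
          inputResources:
            - name: infra_repo          # GitRepo holding the Terraform scripts
        execution:
          onExecute:
            - cd $res_infra_repo_resourcePath
            - terraform init && terraform apply -auto-approve
            # hand off provisioned IDs to downstream steps; no copy-paste
            - add_run_variables subnet_id=$(terraform output -raw subnet_id)

      - name: deploy_app              # e.g. an EC2 provisioner or deploy script
        type: Bash
        configuration:
          inputSteps:
            - name: provision_infra
        execution:
          onExecute:
            - echo "Deploying into subnet $subnet_id"
```

Because `infra_repo` is an input resource, any change to the provisioning scripts re-runs the pipeline, which is what keeps environments from drifting apart.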