Tutorial: Continuous Delivery in the Cloud, Part 2 of 6
In part 1 of this series, I introduced the Continuous Delivery (CD) pipeline for the Manatee Tracking application and described how we use this pipeline to deliver software from check-in to production. In this article I will take an in-depth look at the CD pipeline. The topics for each of the articles in the series are summarized below.
- Part 1: Introduction – Introduction to continuous delivery in the cloud and the rest of the articles
- Part 2: CD Pipeline – What you’re reading now
- Part 3: CloudFormation – Scripted virtual resource provisioning
- Part 4: Dynamic Configuration – “Property file less” infrastructure
- Part 5: Deployment Automation – Scripted deployment orchestration
- Part 6: Infrastructure Automation – Scripted environment provisioning
The CD pipeline consists of five Jenkins jobs. These jobs are configured to run one after the other. If any one of the jobs fails, the pipeline fails and that release candidate cannot be released to production. The five Jenkins jobs are listed below (further details on these jobs are provided later in the article).
- A job that sets the variables used throughout the pipeline (SetupVariables)
- Build job (Build)
- Production database update job (StoreProductionData)
- Target environment creation job (CreateTargetEnvironment)
- A deployment job (DeployManateeApplication), which enables one-click deployment into production
Jenkins plugins extend the standard Jenkins setup with additional features. The plugins we use for the Sea to Shore Alliance Continuous Delivery configuration are listed below.
- Grails: http://updates.jenkins-ci.org/download/plugins/grails/1.5/grails.hpi
- Groovy: http://updates.jenkins-ci.org/download/plugins/groovy/1.12/groovy.hpi
- Subversion: http://updates.jenkins-ci.org/download/plugins/subversion/1.40/subversion.hpi
- Parameterized Trigger: http://updates.jenkins-ci.org/download/plugins/parameterized-trigger/2.15/parameterized-trigger.hpi
- Copy Artifact: http://updates.jenkins-ci.org/download/plugins/copyartifact/1.21/copyartifact.hpi
- Build Pipeline: http://updates.jenkins-ci.org/download/plugins/build-pipeline-plugin/1.2.3/build-pipeline-plugin.hpi
- Ant: http://updates.jenkins-ci.org/download/plugins/ant/1.1/ant.hpi
- S3: http://updates.jenkins-ci.org/download/plugins/s3/0.2.0/s3.hpi
The Parameterized Trigger, Build Pipeline, and S3 plugins are used for moving the application through the pipeline jobs. The Ant, Groovy, and Grails plugins are used for running the build of the application code. The Subversion plugin is used for polling and checking out from version control.
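If you’re recreating a similar setup, the same plugins can be installed from the Jenkins UI or, as a rough sketch, from the command line with the Jenkins CLI (this assumes jenkins-cli.jar has been downloaded from your Jenkins server, which here is assumed to run at localhost:8080):

```bash
# Sketch: install the pipeline's plugins via the Jenkins CLI.
# Assumes jenkins-cli.jar is present and the CLI is allowed to connect
# (anonymous or token-based access configured on the server).
JENKINS_URL=http://localhost:8080

for plugin in grails groovy subversion parameterized-trigger copyartifact build-pipeline-plugin ant s3; do
  java -jar jenkins-cli.jar -s "$JENKINS_URL" install-plugin "$plugin"
done

# Restart Jenkins so the newly installed plugins are loaded.
java -jar jenkins-cli.jar -s "$JENKINS_URL" safe-restart
```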
Below, I describe each of the jobs that make up the CD pipeline in greater detail.
SetupVariables: Jenkins job used for entering the property values that are propagated through the rest of the pipeline.
Parameter: STACK_NAME
Type: String
Where: Used in both CreateTargetEnvironment and DeployManateeApplication jobs
Purpose: Defines the CloudFormation Stack name and SimpleDB property domain associated with the CloudFormation stack.
Parameter: HOST
Type: String
Where: Used in both CreateTargetEnvironment and DeployManateeApplication jobs
Purpose: Defines the CNAME of the domain created in the CreateTargetEnvironment job. The DeployManateeApplication job uses it when it dynamically creates configuration files. For instance, in test.oneclickdeployment.com, test would be the HOST.
Parameter: PRODUCTION_IP
Type: String
Where: Used in the StoreProductionData job
Purpose: Sets the production IP for the job so that it can SSH into the existing production environment and run a database script that exports the data and uploads it to S3.
Parameter: deployToProduction
Type: Boolean
Where: Used in both CreateTargetEnvironment and DeployManateeApplication jobs
Purpose: Determines whether to use the development or production SSH keypair.
In order for the parameters to propagate through the pipeline, we pass the current build parameters to each downstream job using the Parameterized Trigger plugin.
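Each downstream job then sees these values as ordinary build parameters, which Jenkins exposes as environment variables. For illustration only (the keypair paths below are hypothetical, not the project’s actual configuration), a shell build step in a downstream job could reference them like this:

```bash
# Illustrative shell build step in a downstream job.
# Jenkins exposes the passed-in build parameters as environment variables.
echo "Stack name:      ${STACK_NAME}"
echo "Host (CNAME):    ${HOST}"
echo "Production IP:   ${PRODUCTION_IP}"
echo "Deploy to prod?: ${deployToProduction}"

# Choose the SSH keypair based on the deployToProduction flag.
if [ "${deployToProduction}" = "true" ]; then
  SSH_KEY=/path/to/production-keypair.pem   # hypothetical key location
else
  SSH_KEY=/path/to/development-keypair.pem  # hypothetical key location
fi
```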
Build: Compiles the Manatee application’s Grails source code and creates a WAR file.
To do this, we utilize the Jenkins Grails plugin and run Grails targets such as compile and prod war. Next, we archive the Grails migrations for use in the DeployManateeApplication job, and then the job pushes the Manatee WAR up to S3, which is used as an artifact repository.
Lastly, using the Parameterized Trigger plugin, we trigger the StoreProductionData job with the current build parameters.
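The job itself uses the Jenkins Grails and S3 plugins for these steps; a rough command-line equivalent, with a hypothetical bucket name and assuming the Grails and AWS command-line tools are installed, looks something like this:

```bash
# Sketch of the Build job's steps expressed as shell commands.
# Bucket name and paths are hypothetical.
grails compile
grails prod war

# Archive the Grails database migrations for the DeployManateeApplication job.
tar czf migrations.tar.gz grails-app/migrations

# Push the WAR to S3, which acts as the artifact repository.
aws s3 cp target/manatee.war s3://manatee-artifacts/builds/manatee-${BUILD_NUMBER}.war
```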
StoreProductionData: This job performs a pg_dump (a PostgreSQL database dump) of the production database and then stores the dump in S3 for the environment creation job to use when building up the environment.
A database script is stored on the target environments created using the CD pipeline. The script connects to the PostgreSQL database, runs a pg_dump, and then pushes the resulting SQL file to S3 to be used when creating the target environment.
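A minimal sketch of that kind of script – assuming pg_dump and the AWS command-line tools are available on the host, and using hypothetical database and bucket names – might look like this:

```bash
#!/bin/bash
# Sketch: dump the production PostgreSQL database and push it to S3.
# Database name, credentials, and bucket are hypothetical.
set -e

DUMP_FILE=/tmp/production_backup.sql

# Export the production data (password supplied via ~/.pgpass or PGPASSWORD).
pg_dump -h localhost -U manatee_user -f "${DUMP_FILE}" manatee_production

# Upload the dump for the CreateTargetEnvironment job to consume.
aws s3 cp "${DUMP_FILE}" s3://manatee-database-backups/production_backup.sql
```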
After the SQL file is stored successfully, the CreateTargetEnvironment job is triggered.
CreateTargetEnvironment: Creates a new target environment. The job uses a CloudFormation template to create all of the AWS resources and calls Puppet to provision the environment itself, taking it from a base operating system to a fully working target environment ready for deployment.
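As an illustration of the same idea using today’s AWS CLI (the original job used the AWS tooling available at the time; the template location, parameter names, and key name below are hypothetical), creating the stack and waiting for it to finish might look like this:

```bash
#!/bin/bash
# Sketch: create the target environment's CloudFormation stack and wait on it.
# Template location, parameter names, and key name are hypothetical.
set -e

aws cloudformation create-stack \
  --stack-name "${STACK_NAME}" \
  --template-body file://target-environment.template \
  --parameters ParameterKey=HostName,ParameterValue="${HOST}" \
               ParameterKey=KeyName,ParameterValue=development \
  --capabilities CAPABILITY_IAM

# Block until the stack (and the Puppet run it bootstraps) has finished.
aws cloudformation wait stack-create-complete --stack-name "${STACK_NAME}"
```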
Once the environment is created, a set of Cucumber tests is run to ensure it’s in the correct working state. If any test fails, the entire pipeline fails and the developer is notified that something went wrong. If the tests pass, the DeployManateeApplication job is kicked off and an AWS SNS email notification with information on how to access the new instance is sent to the developer.
DeployManateeApplication: Runs a Capistrano script that coordinates the steps of the deployment.
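As a sketch of the idea, the job essentially invokes Capistrano with the pipeline’s parameters; the task and variable names below are hypothetical, and the -S flag is Capistrano 2 syntax:

```bash
#!/bin/bash
# Sketch: kick off the Capistrano deployment from the Jenkins job.
# Task and variable names are hypothetical.
set -e

bundle exec cap deploy \
  -S host="${HOST}" \
  -S stack_name="${STACK_NAME}" \
  -S deploy_to_production="${deployToProduction}"
```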
This deployment job is the final piece of the delivery pipeline; it pulls together all of the pieces created in the previous jobs to deliver working software.
During the deployment, the Capistrano script SSHes into the target server, deploys the new WAR and updated configuration changes, and restarts all services. Then the Cucumber tests are run to ensure the application is available and running successfully. Assuming the tests pass, an AWS SNS email is dispatched to the developer with information on how to access their new development application.
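The notification itself is a simple SNS publish; a sketch using the AWS CLI (the topic ARN and message text are hypothetical) looks like this:

```bash
# Sketch: notify the developer once the deployment and Cucumber tests pass.
# The topic ARN and message text are hypothetical.
aws sns publish \
  --topic-arn arn:aws:sns:us-east-1:123456789012:manatee-deployments \
  --subject "Manatee deployment complete" \
  --message "The Manatee application is available at http://${HOST}.oneclickdeployment.com"
```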
We use Jenkins as the orchestrator of the pipeline. Jenkins executes a set of scripts and passes parameters along as it runs each job. Because of the role Jenkins plays, we want to make sure it’s treated the same way as the application – meaning we version and test all of our changes to the system. For example, if a developer modifies the create environment job configuration, we want the ability to revert back if necessary. Because of this requirement, we version the Jenkins configuration: the jobs, plugins, and main configuration. To do this, a script is executed each hour using cron.hourly that checks for new jobs or updated configuration and commits them up to version control.
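As a rough sketch of that hourly script – the paths are hypothetical, and it assumes the Jenkins home directory is already a Subversion working copy – it might look like this:

```bash
#!/bin/bash
# Sketch: hourly commit of the Jenkins configuration to version control.
# Assumes /var/lib/jenkins is already a Subversion working copy with build
# artifacts and workspaces excluded via svn:ignore.
cd /var/lib/jenkins

# Pick up the main configuration and each job's config.xml, including new jobs.
svn add --force --parents --quiet *.xml jobs/*/config.xml

# Commit anything that is new or has changed.
svn commit -m "Automated hourly Jenkins configuration backup"
```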
The CD pipeline that we have built for the Manatee application enables any change in the application, infrastructure, database, or configuration to move through to production seamlessly using automation. This allows any new features, security fixes, etc. to be fully tested as they are delivered to production at the click of a button.
In the next part of our series – which is all about using CloudFormation – we’ll go through a CloudFormation template used to automate the creation of a Jenkins environment. In this next article, you’ll see how CloudFormation procures AWS resources and provisions our Jenkins CD Pipeline environment.