Automated canary releasing with Jenkins on DC/OS with Vamp: Part 1

In this series of articles we will dive into all the steps necessary to get started with canary releasing in a typical Jenkins CI/CD pipeline:

  1. Set up a Jenkins instance on DC/OS to talk to Vamp.

  2. Whip up a simple test application and push it to Github.

  3. Wrap your apps in a Docker container

  4. Create a Vamp “base” blueprint.

  5. Build a script to call the correct Vamp REST endpoints.

  6. Run build jobs to iterate on our app and release them “à la canary”.

  7. Perform complex, multi-version deployments.


To get started, you need to have a DC/OS cluster. There are many ways to go about this, but we like and use the following options:

Next up, install Vamp on DC/OS using the available Universe package or follow our installation guides for a manual setup. At the end, you should have Vamp running as a DC/OS service and you can access the Vamp UI by clicking on the link in the service overview.

Finally, install Jenkins, also available as a Universe package, and pin it to a specific host so we don’t have to worry about shared storage.

Building an application with Jenkins

For demo purposes we will use an (extremely) simple, one-page Node.js “app” that allows us to explore multiple deployment scenarios.

  • Fork the repo into your own account and then clone a local copy to your machine.

  • Install the Jenkins NodeJS and docker-build-step plugins from the Manage Jenkins tab. After installing, your plugins tab should have the following entries:

  • Configure both plugins so the latest NodeJS and Docker versions are installed. The configuration tabs should look as follows:

  • As we will be pushing Docker containers, add the credentials for your Docker Hub repo on the Credentials tab, accessible from the dashboard. Give the credential set a descriptive ID, like “docker-hub-login”.

The next step is to create a Jenkins Pipeline that will execute all our build, test and deploy steps. Jenkins pipelines allow you to script discrete steps in the CI/CD pipeline and commit this to a Jenkinsfile and store it in your Git repo. Let’s start:

  • Create a new Pipeline project from the dashboard, just call it “simple service pipeline”.

  • Set the Pipeline Definition to “Pipeline script” and paste in the following Groovy script. Be careful to replace the variables gitRepo, dockerHub, dockerHubCreds, dockerRepo and dockerImageName with your own settings.
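A minimal sketch of such a scripted pipeline is shown below. The variable values, the tool name “node-latest” and the stage bodies are placeholders to adapt to your own setup; the credentials binding assumes the Credentials Binding plugin that ships with modern Jenkins installs.

```groovy
// Sketch of a scripted Jenkins pipeline — all values below are placeholders.
def gitRepo         = 'https://github.com/your-account/simpleservice.git'
def dockerHub       = 'https://index.docker.io/v1/'
def dockerHubCreds  = 'docker-hub-login'   // ID of the credentials added earlier
def dockerRepo      = 'your-account'
def dockerImageName = 'simpleservice'

node {
    // Put the NodeJS installation configured in Manage Jenkins on the PATH
    def nodeHome = tool 'node-latest'
    env.PATH = "${nodeHome}/bin:${env.PATH}"

    stage('Checkout') {
        git url: gitRepo
    }
    stage('Install') {
        sh 'npm install'
    }
    stage('Test') {
        sh 'npm test'
    }
    // Tag the image with the version declared in package.json
    def appVersion = sh(
        script: "node -p \"require('./package.json').version\"",
        returnStdout: true
    ).trim()
    stage('Build Docker image') {
        sh "docker build -t ${dockerRepo}/${dockerImageName}:${appVersion} ."
    }
    stage('Push Docker image') {
        withCredentials([usernamePassword(credentialsId: dockerHubCreds,
                                          usernameVariable: 'DOCKER_USER',
                                          passwordVariable: 'DOCKER_PASS')]) {
            sh 'docker login -u $DOCKER_USER -p $DOCKER_PASS'
            sh "docker push ${dockerRepo}/${dockerImageName}:${appVersion}"
        }
    }
    stage('Cleanup') {
        sh 'rm -rf node_modules'
    }
}
```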

The above script defines all the stages in our pipeline, from checkout, via install and test, to building & pushing the resulting Docker image and performing some cleanup.

Important to notice here is that we tag our Docker image based on the value of version in the package.json. We store that value in appVersion and use it to push our Docker image. This will become important later, when we need to devise a versioning strategy for canary releasing new versions onto existing versions.
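In isolation, that tagging step looks roughly like this (assuming node is on the PATH of the build agent, and using a placeholder repo name):

```groovy
// Read the semver string straight out of package.json at build time
def appVersion = sh(
    script: "node -p \"require('./package.json').version\"",
    returnStdout: true
).trim()
// A version of 1.0.0 yields the image tag your-account/simpleservice:1.0.0,
// so 1.1.0 can later run side by side with 1.0.0 during a canary release.
sh "docker build -t your-account/simpleservice:${appVersion} ."
```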

You should now be all set to trigger a first build by clicking the Build Now button. If everything goes well, your pipeline’s dashboard will light up green and the console output of the build job should be similar to the shortened version listed below.

Jenkins pipeline project stage view

```
[Pipeline] {
[Pipeline] tool
Unpacking ...
[Pipeline] { (Checkout)
[Pipeline] git
Cloning the remote Git repository
[Pipeline] { (Install)
+ npm install
added 256 packages in 9.27s
[Pipeline] { (Test)
+ npm test
1 tests complete
Test duration: 290 ms
[Pipeline] { (Build Docker image)
+ docker build -t magneticio/simpleservice .
Successfully built d602da61bbe7
[Pipeline] { (Push Docker image)
+ docker push
+ docker push
[Pipeline] { (Cleanup)
+ rm node_modules -rf
[Pipeline] End of Pipeline
Finished: SUCCESS
```

Deploying to Vamp

Great! We are now building containers based on our source code and pushing them to Docker Hub. Let’s start deploying our freshly baked Docker container to our Vamp instance. But before we continue, some observations about our build pipeline:

  1. We’re not triggering builds yet on commit, but this is trivial to add later.

  2. We’re using Node.js, but you can of course use any language and/or platform. The build artefacts just need to be shipped in a Docker container.

With that out of the way, we need to get Jenkins talking to Vamp. For this, we need the Vamp service endpoint inside the DC/OS network. Grab it from the service configuration tab. In my case this is:

Using this endpoint, we can talk to Vamp inside the DC/OS cluster without having to provide any credentials. That is exactly what we do in this additional stage in our pipeline script.

This script does the following:

  1. Sets the vampHost and vampDeploymentName variables.

  2. Adds the Deploy stage to the set of stages.

  3. POSTs the vamp_blueprint.yml blueprint file from our repo to Vamp. This blueprint is the initial starting point for our application. Read more about what blueprints are and how they work.

  4. Creates a deployment based on this blueprint using a PUT request.
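The steps above can be sketched as an extra stage like the following. The vampHost value is a placeholder for the service endpoint you grabbed from the DC/OS service configuration tab, and the /api/v1 REST paths are assumptions based on Vamp’s API documentation; adjust both to match your installation.

```groovy
// Sketch of the additional Deploy stage — placeholder endpoint and names.
def vampHost           = 'http://vamp.marathon.l4lb.thisdcos.directory:8080'
def vampDeploymentName = 'simpleservice'

stage('Deploy') {
    // Register the blueprint from the repo with Vamp
    sh "curl -s -X POST --data-binary @vamp_blueprint.yml " +
       "-H 'Content-Type: application/x-yaml' " +
       "${vampHost}/api/v1/blueprints"
    // Create (or update) the deployment based on that blueprint
    sh "curl -s -X PUT --data-binary @vamp_blueprint.yml " +
       "-H 'Content-Type: application/x-yaml' " +
       "${vampHost}/api/v1/deployments/${vampDeploymentName}"
}
```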

Update the pipeline script with the new stage (here is a link to the full version of the new pipeline script) and run the build again. You should find a newly added blueprint and deployment in your Vamp UI.

Vamp showing the initial deployment

Also, you should find an additional stage added to the Stage view dashboard, giving you a nice, semi real-time, overview of all stages in the full CI/CD pipeline.

In the Vamp UI, go to the simpleservice gateway marked as “external” (Gateways → simpleservice/simpleservice/web) and click on the host/port link. This should pull out a sidebox that shows the output of port 3000 of our app: a single, blue HTML page with a short message and the current version number.

Wrap up

Congratulations! You’ve worked through all of the tedious installation and setup stuff and are now ready to really start fleshing out this CI/CD pipeline and explore some of the rather cool functions Vamp offers in conjunction with DC/OS and Jenkins.

What's next?