
Build, test and release VM images with vRealize Automation Code Stream and Packer

Packer is a powerful open-source tool for automating builds of your vSphere templates, AWS custom AMIs and Azure images, which can then be deployed with vRealize Automation Cloud Assembly. In this blog post and video I’ll demonstrate how to use vRealize Automation Code Stream to automate the build, test and release of those templates.

Why automate template builds?

Well, can you always be sure that your templates are consistent, up to date with the latest patches, scanned and approved by your security tools?

Imagine a new critical vulnerability has been identified and patched, and you now have to patch your template and run a bunch of tests (e.g. security scans, QA testing, application team testing) before you can release the template to be used in production. Using the tools showcased in this blog, it could be as simple as updating a Packer configuration file and committing that to your source control repository. Your commit can trigger a Code Stream pipeline that then takes over the build process.

The process for generating a new template has three phases – building the images, testing the images, and then releasing the template to production.

  • To build the images, Code Stream clones the Packer configuration files from source control and executes the Packer build to create the new images. In this blog I am only building vSphere templates, but the same principle can be used for AWS and Azure.
  • The Test phase deploys the new images and then performs any automated tests (e.g. triggering a vulnerability scan) or waits for a manual testing group to approve the template (e.g. a member of a QA team logs on to run some checks).
  • As long as the images have been approved in the Test phase, the Release phase updates the current production template images with the newly built and tested images.

Automating this build, test and release loop allows us to operate in a DevOps for Infrastructure pattern – if you want to read more about vRealize Automation and DevOps for Infrastructure, check out the DevOps for Infrastructure blog series (Part 1 – VI Admin to DevOps Champion, Part 2 – Infrastructure as Code and Part 3 – GitOps).

Demo Pipeline Build

Check out this short (~8m) demonstration of the Pipeline in action:

Configuring the Packer Build Code Stream Pipeline

If you’re interested in setting up your own automated Packer builds, you can grab my code from GitHub and modify it to suit your own build/test/release processes.

Fork the Packer Templates GitHub Repository

While it’s possible to set this up with almost any Git repository, this blog post and example code use GitHub, so if you want to follow along you’ll need your own GitHub account. Head over to the fielddemo-packer-templates repository and create a fork in your own account (hit the “Fork” button in the top right). Once the fork completes, you’ll have your own copy of the repository to work from.

Create a new local working copy of your Git repository on your workstation. I’m using Visual Studio Code on my Mac with its built-in Git integration, as well as the Git CLI. Be sure to clone your fork of the repository and not the original (hit the green Code button and copy the URL).
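For reference, cloning a fork over HTTPS from the CLI looks something like this (substitute your own GitHub username):

```bash
# Clone your fork of the repository, not the original
git clone https://github.com/<your-username>/fielddemo-packer-templates.git
cd fielddemo-packer-templates
```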

Configuring the Packer Templates

The Packer configurations in this repository are examples only; you should be familiar with Packer and modify them to suit your own use cases. I would strongly recommend getting your Packer builds working the way you want them before attempting to automate the process.

The two main Packer files are all.json and variables.json.

variables.json

As the name suggests, the variables file provides variables that can be consumed by the Packer builds. Some of the variables are populated from environment variables – this allows the Code Stream pipeline to have some level of customisation over the Packer build. When I’m testing the Packer builds, I create a copy of the variables file (e.g. variables.live.json) and update the variables to static values. This allows me to test my configuration locally, before I run the build pipeline, by specifying the -var-file=variables.live.json flag in my packer build command.
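For example, a local test run against the static variables file looks like this:

```bash
# Build locally using static variable values instead of pipeline-supplied environment variables
packer build -var-file=variables.live.json all.json
```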

The http_server variable points to the HTTPS-accessible config folder in my GitHub repository. Packer can create its own temporary HTTP server to make these files available to the builder; however, if you’re running it as part of the Code Stream CI container it’s not possible to expose the HTTP server – the workaround is to use the GitHub (or any other) web server to provide these files.

This file is also where the variable for the VM name is defined (although I append a build date – more on this later), along with the path to the ISO (this can be local to a datastore, like mine, or a URL) and the SHA256 checksum of the ISO.
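To illustrate, a trimmed-down variables.json might look something like the snippet below – the names and values are illustrative rather than an exact copy of the repository, and the {{env}} function is standard Packer syntax for reading an environment variable:

```json
{
  "vcenter_password": "{{env `VCENTER_PASSWORD`}}",
  "http_server": "https://raw.githubusercontent.com/<your-username>/fielddemo-packer-templates/master/config",
  "vm_name": "ubuntu-server",
  "iso_path": "[Datastore01] ISO/ubuntu-20.04-live-server-amd64.iso",
  "iso_checksum": "<sha256-checksum-of-your-iso>"
}
```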

– Update the ISO details and the HTTP server to match your environment.

all.json

In order to automate the build and allow the pipeline to specify a subset of builds, all of my Packer templates are defined in a single file – all.json. This allows me to use the -only flag to ensure only specific builders run. You can add and remove builder specifications to the builders array – just make sure you include a name property in your JSON specification.

I also use a few Packer provisioners to execute commands once the main builder has completed. These also use an only clause to ensure the correct scripts or commands are run on the correct builder.

As I mentioned in the variables.json section, http_server is used in the boot_command for my Linux builders as floppy support has been dropped by some newer bootloaders.
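Putting those pieces together, a skeleton all.json might look something like this – the builder name, boot command and script path are illustrative placeholders, and a real vsphere-iso builder needs connection and hardware settings that are omitted here for brevity:

```json
{
  "builders": [
    {
      "name": "ubuntu-server",
      "type": "vsphere-iso",
      "vm_name": "{{user `vm_name`}}-{{isotime \"20060102\"}}",
      "boot_command": [
        "... url={{user `http_server`}}/ubuntu.cfg ..."
      ]
    }
  ],
  "provisioners": [
    {
      "type": "shell",
      "only": ["ubuntu-server"],
      "scripts": ["scripts/cleanup.sh"]
    }
  ]
}
```

With named builders, a subset can then be built with, for example, packer build -only=ubuntu-server -var-file=variables.live.json all.json.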

– Update the Builders and Provisioners to match your requirements.

The Git repository also contains several folders:

  • code-stream – Code Stream pipeline definitions as a YAML file
  • config – Packer boot config files for the operating systems
  • drivers – Additional drivers required for the Windows Packer build
  • scripts – Scripts to be executed as part of the OS builds
  • ssh – SSH Public Key file to be used for SSH access

– Update the config/* files to match your requirements and configuration – there are user credentials in these files!

– Update the ssh/id_rsa.pub file to contain your own public key – mine will be useless to you!

– Update the scripts/* files to match your requirements and configuration.
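If you need to generate a fresh keypair for that SSH access, something like the following works on most systems (the file name and comment are just examples):

```bash
# Generate a 4096-bit RSA keypair; copy the contents of the .pub file into ssh/id_rsa.pub
ssh-keygen -t rsa -b 4096 -f ~/.ssh/packer_build -C "packer-template-build"
```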

Once again, I would strongly recommend ensuring your Packer configuration works manually before automating the process. I will also reiterate that the Packer templates in this repository are examples only and you should develop your own build templates.

Configuring vRealize Automation

Prepare the environment

In order to build and test the Packer templates you will need to configure the following infrastructure:

  • a Docker host that is configured for remote API access (see the example after this list), and added to Code Stream as an endpoint
  • a vSphere Cloud Account configured in Cloud Assembly
    • with a correctly configured Cloud Zone
    • and Project that is able to deploy into the Cloud Zone
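As a sketch, on a systemd-based Docker host the remote API can be exposed with a unit override like the one below – note that this example uses unencrypted TCP, which is only appropriate for a lab; use TLS in any shared environment:

```ini
# /etc/systemd/system/docker.service.d/override.conf
[Service]
ExecStart=
ExecStart=/usr/bin/dockerd -H unix:///var/run/docker.sock -H tcp://0.0.0.0:2375
```

After creating the override, reload and restart the daemon with sudo systemctl daemon-reload && sudo systemctl restart docker.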

Importing the Pipeline

Edit code-stream/Packer-Template-Builds.yaml and update the project name to match your environment.
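The project is set near the top of the pipeline YAML; the excerpt below is a sketch of the relevant lines (your exported file may order or name fields slightly differently):

```yaml
# code-stream/Packer-Template-Builds.yaml (excerpt)
project: My-Project    # change to a Project that exists in your environment
kind: PIPELINE
name: Packer-Template-Builds
```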

Open Code Stream > Pipelines and click on the IMPORT button. Select READ FROM FILE and open the code-stream/Packer-Template-Builds.yaml file to import it. Ensure Operation Type is set to Create (the default) and click IMPORT.

Configure the Pipeline

The newly imported pipeline will be disabled – open the pipeline to configure it.

Select the Workspace tab and select your Docker host. The sammcgeown/codestream-ci-packer:latest image is publicly available on Docker Hub if you wish to use that, or you can build your own image.
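If you’d rather build your own CI image, a minimal Dockerfile along the lines of the sketch below should work – the base image and Packer version here are assumptions, so pin whatever suits your environment:

```dockerfile
FROM ubuntu:20.04

# git clones the Packer configuration; curl/unzip fetch the Packer binary
RUN apt-get update \
    && apt-get install -y --no-install-recommends git curl unzip ca-certificates \
    && rm -rf /var/lib/apt/lists/*

# Install a pinned Packer release from the official HashiCorp release archive
ARG PACKER_VERSION=1.6.6
RUN curl -fsSL "https://releases.hashicorp.com/packer/${PACKER_VERSION}/packer_${PACKER_VERSION}_linux_amd64.zip" -o /tmp/packer.zip \
    && unzip /tmp/packer.zip -d /usr/local/bin \
    && rm /tmp/packer.zip
```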

Select the Input tab and edit the default values for your pipeline. If you plan to trigger the pipeline from a Git webhook, you will need to populate all of the values. Alternatively, you could migrate to using Code Stream Variables rather than pipeline inputs – I’ve used inputs for portability.

Finally, open the Model tab, then select each task and ensure the task validates (a green tick should appear, replacing the red exclamation mark).

With the pipeline validated, select ACTIONS, then Enable.

The pipeline should now be runnable!

Next steps…

The Packer code and Code Stream pipeline in this example are fairly basic, but the principles can be applied to create much more complex Packer templates, and to automate highly complex build, test and release processes.

If you want to find out more about vRealize Automation, please visit our website, or, to learn more about our features and explore vRealize Automation Cloud, get started with a free 45-day trial!