An Azure DevOps Workflow for Terraform

Daniel Boesswetter
Nov 18, 2020

Azure DevOps and Terraform are a great team once you find out how. To avoid confusion: this article is not about the Azure DevOps provider for Terraform but about running Terraform from Azure DevOps Pipelines.

Hashicorp’s Terraform is the de-facto standard for Infrastructure as Code (IaC), and it is actively supported by Microsoft as a tool for Microsoft Azure. Azure DevOps (Server or Services) is a Microsoft platform which includes a project management tool called Boards (vaguely comparable to Jira or Trello), a git repository with a web interface called Repos (similar to GitHub or GitLab), and a CI/CD tool called Pipelines (comparable to Jenkins, GitLab CI and others). It is not integrated into the Azure Cloud itself and can be used independently, even for free if five users are enough for you. Terraform is also supported in Azure DevOps, since Microsoft provides “tasks” (the building blocks of pipelines) which can run Terraform as part of the pipeline for you. This integration makes it possible to run Terraform against an Azure subscription (or any other supported cloud provider, for that matter) without the hassle of secret management, because the aforementioned tasks can access any service connection which the user has configured for the Azure DevOps project, including the ones to cloud providers.

This may sound like a great solution, and it actually is, but since it is quite new to the market, there is not yet the wealth of best practices that exists for other technologies. Although there is a lot of documentation, it took me a while to get the workflow that I wanted and know from other tech stacks. Here’s what I came up with.

If you google for Terraform and Azure DevOps, you will find a couple of articles which show you the way, first and foremost the Azure DevOps Lab on this subject, which has also been published on Microsoft’s website. However, the proposed solution is not what I would use in production, and I also have the impression that it is based on a slightly older version of Azure DevOps, when release pipelines could not yet be configured via YAML. Here’s why:

  • The lab proposes to pack all Terraform files into an artifact and then run Terraform from a release pipeline, which at that time was still something you needed to click together in the web interface of Pipelines. I definitely prefer a textual description of the pipeline which can be put under version control. Luckily, it seems that everything you can click in release pipelines can be exported to YAML and copied into your azure-pipelines.yml. However, I found this out by trial and error.
  • Also, Terraform can wreak havoc on your production infrastructure if you make mistakes (and you will). So you usually want to review the plan before it is run against production.
  • Before you even think about merging your changes, you usually want to test them in a different subscription or resource group. Maybe you even want every developer to be able to spin up a copy of the infrastructure (with fewer and smaller resources, perhaps) by creating a branch or merge request in your IaC git repo. Since running Terraform against an empty resource group is not the same as running it against existing resources, you probably want to test the latter in a staging system before continuing to production.

And needless to say, the YAML code of the pipeline should be DRY, so I do not want to copy the definition of the test-deployment stage and modify it to work with production.

The good news is that all of this can be done, but it is not obvious (at least it was not to me). Here’s what I did.

Declaring a Pipeline in YAML

If your repository lives in Azure DevOps Repos, you can add a file called azure-pipelines.yml to the root of the repo, and a git push will automatically trigger an Azure DevOps Pipeline run. Depending on the contents of this file, this can be limited to certain events and certain branches. You can either edit the file in a text editor or use the web interface, which helps you with a list of tasks that you can pull into your pipeline. However, neither the list of tasks in the Pipelines web interface nor the documentation mentions Terraform. Of course you can run Terraform in a bash or PowerShell script, but then you need to take care of the secrets yourself, i.e. you need to create a user or service principal, copy its credentials into a secret variable in a variable group, and then use those variables in your Terraform script. Moreover, although Terraform is installed on the ubuntu-18 image, it is certainly not the version that you want, so you’ll spend time figuring out how to automate the installation of a specific Terraform version. All of this is handled by the aforementioned task, so I’ll stick to it.
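As a sketch (the branch names here are illustrative, not from my actual setup), a trigger block at the top of azure-pipelines.yml restricts automatic runs to pushes on certain branches:

```yaml
# Illustrative sketch: only pushes to master and feature/* branches trigger the pipeline
trigger:
  branches:
    include:
    - master
    - feature/*
```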

(Sidenote: another way to run Terraform without the task is to use an AzureCLI task, which should provide the right credentials through a service connection. However, I could not make this work, although Terraform gives a helpful error message and points to its documentation.)
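For reference, the manual-secrets approach mentioned earlier boils down to mapping secret pipeline variables onto the environment variables that the azurerm provider understands. A sketch; the $(...) names on the right are hypothetical entries in a variable group:

```yaml
# Sketch: run Terraform in a plain script step with manually managed credentials.
# The ARM_* variables are read by the azurerm provider; the $(...) names are hypothetical.
- script: terraform plan -input=false
  displayName: terraform plan (manual credential handling)
  env:
    ARM_CLIENT_ID: $(servicePrincipalId)
    ARM_CLIENT_SECRET: $(servicePrincipalKey)  # secret variables must be mapped explicitly via env:
    ARM_SUBSCRIPTION_ID: $(subscriptionId)
    ARM_TENANT_ID: $(tenantId)
```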

As seen in the Azure DevOps Labs article, Terraform is supported in release pipelines. Those, however, are not stored in a YAML file in your repository. If you go into Releases and create a pipeline, you will notice that each task you can configure there has a “YAML” button which shows you the corresponding YAML code. The good news is: you can copy this and paste it into your azure-pipelines.yml, although I found no official documentation which supports this. This allows you to easily install an arbitrary Terraform version in your VM and run it with credentials obtained from the specified service connection (which needs to be created manually in the Azure DevOps project). An example pipeline looks like this:

pool:
  vmImage: 'ubuntu-18.04'
steps:
- task: ms-devlabs.custom-terraform-tasks.custom-terraform-installer-task.TerraformInstaller@0
  displayName: "Install Terraform 0.13.4"
  inputs:
    terraformVersion: 0.13.4
- task: ms-devlabs.custom-terraform-tasks.custom-terraform-release-task.TerraformTaskV1@0
  displayName: "terraform init"
  inputs:
    command: init
    workingDirectory: terraform
    backendServiceArm: my-service-conn
    backendAzureRmResourceGroupName: tooling
    backendAzureRmStorageAccountName: tooling-sa
    backendAzureRmContainerName: terraform-state
    backendAzureRmKey: /terraform.tfstate

I found out how to use these tasks by configuring a release pipeline in the web UI and then exporting the steps as YAML. One downside of this task is that terraform workspace is not yet supported, but since the task is open source, you can create a pull request if you need it.

Implementing User Approvals

Most CI/CD tools support some form of user approval, four-eyes principle or other policies. This also works in Azure DevOps, but again it is not obvious how to do it. The only place where you can configure approvals is in the environments under Azure DevOps Pipelines (in the web UI). There you can create e.g. a production environment and configure it to require a certain number of approvals before deploying. But how does this connect with your pipeline? The answer is: through a deployment job, a special kind of job which supports things like rolling updates and canary releases, but also simpler “runOnce” steps. The YAML schema for a deployment job includes an “environment” field which can contain the name of one of the environments that you previously created. The funny thing, though, is that the request for approval does not pop up before the job is run, but before the entire stage which contains the deployment job is run. To make use of this in our context, we need at least two stages:

  • One stage which runs unconditionally and will create the Terraform plan.
  • Another stage which contains the terraform apply but waits for the approval of the user (who will first need to check the result of the first stage).
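In skeleton form, the approval gate is attached by naming the environment in the deployment job of the second stage (stage and environment names here are illustrative):

```yaml
# Sketch: the "apply" stage pauses for approval because its deployment job
# references an environment ("production") with an approval check configured in the web UI.
stages:
- stage: plan
  jobs:
  - job: terraform_plan
    steps:
    - script: echo "terraform plan runs here, unconditionally"
- stage: apply
  jobs:
  - deployment: terraform_apply
    environment: production
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo "terraform apply runs here, only after approval"
```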

The two stages will run on different VMs, or at least the workspace is cleaned after the first stage. Consequently, we need to transfer the plan from stage 1 to stage 2 in the form of an artifact. Artifacts can be published for later stages with the publish task. This is also done in the aforementioned Microsoft article.

If you want to exclude certain files from ending up in the artifact, you can put their names (or patterns) into the .artifactignore file in your repo. According to the documentation, the absence of this file leads to .git being ignored and everything else being published. However, it turned out that other dot files and directories seem to be ignored as well, so the .terraform directory (the result of terraform init) does not show up in stage two either. That’s why I simply create a TGZ archive with the initialized Terraform folder and the plan, publish it as an artifact, and unpack it in the receiving stage:
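For completeness, .artifactignore uses a gitignore-like syntax; a hypothetical file excluding log files from the artifact could look like this (I went with the TGZ archive instead):

```
# Hypothetical .artifactignore: gitignore-like patterns of files to exclude from publishing
*.log
crash.log
.git
```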

- script: tar Ccvzf $(System.DefaultWorkingDirectory) artifact.tgz terraform
  displayName: Create the Artifact
- publish: $(System.DefaultWorkingDirectory)/artifact.tgz
  artifact: artifact
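The tar invocations here use old-style option bundling: the C option consumes the next argument as a directory to change into before archiving or extracting, and f consumes the archive name. A local round-trip (paths are illustrative) demonstrates the pattern:

```shell
# Demonstrate the archive/unpack pattern used in the pipeline:
# tar -C <dir> changes into <dir> before adding or extracting files.
mkdir -p /tmp/demo-src/terraform /tmp/demo-dst
echo 'terraform {}' > /tmp/demo-src/terraform/main.tf
tar -C /tmp/demo-src -czf /tmp/artifact.tgz terraform   # like the "Create the Artifact" step
tar -C /tmp/demo-dst -xzf /tmp/artifact.tgz             # like the "Unpack the Artifact" step
cat /tmp/demo-dst/terraform/main.tf                     # prints: terraform {}
```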

The request for approval might not show up on the page where you observe the progress of the pipeline, so make sure to go to the pipeline’s front page once you have seen the plan. There you will find a button for approving the deployment.

Running against different Environments

When a pipeline runs, certain variables are set, e.g. the name of the git branch which is built. We want certain things to depend on this. However, making decisions based on the value of variables is not straightforward in Pipelines. First, there are conditions: expressions which can enable or disable a step, job or stage. So one way would be to copy and paste our code so far and run one copy for the master branch and the other copy for non-master branches. However, this contradicts my desire for DRY code. Option number two is called “conditional insertion”, which allows us to activate or deactivate exactly one line of YAML code (not a block, just one line) depending on a condition. Unfortunately, there is no ternary operator or if/else construct which would allow us to set a variable to one of two values depending on the branch name.

There is another concept which comes to the rescue here: templates. A template is a YAML fragment which is stored in a separate file and can be included in the main azure-pipelines.yml, optionally replacing parameters with given arguments. This can be done for variable definitions, stages, jobs or steps. This gives us two ways to solve our problem:

  • Either we put the two stages described above into the main file (azure-pipelines.yml) and use variables for the things that we want to differ between branches (e.g. the name of the tfvars file that we pass to Terraform, or even the name of the service connection that we use). On top of that, we load a variables template depending on the branch that we build (with conditional insertion): variables-master.yml for master builds or variables-branch.yml for non-master builds:
variables:
- ${{ if eq(variables['Build.SourceBranch'], 'refs/heads/master') }}:
  - template: variables-master.yml
- ${{ if ne(variables['Build.SourceBranch'], 'refs/heads/master') }}:
  - template: variables-branch.yml
  • The second option is to put the two stages into a template with parameters and call it from the main file. We can use conditions to enable or disable the calls with the right values for master and non-master builds. I have not tried this so far, because I think it is just a matter of taste.
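A sketch of that second option (the file name terraform-stages.yml and its parameters are hypothetical): the template declares parameters, and the main file instantiates it once per target:

```yaml
# terraform-stages.yml -- hypothetical stage template with parameters
parameters:
- name: environment
  type: string
- name: varFile
  type: string

stages:
- stage: deploy_${{ parameters.environment }}
  jobs:
  - deployment: terraform_apply
    environment: ${{ parameters.environment }}
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo "terraform apply with -var-file=${{ parameters.varFile }}"
```

In azure-pipelines.yml this would be pulled in with a `- template: terraform-stages.yml` entry under `stages:`, passing a `parameters:` block whose values depend on the branch.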

Running the pipeline with different variables depending on whether we’re on master or not also allows us to use different environments in our deployment job, so an approval is necessary only in production, but not in testing.
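The two variable templates then differ only in their values; hypothetical contents might be:

```yaml
# variables-master.yml (illustrative values; variables-branch.yml would point at
# a testing environment and a test tfvars file instead)
variables:
  environment: production
  terraformVarFile: prod.tfvars
  terraformVersion: 0.13.4
  environmentServiceNameAzureRM: my-prod-service-conn
```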

Putting it all together

I ended up with the following azure-pipelines.yml. It does what I wanted, although the usability is not perfect yet. I’d like to have something like the Terraform workflow proposed for GitLab CI, but I still lack the experience with Azure DevOps to tackle this.

pool:
  vmImage: 'ubuntu-18.04'
# the VS Code plugin will claim that this is not valid, but Azure DevOps understands what I mean:
variables:
- ${{ if eq(variables['Build.SourceBranch'], 'refs/heads/master') }}:
  - template: variables-master.yml
- ${{ if ne(variables['Build.SourceBranch'], 'refs/heads/master') }}:
  - template: variables-branch.yml
stages:
- stage: terraform_preparation
  displayName: "Terraform installation, init & plan"
  jobs:
  - job: setup_terraform
    displayName: "terraform installation, init & plan"
    steps:
    - checkout: self
      fetchDepth: 1
    - task: ms-devlabs.custom-terraform-tasks.custom-terraform-installer-task.TerraformInstaller@0
      displayName: "Install Terraform $(terraformVersion)"
      inputs:
        terraformVersion: $(terraformVersion)
    - task: ms-devlabs.custom-terraform-tasks.custom-terraform-release-task.TerraformTaskV1@0
      displayName: "terraform init"
      inputs:
        command: init
        commandOptions: -input=false -var-file=$(terraformVarFile)
        workingDirectory: '$(System.DefaultWorkingDirectory)/terraform'
        backendServiceArm: $(backendServiceArm)
        backendAzureRmResourceGroupName: $(backendAzureRmResourceGroupName)
        backendAzureRmStorageAccountName: $(backendAzureRmStorageAccountName)
        backendAzureRmContainerName: $(backendAzureRmContainerName)
        backendAzureRmKey: $(backendAzureRmKey)
    - task: ms-devlabs.custom-terraform-tasks.custom-terraform-release-task.TerraformTaskV1@0
      name: terraform_plan
      displayName: 'terraform plan'
      inputs:
        command: plan
        commandOptions: -out=terraform_plan.out -input=false -var-file=$(terraformVarFile)
        workingDirectory: '$(System.DefaultWorkingDirectory)/terraform'
        environmentServiceNameAzureRM: $(environmentServiceNameAzureRM)
    # Apparently .terraform is excluded from the normal publishing process, so we need to create an archive
    - script: tar Ccvzf $(System.DefaultWorkingDirectory) artifact.tgz terraform
      displayName: Create the Artifact
    - publish: $(System.DefaultWorkingDirectory)/artifact.tgz
      artifact: artifact
- stage: deployment
  displayName: Deployment
  jobs:
  - deployment: terraform_apply
    displayName: "Terraform apply"
    environment: $(environment)
    strategy:
      runOnce:
        deploy:
          steps:
          # We need to install terraform here again. Although the latest terraform seems to be
          # available in the image, we want control over the exact version that we use.
          - task: ms-devlabs.custom-terraform-tasks.custom-terraform-installer-task.TerraformInstaller@0
            displayName: "Install Terraform $(terraformVersion)"
            inputs:
              terraformVersion: $(terraformVersion)
          - script: tar Cxvzf $(System.DefaultWorkingDirectory) $(Pipeline.Workspace)/artifact/artifact.tgz
            displayName: Unpack the Artifact
          - task: ms-devlabs.custom-terraform-tasks.custom-terraform-release-task.TerraformTaskV1@0
            name: terraform_apply
            displayName: 'terraform apply'
            inputs:
              command: apply
              commandOptions: -input=false terraform_plan.out
              workingDirectory: '$(System.DefaultWorkingDirectory)/terraform'
              environmentServiceNameAzureRM: $(environmentServiceNameAzureRM)

Note that I actually run Ansible from the same pipeline but I omit that code for brevity here.

Disclaimer

I am using the free tier of Azure DevOps, where only one job can run at a time. If you use the paid version, you might run into race conditions which I have not encountered so far.
