Azure Pipelines in action pt. II - Working with templates
Templates are a key part of the pipeline system. Here are a few practices of mine that have helped keep the tangle manageable over the years.
As mentioned, in my current org we're heavy users of the template system in Azure DevOps. Since we have dozens of projects with similar needs, templates have been the key to evolving our automation capabilities gracefully.
Templates allow you to create your own "functions" to contain shared logic as a collection of stages, jobs and steps.
Here's a simple template:
# public/hello.yml
steps:
- script: echo "Hi!"
and how to use it from another repo:
resources:
  repositories:
  - repository: templates
    type: git
    name: ProjectName/RepoName
    ref: refs/heads/release/v1

jobs:
- job:
  steps:
  # note the @templates as a resource reference
  - template: public/hello.yml@templates
Here's a brief listing of key points I've discovered over the years that might be useful to others as well.
Templates have no dependsOn by default
You might expect that, in order to run a template after some other step in your pipeline, you could do:
jobs:
- job: JobA
  ...
- template: jobs/JobB.yml@templates
  dependsOn: JobA
  parameters:
    param1: 'hey!'
If only...
This is not a supported keyword, so you'll need to do this yourself. Practically all my new templates start with this:
parameters:
- name: dependsOn
  type: object
  default: []
...
stages:
- stage:
  ${{ if not(eq(length(parameters.dependsOn), 0)) }}:
    dependsOn: ${{ parameters.dependsOn }}
This changes the calling syntax to:
jobs:
- job: JobA
  ...
- template: jobs/JobB.yml@templates
  parameters:
    dependsOn: JobA
    param1: 'hey!'
Not a huge pain, but one I feel is completely unnecessary.
Giving a runtime name to a template
In order to depend on a stage/job it needs a name. Here's another staple in almost all my templates:
parameters:
- name: jobName
  type: string
  default: BuildDotnetCoreApp

jobs:
- job: ${{ parameters.jobName }}
Now the calling pipeline can define any arbitrary name and depend on that. This is very useful e.g. when the same template needs to run multiple times, since then the jobs must either have no name at all or each get a unique one, as in the sketch below.
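Here's a minimal sketch of that scenario, combining the jobName and dependsOn parameters from above (the template path jobs/build.yml is made up):
jobs:
- template: jobs/build.yml@templates
  parameters:
    jobName: BuildApi
- template: jobs/build.yml@templates
  parameters:
    # the second instance needs its own unique name
    jobName: BuildWorker
    dependsOn: BuildApi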
Using variables in a template
Variables defined on a higher level are available to all subsequent levels, so one defined at the pipeline root is available inside every referenced template.
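For example, a template step can read a root-level variable with the runtime macro syntax (the variable name here is invented):
# azure-pipelines.yml
variables:
  configuration: Release
jobs:
- job:
  steps:
  - template: public/build-steps.yml@templates

# public/build-steps.yml picks up the root-level variable at runtime
steps:
- script: echo "Building $(configuration)"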
Note: if your pipeline depends on specific variables, remember to document these upfront. Variable templates can be used for grouping and therefore documenting variables, as sketched below.
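A minimal sketch of such a variable template (the file name and variable are made up):
# vars/dotnet-app.yml - documents the variables the templates expect
variables:
- name: buildConfiguration
  value: Release

# consuming pipeline
variables:
- template: vars/dotnet-app.yml@templates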
Although having a top-level variables block in an ordinary (non-variable) template is correct syntax, it often results in runtime errors:
/internal/dotnet/jobs/prepare-artifact.yml (Line: 3, Col: 1): Unexpected value 'variables'
This breaks so often I never even try it anymore. The circumstances under which it could work are unclear to me at the moment.
Variable groups
Deployment jobs, for example, often need secrets that can be loaded from variable groups. Variables loaded from a group at one level are freely available to all child levels, e.g. loading a variable group on a stage exposes it to every job inside that stage. This can be useful or harmful depending on context, but I recommend exposing the variables only where you need them.
Here's how variable groups can be used in a template; this one uses FileTransform@2 to update an appsettings.json file from one or more variable groups:
parameters:
# where on the current agent the artifact zip file is located
- name: artifactZipPath
  type: string
# appsettings file path, relative to artifact root, passed to FileTransform@2
- name: appSettingsFileToTransform
  type: string
- name: variableGroups
  type: object
  default: []

jobs:
- job:
  variables:
  # load multiple groups
  - ${{ each g in parameters.variableGroups }}:
    - group: ${{ g }}
  steps:
  # assumes that the artifact has been loaded earlier in the pipeline
  - checkout: none
  - task: FileTransform@2
    inputs:
      folderPath: ${{ parameters.artifactZipPath }}
      jsonTargetFiles: ${{ parameters.appSettingsFileToTransform }}
      # these two are needed so no other files get touched
      enableXmlTransform: false
      xmlTransformationRules: ''
You might need more than one group here if, for example, one group contains secrets from Key Vault and another the basic app config as environment variables.
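A hypothetical caller of this template could then look like this (the template path and group names are made up):
jobs:
- template: public/transform-settings.yml@templates
  parameters:
    # predefined variable pointing at the downloaded artifact
    artifactZipPath: $(Pipeline.Workspace)/drop/app.zip
    appSettingsFileToTransform: '**/appsettings.json'
    variableGroups:
    - my-app-config
    - my-app-keyvault-secrets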
Controlling the API - public and internal templates
When your library of templates grows, you'll probably notice that maintenance and versioning become difficult. Working with templates is the same as working with any API: you need to keep a stable outward-facing interface while being able to easily improve the internals.
For our template repo we've opted to have two root-level directories: public and internal. The public directory contains the stable implementations of jobs and stages, and any breaking changes to these come with a whole new release version. The templates on the internal side can (and do!) change in whatever way the maintainers need. People can of course reference the internal templates too, but no guarantees are given as to their stability.
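The layout is roughly this (file names beyond the ones mentioned in this post are invented):
templates-repo/
  public/      # stable interface; breaking changes mean a new release
    hello.yml
  internal/    # free to change whenever the maintainers need
    jobs/
      doThing.yml
    dotnet/
      jobs/
        prepare-artifact.yml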
As an example of why this kind of split is necessary: just earlier today I implemented a change in how looped jobs were done.
From:
- stage:
  jobs:
  - ${{ each step in parameters.steps }}:
    - template: internal/jobs/doThing.yml
      parameters:
        jobName: DEPLOY_${{ step.name }}
        param: ${{ step.param }}
        other: ${{ parameters.fromGlobal }}
To:
- stage:
  jobs:
  - template: internal/jobs/doThing.yml
    parameters:
      steps: ${{ parameters.steps }}
      other: ${{ parameters.fromGlobal }}
Had doThing been a public resource, this would have been a breaking change, and we'd have had to cut another version release and update every project that used it. Instead, since the public parameters stayed the same, we could evolve the implementation within the same version while still keeping the logic in a reusable internal template.
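To make the change concrete, the internal template now does the looping itself. A rough sketch, with the job body invented for illustration:
# internal/jobs/doThing.yml (sketch)
parameters:
- name: steps
  type: object
  default: []
- name: other
  type: string
  default: ''

jobs:
- ${{ each step in parameters.steps }}:
  - job: DEPLOY_${{ step.name }}
    steps:
    # hypothetical body; the real work goes here
    - script: echo "${{ step.param }} / ${{ parameters.other }}"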
Wrapping up
Templates are a powerful tool for creating reusable blocks of pipeline functionality, and I've found them a must for keeping a large number of pipelines in check. There are some gotchas, but overall the system is robust and flexible enough. The documentation is good, but the design limitations are naturally not explicitly mentioned.
If you have a lot of projects to manage, paying attention to the "API design and versioning" pays off practically instantly and allows you to move everything at a controlled pace.