
Azure DevOps Pipelines quick reference guide

July 10, 2024, Abdush Miah

Summary

A quick reference guide for Azure DevOps Pipelines

Sample YAML file

name: 1.0$(Rev:.r)

# simplified trigger (implied branch)
trigger:
- main

# equivalent trigger
# trigger:
#   branches:
#     include:
#     - main

variables:
  name: John

pool:
  vmImage: ubuntu-latest

jobs:
- job: helloworld
  steps:
  - checkout: self
  - script: echo "Hello, $(name)"

Common pipeline components

Name

If it's skipped, a date-based name is generated automatically.

If you don't explicitly set a name format, you'll get an integer number: a monotonically increasing count of runs triggered from this pipeline, starting at 1. This number is stored in Azure DevOps. You can reference it with:

$(Rev:r)

To make a date-based number, you can use the format:

$(Date:yyyyMMdd) # to get a build number like 20221003

To get a semantic number like 1.0.x, you can use something like:

1.0$(Rev:.r)

Trigger

If there's no explicit trigger section, it's implied that any commit to any path in any branch will trigger the pipeline to run.

You can be more precise by using filters such as branches or paths.

Example to trigger on main branch:

trigger:
  branches:
    include:
    - main

To trigger on all branches except those you exclude:

trigger:
  branches:
    exclude:
    - main

To trigger from any branch that begins with a specific value and also matches a specific path, e.g. branch names starting with "feature" and paths under "webapp":

trigger:
  branches:
    include:
    - feature/*
  paths:
    include:
    - webapp/**

You can use none if you never want your pipeline to trigger automatically. It's helpful if you're going to create a pipeline that is only manually triggered.

trigger: none

You can specify the target branches when validating your pull requests.

To validate pull requests that target main and releases/* and start a new run the first time a new pull request is created, and after every update made to the pull request:

pr:
- main
- releases/*

There are other triggers for other events, such as scheduled triggers (the schedules keyword) and pipeline completion triggers (defined on pipeline resources). You can find the full documentation on triggers in the Microsoft Learn documentation.
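For instance, a scheduled trigger that runs a nightly build of main could look like this (a sketch; the cron expression and display name are illustrative):

schedules:
- cron: "0 0 * * *"          # midnight UTC, every day
  displayName: Nightly build
  branches:
    include:
    - main
  always: false              # skip the run if main hasn't changed since the last scheduled run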

Jobs

Every pipeline must have at least one job. A job is a set of steps an agent executes in a queue (or pool). Jobs are atomic – they're performed wholly on a single agent. You can configure the same job to run on multiple agents simultaneously, but even in this case, the entire set of steps in the job is run on every agent. You'll need two jobs if you need some steps to run on one agent and some on another.

Besides its name, a job has attributes such as dependsOn, condition, pool, variables, steps, continueOnError, timeoutInMinutes, and strategy.

NOTE: Jobs can run conditionally and might depend on previous jobs.
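As a sketch of these attributes, here are two jobs that run on different agents, where the second depends on the first (job names and echo messages are illustrative):

jobs:
- job: Build
  pool:
    vmImage: ubuntu-latest
  timeoutInMinutes: 30
  steps:
  - script: echo "Building on Linux"

- job: Package
  dependsOn: Build
  condition: succeeded()       # run only if Build succeeded
  pool:
    vmImage: windows-latest
  steps:
  - script: echo "Packaging on Windows"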

Dependencies

When you define multiple stages in a pipeline, by default, they run sequentially in the order in which you define them in the YAML file. The exception to this is when you add dependencies. With dependencies, stages run in the order of the dependsOn requirements.

Pipelines must contain at least one stage with no dependencies.

Because no dependsOn is specified, the stages run sequentially: first A and then B:

stages:
- stage: A
  jobs:
  - job: A1
    steps:
    # steps omitted for brevity

- stage: B
  jobs:
  - job: B1
    steps:
    # steps omitted for brevity

To have both stages run in parallel, we add dependsOn: [] to stage B:

stages:
- stage: A
  jobs:
  - job: A1
    steps:
    # steps omitted for brevity

- stage: B
  dependsOn: [] # This removes the implicit dependency on the previous stage and causes this to run in parallel.
  jobs:
  - job: B1
    steps:
    # steps omitted for brevity

Checkout

Jobs check out the repo they're contained in automatically unless you specify checkout: none. Deployment jobs don't automatically check out the repo, so you'll need to specify checkout: self for deployment jobs if you want access to the YAML file's repo.
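A minimal sketch of a deployment job that explicitly checks out the repo (the environment name is hypothetical):

jobs:
- deployment: DeployWeb
  environment: staging          # hypothetical environment
  pool:
    vmImage: ubuntu-latest
  strategy:
    runOnce:
      deploy:
        steps:
        - checkout: self        # deployment jobs skip this unless requested
        - script: ls $(Build.SourcesDirectory)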

Download

Jobs don't download anything unless you explicitly define a download. Deployment jobs implicitly do a download: current, which downloads any pipeline artifacts created in the current pipeline. To prevent it, you must specify:

download: none
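You can also download a specific artifact by name; a sketch (the artifact name WebApp is hypothetical):

steps:
- download: current             # artifacts published earlier in this run
  artifact: WebApp              # hypothetical artifact name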

Resources

If your job requires source code in another repository, you'll need to use resources. Resources let you reference other repositories, pipelines, builds, containers, packages, and webhooks.

To reference code in another repo, specify that repo in the resources section and then reference it via its alias in the checkout step:

resources:
  repositories:
  - repository: appcode
    type: git
    name: otherRepo

steps:
- checkout: appcode

Steps are Tasks

Steps are the actual "things" that execute in the order specified in the job.

Each step is a task: out-of-the-box (OOB) tasks come with Azure DevOps, and many have aliases; additional tasks can be installed in your Azure DevOps organization via the marketplace.
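For example, script is an alias for the command-line task, while OOB tasks are referenced as name@majorVersion:

steps:
- script: echo "script is an alias for the command-line task"
- task: PublishBuildArtifacts@1           # an OOB task
  inputs:
    PathtoPublish: $(Build.ArtifactStagingDirectory)
    ArtifactName: drop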

Variables

Every variable is a key: value pair. The key is the variable's name, and it has a value. Usage example:

variables:
  name: Abdush
steps:
- script: echo "Hello, $(name)!"

NOTE: You can get the name of the branch from the variables Build.SourceBranch (for the full name like refs/heads/main) or Build.SourceBranchName (for the short name like main).
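For example:

steps:
- script: echo "Building $(Build.SourceBranchName) ($(Build.SourceBranch))"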

Stages

A stage is one or more jobs, which are units of work assignable to the same machine.

You can arrange both stages and jobs into dependency graphs. Examples include "Run this stage before that one" and "This job depends on the output of that job."

A job is a linear series of steps. Steps can be tasks, scripts, or references to external templates.

This hierarchy is reflected in the structure of a YAML file like:

Pipeline
  Stage A
    Job 1
      Step 1.1
      Step 1.2
      ...
    Job 2
      Step 2.1
      Step 2.2
      ...
  Stage B
    ...

Tips

If you have a single stage, you can omit the stages keyword and directly specify the jobs keyword:

# ... other pipeline-level keywords
jobs: [ job | templateReference ]

If you have a single stage and a single job, you can omit the stages and jobs keywords and directly specify the steps keyword:

# ... other pipeline-level keywords
steps: [ script | bash | pwsh | powershell | checkout | task | templateReference ]

This example runs three stages, one after another. The middle stage runs two jobs in parallel:

stages:
- stage: Build
  jobs:
  - job: BuildJob
    steps:
    - script: echo Building!

- stage: Test
  dependsOn: Build
  jobs:
  - job: TestOnWindows
    steps:
    - script: echo Testing on Windows!
  - job: TestOnLinux
    steps:
    - script: echo Testing on Linux!

- stage: Deploy
  dependsOn: Test
  jobs:
  - job: Deploy
    steps:
    - script: echo Deploying the code!

Deployment strategies

Deployment strategies allow you to use specific techniques to deliver updates when deploying your application. For details and examples, see Deployment jobs in the Microsoft documentation.

As you gain more confidence in the new version, you can release it to more servers in your infrastructure and route more traffic to it.

Lifecycle hooks

You can achieve the deployment strategies technique by using lifecycle hooks. Depending on the pool attribute, each resolves into an agent or server job.

Lifecycle hooks inherit the pool specified by the deployment job. Deployment jobs use the $(Pipeline.Workspace) system variable.

Available lifecycle hooks: preDeploy, deploy, routeTraffic, postRouteTraffic, and on: failure / on: success.
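A minimal sketch of a runOnce strategy wired through the hooks (the environment name and echo commands are illustrative):

jobs:
- deployment: DeployApp
  environment: production       # hypothetical environment
  pool:
    vmImage: ubuntu-latest
  strategy:
    runOnce:
      preDeploy:
        steps:
        - script: echo "Initialize before deployment"
      deploy:
        steps:
        - script: echo "Deploy the application"
      routeTraffic:
        steps:
        - script: echo "Route traffic to the updated version"
      postRouteTraffic:
        steps:
        - script: echo "Monitor health after routing"
      on:
        failure:
          steps:
          - script: echo "Roll back"
        success:
          steps:
          - script: echo "Notify success"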

Steps

Steps are a linear sequence of operations that make up a job. Each step runs in its own process on an agent and accesses the pipeline workspace on a local hard drive.

This behavior means environment variables aren't preserved between steps, but file system changes are.
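A quick sketch of that behavior; the supported way to pass a value between steps is the task.setvariable logging command:

steps:
- script: |
    echo "persisted" > note.txt                         # file system change survives the step
    export TEMP_VAR=abc                                 # lost when this step's process exits
    echo "##vso[task.setvariable variable=shared]abc"   # passes a value to later steps
- script: |
    cat note.txt                              # works: the file is still in the workspace
    echo "shared is $(shared)"                # works: set via the logging command
    echo "TEMP_VAR is '${TEMP_VAR:-unset}'"   # prints 'unset'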

Template References

You can put reusable sections of your pipeline into a separate file, known as a template.

Supported templates: stage, job, step, and variable templates.

Stage template:

# File: stages/test.yml

parameters:
  name: ''
  testFile: ''

stages:
- stage: Test_${{ parameters.name }}
  jobs:
  - job: ${{ parameters.name }}_Windows
    pool:
      vmImage: windows-latest
    steps:
    - script: npm install
    - script: npm test -- --file=${{ parameters.testFile }}
  - job: ${{ parameters.name }}_Mac
    pool:
      vmImage: macOS-latest
    steps:
    - script: npm install
    - script: npm test -- --file=${{ parameters.testFile }}

Templated pipeline:

# File: azure-pipelines.yml

stages:
- template: stages/test.yml  # Template reference
  parameters:
    name: Mini
    testFile: tests/miniSuite.js

- template: stages/test.yml  # Template reference
  parameters:
    name: Full
    testFile: tests/fullSuite.js

Job templates

Single job repeated on three platforms:

# File: jobs/build.yml

parameters:
  name: ''
  pool: ''
  sign: false

jobs:
- job: ${{ parameters.name }}
  pool: ${{ parameters.pool }}
  steps:
  - script: npm install
  - script: npm test
  - ${{ if eq(parameters.sign, 'true') }}:
    - script: sign

# File: azure-pipelines.yml

jobs:
- template: jobs/build.yml  # Template reference
  parameters:
    name: macOS
    pool:
      vmImage: 'macOS-latest'

- template: jobs/build.yml  # Template reference
  parameters:
    name: Linux
    pool:
      vmImage: 'ubuntu-latest'

- template: jobs/build.yml  # Template reference
  parameters:
    name: Windows
    pool:
      vmImage: 'windows-latest'
    sign: true  # Extra step on Windows only

Step Templates

Define a set of steps in one file and use it multiple times in another.

# File: steps/build.yml

steps:
- script: npm install
- script: npm test

# File: azure-pipelines.yml

jobs:
- job: macOS
  pool:
    vmImage: 'macOS-latest'
  steps:
  - template: steps/build.yml # Template reference

- job: Linux
  pool:
    vmImage: 'ubuntu-latest'
  steps:
  - template: steps/build.yml # Template reference

- job: Windows
  pool:
    vmImage: 'windows-latest'
  steps:
  - template: steps/build.yml # Template reference
  - script: sign              # Extra step on Windows only

Variable templates

You can define a set of variables in one file and use it multiple times in other files.

In this example, a set of variables is repeated across multiple pipelines. The variables are specified only once.

# File: variables/build.yml
variables:
- name: vmImage
  value: windows-latest
- name: arch
  value: x64
- name: config
  value: debug

# File: component-x-pipeline.yml
variables:
- template: variables/build.yml  # Template reference
pool:
  vmImage: ${{ variables.vmImage }}
steps:
- script: build x ${{ variables.arch }} ${{ variables.config }}

# File: component-y-pipeline.yml
variables:
- template: variables/build.yml  # Template reference
pool:
  vmImage: ${{ variables.vmImage }}
steps:
- script: build y ${{ variables.arch }} ${{ variables.config }}

Pipeline resource

If you have an Azure pipeline that produces artifacts, your pipeline can consume the artifacts by using the pipeline keyword to define a pipeline resource.

resources:
  pipelines:
  - pipeline: MyAppA
    source: MyCIPipelineA
  - pipeline: MyAppB
    source: MyCIPipelineB
    trigger: true
  - pipeline: MyAppC
    project: DevOpsProject
    source: MyCIPipelineC
    branch: releases/M159
    version: 20190718.2
    trigger:
      branches:
        include:
        - master
        - releases/*
        exclude:
        - users/*
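Artifacts from a pipeline resource can then be downloaded in a step by referencing its alias; a sketch:

steps:
- download: MyAppA   # download artifacts from the MyCIPipelineA run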

Container resource

Container jobs let you isolate your tools and dependencies inside a container. The agent launches an instance of your specified container and then runs steps inside it. The container keyword lets you specify your container images.

Service containers run alongside a job to provide various dependencies like databases.

resources:
  containers:
  - container: linux
    image: ubuntu:16.04
  - container: windows
    image: myprivate.azurecr.io/windowsservercore:1803
    endpoint: my_acr_connection
  - container: my_service
    image: my_service:tag
    ports:
    - 8080:80 # bind container port 80 to 8080 on the host machine
    - 6379 # bind container port 6379 to a random available port on the host machine
    volumes:
    - /src/dir:/dst/dir # mount /src/dir on the host into /dst/dir in the container
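A sketch of a job that runs its steps inside the linux container and attaches the service container (the redis alias is illustrative):

jobs:
- job: TestWithService
  pool:
    vmImage: ubuntu-latest
  container: linux             # run all steps inside the 'linux' container resource
  services:
    redis: my_service          # expose the 'my_service' container to the job as 'redis'
  steps:
  - script: printenv | sort    # connection details for services appear as variables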

Repository resource

resources:
  repositories:
  - repository: common
    type: github
    name: Contoso/CommonTools
    endpoint: MyContosoServiceConnection

References

https://learn.microsoft.com/en-us/training/modules/integrate-azure-pipelines/2-describe-anatomy-of-pipeline
https://learn.microsoft.com/en-us/training/modules/integrate-azure-pipelines/3-understand-pipeline-structure
https://learn.microsoft.com/en-us/training/modules/integrate-azure-pipelines/4-detail-templates