Is there a tool to validate an Azure DevOps Pipeline locally?
When making changes to YAML-defined Azure DevOps Pipelines, it can be quite tedious to push changes to a branch just to see the build fail with a parsing error (valid YAML, but invalid pipeline definition) and then try to trial-and-error fix the problem.
It would be nice if the feedback loop could be made shorter by analyzing and validating the pipeline definition locally; basically a linter with knowledge of the various resources etc. that can be defined in an Azure pipeline. However, I haven't been able to find any tool that does this.
Is there such a tool somewhere?
Solution 1:[1]
UPDATE: This functionality was removed in Issue #2479 in October 2019
You can run the Azure DevOps agent locally with its YAML testing feature.
- Follow the instructions in the microsoft/azure-pipelines-agent project to install an agent on your local machine.
- Then use the docs page Run local (internal only) to access the feature that is available within the agent.
This should get you very close to the type of feedback you would expect.
Solution 2:[2]
FYI - this feature has been removed in Issue #2479 - remove references to "local run" feature.
Hopefully they'll bring it back later, considering GitHub Actions has the ability to run actions locally.
Solution 3:[3]
Azure DevOps provides a run preview API endpoint that takes a YAML override and returns the expanded YAML. I added support for it to the AzurePipelinesPS PowerShell module. The command below queues a dry run of the existing pipeline with ID 1, using my local YAML as an override, and returns the expanded YAML pipeline.
Preview - Preview service: Pipelines, API version: 6.1-preview.1. Queues a dry run of the pipeline and returns an object containing the final YAML.
# AzurePipelinesPS session
$session = 'myAPSessionName'

# Path to my local yaml
$path = ".\extension.yml"

# The id of an existing pipeline in my project
$id = 01

# The master branch of my repository
$resources = @{
    repositories = @{
        self = @{
            refName = 'refs/heads/master'
        }
    }
}

Test-APPipelineYaml -Session $session -FullName $path -PipelineId $id `
    -Resources $resources
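If you don't want the PowerShell module, the same preview endpoint can also be called directly over REST. Below is a minimal Python sketch; the `build_preview_request` helper name, the organization/project values, and the PAT handling are mine, while the endpoint path, API version, and `previewRun`/`yamlOverride` body fields come from the Pipelines preview API described above.

```python
import base64
import json
import urllib.request


def build_preview_request(org, project, pipeline_id, yaml_text, pat):
    """Build the POST request for the Pipelines 'preview' endpoint, which
    queues a dry run and returns the final (expanded) YAML without running
    the pipeline."""
    url = (f"https://dev.azure.com/{org}/{project}/_apis/pipelines/"
           f"{pipeline_id}/preview?api-version=6.1-preview.1")
    body = {"previewRun": True, "yamlOverride": yaml_text}
    # PATs are sent as HTTP basic auth with an empty user name
    auth = base64.b64encode(f":{pat}".encode()).decode()
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json",
                 "Authorization": f"Basic {auth}"},
        method="POST",
    )


# Usage (needs a real organization, project, pipeline ID, and PAT;
# the JSON response contains the expanded pipeline in 'finalYaml'):
# req = build_preview_request("myorg", "myproject", 1,
#                             open("extension.yml").read(), pat)
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["finalYaml"])
```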
Solution 4:[4]
I can tell you how we manage this disconnect.
We use only pipeline-as-code, i.e. YAML.
We use ZERO YAML templates and strictly enforce one file per pipeline.
We use the Azure Pipelines extension for VS Code to get linter-like behaviour in the editor.
Most of the actual work in the pipelines is done by invoking PowerShell scripts that, via sensible defaulting, can also be invoked from the CLI, meaning we can in essence execute anything relevant locally.
Exceptions are agent configuration and pipeline-only steps, such as download-artifact and publish tasks.
Let me give some examples:
Here we have the step that builds our FrontEnd components (screenshot omitted):

Here we have that step running in the CLI (screenshot omitted):

I won't post a screenshot of the actual pipeline run, because it would take too long to sanitize, but it is basically the same, plus some extra trace information provided by the run.ps1 call wrapper.
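As an illustration of the pattern (not the author's actual files - the task name, script path, and parameter are hypothetical), a step like this keeps all logic in a script that can equally be run from a local shell:

```yaml
# Hypothetical example: the pipeline step only forwards to a script
steps:
  - task: PowerShell@2
    displayName: Build FrontEnd
    inputs:
      targetType: 'filePath'
      filePath: './build/run.ps1'
      arguments: '-Task BuildFrontEnd'

# Locally, the same step is just:
#   pwsh ./build/run.ps1 -Task BuildFrontEnd
```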
Solution 5:[5]
A pipeline is described with YAML, and YAML can be validated if you have a schema with rules on how that YAML file should be composed. This gives the short feedback loop you described, especially for syntax parsing errors. YAML schema validation is available for almost any IDE. So, we need:
- a YAML schema - against which we will validate our pipelines
- an IDE (VS Code, as a popular example) - which will perform validation on the fly
- to configure the two of the above to work together for the greater good
The schema can be found in many places; for this case I suggest https://www.schemastore.org/json/ which has an Azure Pipelines schema (this schema contains some issues, like value types that differ from the Microsoft documentation, but it still covers the case of invalid syntax).
VS Code requires an additional plug-in to perform YAML validation; there are a bunch of those that can validate against a schema. I suggest trying YAML from Red Hat (I know, the rating of the plugin is not the best, but it works for validation and is also configurable).
In the settings of that VS Code plugin you will see a section about validation (as in the screenshot, omitted here).
Now you can add the required schema to the settings, even without downloading it to your machine:
"yaml.schemas": {
"https://raw.githubusercontent.com/microsoft/azure-pipelines-vscode/v1.174.2/service-schema.json" : "/*"
}
Simply save the settings and restart VS Code. You will notice warnings about issues in your Azure DevOps pipeline YAML files (if there are any); a validation failure triggered on purpose was shown in a screenshot here (omitted).
See more details with examples here as well
Solution 6:[6]
Such a tool does not exist at the moment - there are a couple of existing issues in their feedback channels:
- GitHub Issues - How to test YAML locally before commit
- Developer Community - How to test YAML locally before commit
As a workaround you can install the Azure DevOps build agent on your own machine, register it as its own build pool, and use it for building and validating YAML file correctness. See Jamie's answer in this thread.
Of course this means you will need to constantly switch between the official build agents and your own build pool, which is not good. Also, if someone accidentally pushes a change that runs via your own machine, you can suffer from all kinds of problems that can occur on a normal build machine (UI prompts, hostile code running on your own machine, and so on - hostile code could even be an unintended virus infection caused by executing a third-party executable).
There are two approaches which you can take:
- Use Cake (Frosting) to perform the build locally as well as on Azure DevOps.
- Use PowerShell to perform the build locally as well as on Azure DevOps.
Generally, comparing 1 and 2: 1 has more mechanics built in, like publishing on Azure DevOps (it also supports other build system providers, like GitHub Actions, and so on).
(I myself would propose the 1st alternative.)
As for 1: search for existing projects using "Cake.Frosting" on GitHub to get some understanding of how those projects work.
As for 2: it's possible to use PowerShell to maximize the functionality done on the build-script side and minimize the functionality done in Azure DevOps.
parameters:
  - name: publish
    type: boolean
    default: true
  - name: noincremental
    type: boolean
    default: false
...
  - task: PowerShell@2
    displayName: invoke build
    inputs:
      targetType: 'inline'
      script: |
        # Mimic build machine
        #$env:USERNAME = 'builder'
        # Back up this script in case you need to troubleshoot it later on
        $scriptDir = "$(Split-Path -parent $MyInvocation.MyCommand.Definition)"
        $scriptPath = [System.IO.Path]::Combine($scriptDir, $MyInvocation.MyCommand.Name)
        $tempFile = [System.IO.Path]::Combine([System.Environment]::CurrentDirectory, 'lastRun.ps1')
        if ($scriptPath -ne $tempFile)
        {
            Copy-Item $scriptPath -Destination $tempFile
        }
        ./build.ps1 'build;pack' -nuget_servers @{
            'servername' = @{
                'url' = "https://..."
                'pat' = '$(System.AccessToken)'
            }
            'servername2' = @{
                'url' = 'https://...'
                'publish_key' = '$(ServerSecretPublishKey)'
            }
        } `
        -b $(Build.SourceBranchName) `
        -addoperations publish=${{parameters.publish}};noincremental=${{parameters.noincremental}}
And in build.ps1, handle the parameters as necessary.
param (
    # Operations can be added using a simple command line like this:
    #   build a -add_operations c=true,d=true,e=false -v
    #   =>
    #   a c d
    [string] $addoperations = ''
)
...
foreach ($operationToAdd in $addoperations.Split(";,"))
{
    if ($operationToAdd.Length -eq 0)
    {
        continue
    }

    $keyValue = $operationToAdd.Split("=")
    if ($keyValue.Length -ne 2)
    {
        "Ignoring command line parameter '$operationToAdd'"
        continue
    }

    if ([System.Convert]::ToBoolean($keyValue[1]))
    {
        $operationsToPerform = $operationsToPerform + $keyValue[0];
    }
}
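For readers outside PowerShell, the same key=value parsing logic can be sketched in Python (the function name is mine and this is an illustration of the technique, not part of the original build.ps1; malformed entries are reported and skipped, as above):

```python
def parse_operations(addoperations: str) -> list[str]:
    """Parse 'a=true;b=false' (';' or ',' separated) into the list of
    operation names whose value is a true boolean, skipping malformed
    entries with a warning."""
    operations = []
    for item in addoperations.replace(",", ";").split(";"):
        if not item:
            continue
        key_value = item.split("=")
        if len(key_value) != 2:
            print(f"Ignoring command line parameter '{item}'")
            continue
        if key_value[1].strip().lower() in ("true", "1"):
            operations.append(key_value[0])
    return operations
```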
This allows you to run all the same operations locally on your own machine and minimizes the amount of YAML file content.
Please notice that I have also added copying of the last executed script as a lastRun.ps1 file.
You can use it after a build if you see a non-reproducible problem but want to run the same command on your own machine to test it.
You can use the ` character to continue PowerShell execution on the next line, or, in case it's already a complex structure (e.g. @{), it can be continued as it is.
But even though the YAML syntax is minimized, it still needs to be tested if you want different build phases and multiple build machines in use. One approach is to have a special kind of argument, -noop, which does not perform any operation but only prints what was intended to be executed. This way you can run your pipeline in no time and check that everything that was planned to be executed will actually get executed.
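The -noop idea could look something like this minimal Python sketch (hypothetical names; in practice this would live in build.ps1 next to the parameter handling above):

```python
def run_operations(operations, noop=False):
    """Execute each requested operation, or, with noop=True, only print
    the plan without executing anything."""
    executed = []
    for name in operations:
        if noop:
            # Dry run: report what would happen, execute nothing
            print(f"[noop] would execute: {name}")
        else:
            print(f"executing: {name}")
            executed.append(name)
    return executed
```

Running the pipeline once with -noop confirms the plan (phases, operations, machines) before committing to a full build.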
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | KyleMit |
| Solution 2 | KyleMit |
| Solution 3 | |
| Solution 4 | |
| Solution 5 | Sysanin |
| Solution 6 | |



