Azure DevOps is a great platform for automated build and release actions, and it offers two ways to define them.
Originally it featured GUI-based pipelines, which were already available back when Azure DevOps was still called TFS. When Azure DevOps was introduced, YAML pipelines were introduced along with it, and rightfully so: classical pipelines are caterpillars, while YAML pipelines are the beautiful butterflies.
In this blog post I would like to take you on my journey of converting existing classical pipelines and introduce the tool I have created to do this automatically. In short: the tool to mature your caterpillars into butterflies!
Classical pipelines (aka Build / Release Definitions) have a rich GUI with a drag-and-drop look and feel, and configuring tasks is easy. These classical pipelines are friendly to novice users and have the best feature compatibility in Azure DevOps. However, they are hard to use in a “Create-Once-Use-Many” fashion and hard to generate automatically, which is exactly what you strive for when doing automation and CI/CD.
YAML pipelines are plain code stored in your repository; since they are just YAML they could theoretically be generated on the fly, and they offer richer templating functionality which allows for repeatable use. However, these pipelines have a steeper learning curve, since you will need to learn an additional language and syntax. Furthermore, YAML pipelines in Azure DevOps are still being developed, and as such you might not have all features available.
The major competitors of Azure DevOps, like Jenkins, CircleCI, Bamboo and GitLab, all use pipelines in the form of pipeline as code in YAML notation. I think we can safely say that YAML, or pipeline as code, is the industry standard.
This blog is not a comparison of Classical vs YAML pipelines in Azure DevOps and which one you should choose. If you want a good read on this, please read this blog on Medium or this blog by Marcus Felling, which explain the topic very well.
The problem with converting from Classical to YAML Pipelines
Microsoft does not offer a way to convert Classical to YAML pipelines out of the box. In fact, they suggest converting the pipelines manually. This means that for every pipeline you have, you would need to:
- Create an empty YAML pipeline file;
- Open the pipeline (aka Build / Release Definition) which you want to convert;
- For every step, click the “View YAML” button;
- Copy the presented YAML snippet;
- Paste the copied step into your YAML file;
- Rinse… repeat.
This poses two main problems:
- (Nested) Task Groups cannot be converted.
- Additional pipeline properties cannot be converted such as:
- Schedules
- Triggers
- Variables
- Variable Groups
- Agent Pools
- Jobs (should you have multi-job pipelines)
- Dependencies
- Other pipeline specific properties
Why is this a problem?
Well for numerous reasons….
Imagine being a company who hopped on the CI/CD bandwagon some years ago using TFS. Experienced users will know that in TFS the only form of pipelines is classical and since they are GUI-based pipelines they are hard to template. Luckily Microsoft introduced the Task Group which basically enables to take a set of steps / tasks and bundle them together (with or without input parameters) to template re-usable steps. These Task Groups know some single digit version numbering (I will go into this topic another time because it deserves a separate blog) which makes it a decent take at templating for classical steps.
Still this poses a problem because you can’t convert a complete Task Group into YAML. Again for each task group you would have to follow the above mentioned steps and extract the separate steps. Then you would need to decide if you want to convert your task groups into YAML templates and call them from your main pipeline or if you want to extract the Task Group steps and paste them into your actual pipeline. To make problems worse you can nest Task Groups which means a Task Group can call another Task Group which means you would need to find the correct version of the nested Task Group extract all the steps and place them in the correct order. This can become quite complex quite soon when you have to do this manually. Especially since there is no limit to the amount of nest levels you can implement.
Then there is the issue with the other properties which are usually used in a pipeline. Properties like triggers, schedules and variables are just as important as your steps because they define the conditions under which the pipeline operates. In YAML pipelines you have two options for defining these properties:
- Inside the YAML file as plain code (see example)
- Inside the Build definition (the classical way)
The preferred way is inside the YAML file, since these properties should be part of your definition and are subject to change while prototyping. Having these properties in the definition (in Azure DevOps a skeleton called the build definition wraps the YAML file) poses a problem because you can’t prototype changes to see if they fit your CI/CD solution without affecting all users. With YAML code you can just create a new branch to test your changes without affecting your mainline. So you definitely want to store these properties as YAML code.
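To give you an idea of what that looks like, below is a minimal sketch of a trigger and a variable defined directly in the YAML file. The values are purely illustrative and not taken from any pipeline in this post.

```yaml
# Minimal illustration of pipeline properties living in the YAML file itself.
trigger:
  branches:
    include:
    - master                        # branch name is just an example

variables:
  buildConfiguration: 'Release'     # hypothetical variable
```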
Since Microsoft does not offer their users a way to extract these properties as YAML syntax, users will need to open their classical definitions one by one, find the properties in the various tabs and figure out the correct YAML syntax that goes with them. If you want to convert schedules, you will need to use cron syntax and convert your current schedule to the UTC timezone. In short: a lot of manual labor is involved.
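As a hedged example of what such a schedule conversion looks like (the time, branch and display name below are made up and not taken from the example pipeline later in this post):

```yaml
schedules:
- cron: '0 3 * * 1-5'        # cron runs in UTC: 03:00 UTC is 04:00 CET, Monday to Friday
  displayName: Nightly build
  branches:
    include:
    - master
  always: false              # only run when there are changes
```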
Back to the initial problem. If you have invested a lot in classical pipelines and / or (nested) Task Groups, you might have dozens of pipelines, each with one or more nested Task Groups. If you have read the above paragraphs, you know this means a lot of manual labor and that the process is very error prone.
When I was working at my customer we had just finished our migration from TFS to Azure DevOps and from TFVC to Git. We had a stable environment and wanted to take the next step: start using YAML pipelines instead of classical pipelines to harness the full potential of Pipeline as Code along with Infrastructure as Code.
We started with assessing our current situation:
- 35 Build Definitions
- 30 Release Definitions
- 80 Task groups (in various nest levels)
Since YAML was new to us, we decided to do a small proof of concept using the conversion guide by Microsoft. We created one or two pipelines and struggled a lot with the syntax (YAML is unforgiving, and trust me, you will cuss at it A LOT when you are starting out because of all the unexpected syntax errors).
We soon saw that converting these pipelines one by one, step by step, scouring through our library of nested Task Groups to find the correct referenced version, and then converting every referenced step one at a time into a YAML template, would not become a happy conversion for us.
Surely there would be tooling for this, right? We searched quite thoroughly and could not find anything to convert our pipelines. So what does a good automation-minded person do next? Create a tool yourself, of course!
The Solution
I was already quite adept at using the Azure DevOps REST API for automating various aspects of using and maintaining Azure DevOps, and quite adept at PowerShell. I had already created some modules and I always look closely at how other users on GitHub create PowerShell modules and learn from them.
I wanted to give myself a real challenge and create a tool which could automatically convert our pipelines: not only the steps, but also the other properties, while taking Task Groups into account.
The end result was the tool I published called AzDoAPITools (PSGallery / GitHub). I agree the name does not directly showcase that this tool is primarily about converting pipelines, but please allow me to explain. This module should in the end incorporate the conversion tool as well as my previously made scripts. These scripts, which I created earlier to automate certain parts of Azure DevOps, are still loose scripts and need to be refactored into this PowerShell module (a lot of wrapping and plumbing has already been done in the module). I’m also closely watching the splendid VSTeam module, which does a great job at querying and changing Azure DevOps. The intention of AzDoAPITools is to offer more than just converting pipelines; the conversion was simply the initial functionality which started the module, and the rest takes time.
How does it work?
In this blog I will only touch the surface of the tool; in part 2 I will go into the details of how the tool does its job.
In short, the module works like this:
- It grabs one or more Task Groups / Build Definitions (Release Definitions are not supported yet)
- It will iterate over each Task Group / Build Definition and gather the following properties:
- Agent Pools
- Triggers
- Schedules
- Variables (queueable variables become runtime parameters)
- Variable Groups
- Input Parameters (Task group only)
- Jobs (multi-job build) & dependencies
- Steps
- Other Build Definition specific properties
- For each step found it will:
- Determine if it is a regular step or a call to a Task Group:
- Task Group: iterate over each step found inside the Task Group and either expand it or turn it into a template call. If another Task Group is found, it goes one level deeper until it returns to the original loop (see the sketch after this list).
- Step: format the step
- Determine the inputs for the called step
- Format the used variables and correct the syntax ($() versus ${}) for non-predefined variables in Task Groups
- Continue with the next step
- Output the result as a YAML file or a PowerShell object for later use.
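To make the Task Group expansion a bit more concrete, here is a minimal sketch of the recursion idea. This is not the module’s actual code: the Get-TaskGroupVersion helper is hypothetical, and the metaTask check reflects how, to the best of my knowledge, task group references show up in the build definition JSON returned by the REST API.

```powershell
# Illustrative only: NOT the module's actual implementation.
function Expand-Steps {
    param([object[]] $Steps)

    foreach ($step in $Steps) {
        # Task group references are marked as 'metaTask' in the build definition JSON
        # (to the best of my knowledge); regular tasks have a different definition type.
        if ($step.task.definitionType -eq 'metaTask') {
            # Resolve the referenced Task Group version and recurse into its steps.
            # Get-TaskGroupVersion is a hypothetical helper wrapping the REST API call.
            $taskGroup = Get-TaskGroupVersion -Id $step.task.id -Version $step.task.versionSpec
            Expand-Steps -Steps $taskGroup.tasks
        }
        else {
            # A regular step: emit it in the order in which it was encountered.
            $step
        }
    }
}
```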
The tool takes care of YAML syntax, converting timezones for schedules, and properly extracting every step used in a build definition in the correct order.
Basically it feels like automatically pressing the “View YAML” button for a single pipeline / Task Group in succession, pasting the results in the right order and gathering the other important properties of your build definition along the way.
An example
In my test environment, where I do a lot of proof of concepts as well as create these tools, I have a classical Build Definition which I want to convert to YAML.
As you can see my Build Definition contains one job and two steps:
- A call to a Task Group called “testtaskgroup”
- A PowerShell task called “Build Definition Step”
The Task Group “testtaskgroup” also contains two steps:
- A PowerShell task called “Task Group Step”
- A call to a Task Group called “nestedtaskgroup”
Inside the Task Group “nestedtaskgroup” there is one step:
- A PowerShell task called “Nested Task group Task”
So if I wanted to convert purely the steps into a YAML pipeline, the following order would apply (for the sake of this blog I’m assuming we are expanding Task Groups into the resulting YAML file):
- Call a PowerShell task called “Task Group Step”
- Call a PowerShell task called “Nested Task Group Step”
- Call a PowerShell task called “Build Definition Step” (note that the Task Group is called before the PowerShell task inside the definition)
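Roughly speaking, the expanded steps section of the generated YAML would look something like the sketch below. The display names come from the example above; the inline scripts are placeholders, and the file the tool actually generates is shown further down.

```yaml
steps:
- task: PowerShell@2
  displayName: 'Task Group Step'
  inputs:
    targetType: inline
    script: Write-Host 'placeholder for the task group step'
- task: PowerShell@2
  displayName: 'Nested Task Group Step'
  inputs:
    targetType: inline
    script: Write-Host 'placeholder for the nested task group step'
- task: PowerShell@2
  displayName: 'Build Definition Step'
  inputs:
    targetType: inline
    script: Write-Host 'placeholder for the build definition step'
```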
On to the other properties:
I’m skipping the schedules for this blog post since they are quite complex and would make this too long of a read. What we see here is:
Variables:
- parametervar – settable at queue time
- secrettestvalue – secret variable (will be skipped for security reasons)
- staticvar – variable with a static value
Variable Groups:
- calling testgroup
Triggers:
- CI trigger
- Batch Changes
- Include Branch: Master Branch
- Include Path: Pathtoinclude
- Exclude Path: Pathtoexclude
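To make the mapping a bit more tangible, here is a rough sketch of how these properties could end up in YAML. The exact output of the tool may differ, the default and static values are made up, and the secret variable is intentionally missing because it has to be re-added manually.

```yaml
# Sketch of the converted properties; not the literal output of the tool.
parameters:
- name: parametervar          # "settable at queue time" becomes a runtime parameter
  type: string
  default: 'changeme'         # hypothetical default value

variables:
- group: testgroup            # the linked variable group
- name: staticvar
  value: 'somevalue'          # hypothetical static value

trigger:
  batch: true                 # "Batch changes" option
  branches:
    include:
    - master
  paths:
    include:
    - Pathtoinclude
    exclude:
    - Pathtoexclude
```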
The steps in AzDoAPITools
After installing and setting up the module, you need to create an object or an array of objects for each pipeline which you want to convert.
Either create an array of names and run:
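(The snippet below is a sketch on my part; the exact parameter name is an assumption, so check Get-Help for the cmdlet.)

```powershell
# The parameter name -NamesList is an assumption; verify it with:
#   Get-Help Get-AzDoAPIToolsDefinitionsTaskGroupsByNamesList
$names = @('Example-Pipeline')
$definitionstoconvert = Get-AzDoAPIToolsDefinitionsTaskGroupsByNamesList -NamesList $names
```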
Or:
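(Again a sketch; the -Id parameter name and the ID value are assumptions.)

```powershell
# The parameter name -Id is an assumption; verify it with:
#   Get-Help Get-AzDoAPIToolsDefinitionsTaskGroupsByID
$definitionstoconvert = Get-AzDoAPIToolsDefinitionsTaskGroupsByID -Id 42   # 42 is a made-up definition ID
```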
Depending on whether you want to get your definition by its unique ID or by its name. If you don’t know the names of your Build Definitions, the module also has a command that returns an array of all available Build Definition names.
The result of either Get-AzDoAPIToolsDefinitionsTaskGroupsByNamesList or Get-AzDoAPIToolsDefinitionsTaskGroupsByID, which in this example we store in $definitionstoconvert, will be used in the next step to actually convert the pipeline to YAML.
If you inspect $definitionstoconvert, you will see an array of PSObjects which contain the metadata AzDoAPITools needs to query the Azure DevOps REST API:
The next step is to get the YAML syntax of the definitions which you just pulled from the REST API:
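(Sketch on my part: the output path and the way the definitions object is passed in via a -Definitions parameter are assumptions; the switches are the ones explained below.)

```powershell
# -Definitions and the output path are assumptions; -ExpandNestedTaskGroups,
# -Outputasfile and -Outputpath are the switches described below.
Get-AzDoAPIToolsDefinitionAsYAMLPrepped -Definitions $definitionstoconvert `
    -ExpandNestedTaskGroups `
    -Outputasfile `
    -Outputpath 'C:\temp\converted-pipelines'
```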
I am using Get-AzDoAPIToolsDefinitionAsYAMLPrepped to iterate over the definition, get all the steps and properties, and convert them into YAML syntax.
I’ve used -ExpandNestedTaskGroups to expand found Task Groups as single steps in the resulting YAML file. If omitted, it will refer to Task Groups as a template.
Furthermore, I have supplied the -Outputasfile switch along with -Outputpath to indicate to the module that I want a YAML file created, named after the pipeline’s name (“Example-Pipeline.yml”), inside the path specified in -Outputpath. If the path does not exist you will be prompted to create it, so make sure you have sufficient rights. If -Outputasfile is omitted, the function returns a PSObject with all YAML-prepped build definitions which can be used later in PowerShell (note this is not converted to YAML yet).
The result
Generated YAML File
Explanation
After generating the YAML file you would need to create a new, YAML-based pipeline, point it towards the same repository as the original pipeline and make sure to add any secret variables which were skipped. After that, you should disable the originating pipeline to avoid any conflicts and make sure the YAML pipeline is running fine.
This pipeline is a very simple example of what is possible with AzDoAPITools. Converting this pipeline and iterating over the (nested) Task Groups took less than a second.
I was able to convert the 30 Build Pipelines and the 80 Task groups in around 2-3 minutes.
How to get started?
Using AzDoAPITools will save your company hundreds of hours if you happen to have a large collection of pipelines and Task Groups.
The module is still being expanded with Features and properties of Build Definitions which were not used in the project I was working on.
The tool is open source and hosted under the MIT license. To get started, open a PowerShell terminal and type:
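This is the standard PowerShell Gallery install, assuming the published module name matches the name used throughout this post:

```powershell
# Installs AzDoAPITools from the PowerShell Gallery for the current user
Install-Module -Name AzDoAPITools -Scope CurrentUser
```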
You will need the following modules installed:
- PowerShell-YAML (since PowerShell has no native support for YAML)
This dependency is only needed if you want to export the converted pipeline as a *.yml file.
You can find the module on the PowerShell Gallery and extensive documentation on GitHub. Should you run into issues, please raise a bug there, and if you would like to see a particular feature, contact me there as well. Since it is open source, contributions are accepted.
My Next Blog will go into more details on how to install and use the module.
If you have any questions in general about DevOps and/or CI/CD, please contact me. I offer consultancy & training.
Update November 12th 2020
It turns out that Microsoft has finally decided to improve the experience of converting Classical Pipelines to YAML Pipelines in Sprint #178 of Azure DevOps.
At the time of writing this feature is still being rolled out and I am eagerly awaiting when it is deployed to my testing environment to compare the new functionality with my tool.
The description of the new functionality is vague to say the least, so I’m not sure how this improvement will turn out. I hope for the best, although I feel that having an alternative that handles multiple pipelines / Task Groups / Release Definitions will still be a viable product.
Of course I will write a new blog to look at the new feature and compare it with my tooling.