A closer look

In this blog post I can finally take a deep look at the freshly rolled-out functionality from Microsoft in Azure DevOps which allows users to convert their classic pipelines from the classic interface to YAML code. With this rollout Microsoft has also removed the button which could previously be used to get separate snippets of a pipeline as YAML code.

Why a new blog post?

In my previous blog I previewed the new functionality based on another blog by Microsoft. At first I wanted to edit that old post, but as I played around with the functionality and my own tool I quickly saw that it deserved a separate post.

In this blog post I will mainly look at the new functionality in comparison to my own AzDoAPITools module.

I will do so on the following subjects:

  • General usage
  • Variables & Parameters
  • Schedules
  • Job(s) and dependencies
  • Task Groups / Steps
  • Other pipeline properties

In each section I will take the same pipeline which was featured in my first blog and compare the YAML files generated by both tools. I will then discuss the overlap and differences I find between them.

A fair warning: This will be a fairly long read!

The pipeline(s) which I compared

While working on AzDoAPITools I created one pipeline to test my tool against for various components and traits. I would keep changing that pipeline as I worked on different functionality to see how the tool behaved. When I was testing the Export to YAML functionality I noticed one pipeline could not cover all edge cases, so I created another pipeline; otherwise the first would have become a monstrosity of a pipeline which made no sense at all.

Example Pipeline 1

Example Pipeline 1 is the richest pipeline I had during testing and features the following traits:

  • Single Job
  • Agent job differs from the pipeline-specified agent/pool
  • Job has specific Agent demands to run on
  • Pipeline has specific Agent demands to run on
  • Nested Task Groups as tasks (a Task Group step which links to a Task Group that itself contains another Task Group and a regular step as tasks)
  • 5 Schedules in various timezones, with included/excluded branches or with the batch property
  • Default build number format
  • Various types of variables:
    • Variable settable at Queue time
    • Secret Variable
    • Static Variable
    • Linked Variable Group
  • CI Triggers on both branches and Paths (include and exclude)

You can download the generated YML files via the following links, because I want full transparency:

Example Pipeline 2

This pipeline was created to test some edge cases like job dependencies, multi-job configuration and other pipeline properties. If I come up with additional cases I want to test, they will go into this pipeline. This pipeline features the following traits:

  • Multiple jobs
    • Job 2 is dependent on Job 1
    • Job 1 has a custom condition
    • Job 1 uses a different agent than the pipeline
    • Job 2 uses the agent inherited by the pipeline
    • Job 2 is set to use a multi-agent strategy
    • Job 2 has custom time-out settings
    • Job 3 is agentless
    • Job 3 has the succeededOrFailed() condition
  • No Triggers, Schedules or Variables
  • Build Number is non-default
  • Non-nested Task Group call (one task of the job refers to a Task Group which contains only regular, non-Task-Group steps)
  • Using a different version of the PowerShell task (the old V1 version)
  • Using a Custom Task from Marketplace

I included the agentless job because I wanted to see how my own tool behaves in this case. You can download the generated YML files via the following links, because I want full transparency:

General Usage

The Microsoft export functionality is really intuitive. When you go to the summary of a classic pipeline, open the '…' ellipsis menu at the top and select "Export to YAML". DO NOT CLICK "Export", as that gives you a JSON-formatted pipeline definition which can only be used with the "Import" functionality for classic pipelines.

So this makes it really easy to export your classical pipelines to YAML pipelines.

In comparison, with my own tool you need to be at least a bit PowerShell savvy and know your way around the PowerShell terminal to install the module and then use the commands to actually list and convert your pipelines.

The scope of this conversion tool by Microsoft is build pipelines only, which I can understand, because release definitions would prove quite difficult to convert with the current YAML implementation. Microsoft first needs to include some desperately needed features, such as manual single deployment jobs, before even considering a conversion tool for releases. AzDoAPITools can convert build pipelines like Microsoft's tool does, and additionally converts Task Groups to YAML templates. There is also the option to expand Task Groups which are called in pipelines and other Task Groups, and of course the possibility to handle and convert multiple Task Groups and pipelines at once.

As I said in my previous blog, I think the Microsoft tool is really aimed at smaller companies and/or small projects which often do not have a dedicated owner/admin for Azure DevOps. Most commonly those types of users will prefer ease of use over completeness and will probably not mind applying some manual steps after conversion to be able to use the generated code.

Variables and Parameters

And now on to the actual comparison of properties and what the Microsoft tool produces as usable YAML code. My approach will be to show both outputs per segment and then a comparison view to see the actual differences each tool produces. For this section I will only use Example Pipeline 1, because #2 does not have any variables.

The pipeline

So the pipeline uses various kinds of variables as well as a linked variable group. The variables listed are:

  • parametervar – variable which is settable at queue time. Default value “testvalue”
  • secretTestvalue – secret variable
  • statictestvar – static variable. Default value “staticvalue”

The produced YAML code

Microsoft “Export to YAML”
AzDoAPITools

The comparison

When I wrote my preview blog on the Export to YAML functionality, variables were mentioned as a shortcoming by Microsoft itself.

Because the functionality was not rolled out to my environment yet, I had hoped this would only concern the conversion of non-static variables to runtime parameters, and that they would include static variables and variable groups. It makes sense not to export secret variables, since in YAML they would end up as plain text.

Unfortunately, Microsoft meant that they would not do ANY variable conversion when generating the YAML for a classic pipeline. They do include a comment section which shows you which variables were not exported.

Still I can understand (or at least try to understand) why they made this decision.

YAML pipelines are wrapped into a "Build Definition" when you create a YAML pipeline. So there are two places where you can define variables: inside the actual produced YAML file or inside the Build Definition wrapper.

I feel that Microsoft did not want to make a choice for their users on where to put these variables, and/or whether users wanted settable variables converted to runtime parameters.

Personally, my preference would be to have the variables inside the YAML pipeline as code (for the non-secret variables at least) rather than in the wrapper, where they might be less visible.
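
To make that preference concrete, here is a hand-written sketch (not output of either tool) of how the three variables could be expressed in YAML. The variable group name is my assumption, since the real group name is not shown above:

```yaml
# Hand-written sketch, not tool output. The variable group name is an
# assumption; the secret variable stays out of the YAML on purpose.
parameters:
  - name: parametervar             # "settable at queue time" -> runtime parameter
    type: string
    default: testvalue

variables:
  - name: statictestvar            # static variable as plain code
    value: staticvalue
  - group: example-variable-group  # hypothetical name for the linked variable group
```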

Schedules

Similarly to variables, Microsoft acknowledged in their own blog post that schedules were not fully implemented.

What they mean by that is that schedules in YAML pipelines are always expected in the UTC timezone, and they will not do that conversion for you.

I can understand this quite well. In classic pipelines you have the option to create your schedules in a different timezone, and converting to UTC proved to be the greatest challenge for me when I was working on AzDoAPITools. Especially when you bring Daylight Saving Time (DST) into the equation, plus the fact that you can mix and match various timezones which can differ from the timezone the actual user is in. On top of that there is the possibility of shifting days, when the UTC conversion spans a different day than the one scheduled. A huge puzzle indeed…

Let's see how Microsoft implemented this!

The pipeline

On Example Pipeline 1 I have created 5 different schedules, in different timezones and with different options (a sketch of the expected conversion follows the list):

  • Schedule 1 (Expected behavior: shift 1 hour to the left, no shifting of days, pick up properties)
    • 04:00 UTC + 1
    • runs on Saturday & Sunday
    • Batch Changes
    • Include “master” branch
  • Schedule 2 (Expected behavior: shift 1 hour to the left, no shifting of days, pick up properties)
    • 19:07 UTC + 1
    • runs on Monday through Saturday
    • Batch Changes
    • Include “master” branch
  • Schedule 3 (Expected behavior: no shifting of hours, no shifting of days, pick up properties)
    • 06:00 UTC
    • runs every day of the week
    • Do NOT Batch Changes
    • exclude “master” branch
  • Schedule 4 (Expected behavior: shift 9 hours to the left, days shift one to the left, pick up properties)
    • 08:00 UTC + 9
    • runs on every day except Saturday
    • Batch Changes
    • Include “master” branch
  • Schedule 5 (Expected behavior: shift 9 hours to the right, no shifting of days, pick up properties)
    • 04:00 UTC – 9
    • runs on Saturday & Sunday
    • Batch Changes
    • Include “master” branch
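
To make the expected outcome concrete, here is a hand-written sketch (not output of either tool) of Schedule 1 after conversion to UTC; my assumption is that the classic "batch changes" option maps to always: false:

```yaml
# 04:00 UTC+1 on Saturday & Sunday becomes 03:00 UTC on the same days.
schedules:
  - cron: "0 3 * * 0,6"    # minute hour day-of-month month day-of-week, in UTC
    displayName: Schedule 1
    branches:
      include:
        - master
    always: false          # only run when sources change ("batch changes")
```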

The produced YAML code

Microsoft “Export to YAML”
AzDoAPITools

The comparison

So this comparison clearly shows the different take on schedules.

Microsoft does take the schedules from the classic pipeline and creates YAML notation from them. It literally takes the times exactly as you entered them in your chosen timezone and converts them as if they were UTC. It also leaves the scheduled days as they were in the original. Microsoft clearly states the functionality works like this, so let this be a good reminder: if you work with the Microsoft "Export to YAML" functionality, you will need to check your generated YAML afterwards and correct times and days.

In AzDoAPITools a conversion happens between the timezone given in the classic pipeline and the expected UTC timezone. This shows in Schedules 1/2 by shifting 1 hour to the left, and in Schedules 4/5 by shifting 9 hours left/right, which also impacts the days on which they need to be scheduled in UTC (Schedule 4 only, as it shifts to the previous day). I think my tool also covers DST corrections if your configured timezone happens to be in DST at the time of conversion, but time will tell whether this is resilient enough for every user.

I guess this is the main reason why Microsoft did not implement it. Timezone conversion is hard, especially when any timezone can be freely configured and DST plays a part. This is really hard to make foolproof, and I'm sure my tool will one day encounter some weird occurrence I did not think of. That said, it is always wise to check the generated output for abnormalities, but fewer modifications are needed, if any at all.

Furthermore, I only see some syntactical differences which are most likely a matter of taste, such as true vs "true" in Schedule 3, and a * in cron notation where I specify each day explicitly when the schedule runs on all days of the week.
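
For illustration, both of these cron lines express the same daily schedule (a hand-written sketch, not tool output):

```yaml
# The same daily 06:00 UTC schedule written two ways:
- cron: "0 6 * * *"               # wildcard for "every day" (Microsoft's style)
- cron: "0 6 * * 0,1,2,3,4,5,6"   # every day listed explicitly (AzDoAPITools' style)
```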

All in all a good effort by Microsoft, but like the variables this can mean some manual labor in correcting timezones if you happen to not use UTC already. Especially when days shift, this can lead to confusion.

Job(s) and dependencies

Microsoft did not mention any specific behavior for jobs and dependencies between jobs, so let's get to it!

The pipeline(s)

For testing the job properties I needed both example pipelines, since they cover different properties.

Example Pipeline 1

In this first pipeline I have a single job. Do not pay attention to the actual steps; this is purely about the job properties.

  • Pipeline properties

    • Uses Azure agent with image VS2017-Win2016
    • Default settings on timeout
    • Two pipeline demands:
      • pipelinetestdemand : Exists
      • pipelinetestdemand2 : Equal to “valuedemand2”
  • Job Properties

    • Custom Job Name (Agent job 1)
    • The agent differs from the pipeline-specified option and uses the Default self-hosted pool
    • Two Job Demands:
      • jobinlinedemand1 : Exists
      • jobinlinedemand2 : Equal to “inlinevalue1”

In practice this pipeline will not be able to run, because I have no agents which meet all demands. This is purely an example of how the tools pick up these properties.
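
To show what both tools have to work with, here is a hand-written YAML sketch of how these classic properties map onto a pipeline; the job identifier is my assumption, and the demands are only shown because the classic pipeline defines them (they normally only make sense for self-hosted pools, which is exactly why this pipeline cannot run):

```yaml
# Hand-written sketch, not tool output.
pool:
  vmImage: vs2017-win2016
  demands:
    - pipelinetestdemand                        # "Exists" demand
    - pipelinetestdemand2 -equals valuedemand2  # "Equals" demand

jobs:
  - job: Job1                 # job identifier is an assumption
    displayName: Agent job 1
    pool:
      name: Default           # overrides the pipeline pool with the self-hosted pool
      demands:
        - jobinlinedemand1
        - jobinlinedemand2 -equals inlinevalue1
```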

Example Pipeline 2

This pipeline has multiple jobs with dependencies/conditions between them. I also mix and match agent and agentless jobs (see the YAML sketch after the list).

  • Pipeline Properties
    • Uses Azure agent with image VS2017-Win2016
    • Default settings on timeout
    • No pipeline demands
  • Job Properties
    • General:
      • Default job names
    • Job 1:
      • The agent differs from the pipeline-specified option and uses the Default self-hosted pool
      • No demands
      • Default time-out options
      • Custom condition: "eq(variables['Build.SourceBranch'], 'refs/heads/master')"
    • Job 2:
      • Uses inherited agent from pipeline (VS2017-Win2016)
      • Depends on Job 1
      • Demand: DotNetFramework : Exists
      • parallelism: no multiplier, 25 agents
      • Timeout: 100
      • Job cancel timeout: 50
      • Has the default Succeeded() condition
    • Job 3:
      • Agentless Job
      • No dependencies (runs in parallel)
      • Has the succeededOrFailed() condition, so it should run regardless of whether the pipeline fails or not
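
As a reference for the comparison below, this is a hand-written sketch of the job skeleton I would expect (job identifiers are assumptions; steps omitted):

```yaml
# Hand-written sketch, not tool output.
jobs:
  - job: Job1
    pool: Default                  # self-hosted pool instead of the pipeline pool
    condition: eq(variables['Build.SourceBranch'], 'refs/heads/master')

  - job: Job2
    dependsOn: Job1                # Job 2 waits for Job 1
    pool:
      vmImage: vs2017-win2016      # inherited image, repeated here for clarity
      demands: DotNetFramework
    strategy:
      parallel: 25                 # multi-agent configuration, 25 agents
    timeoutInMinutes: 100
    cancelTimeoutInMinutes: 50

  - job: Job3
    dependsOn: []                  # no dependencies, runs in parallel
    pool: server                   # agentless ("server") job
    condition: succeededOrFailed()
```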

The produced YAML code

Example Pipeline 1
Microsoft “Export to YAML”
AzDoAPITools
Example Pipeline 2
Microsoft “Export to YAML”
AzDoAPITools

The comparison

Example Pipeline 1

Now this is where we start to see the bigger differences between the tools. Let's go over the different interpretations both tools make…

Microsoft omits the "pool" property for the pipeline default agent. Instead, for each job they specify which agent is used (this is better visible in Example Pipeline 2). I feel this is mostly flair and style. I chose to include the pool property and leave it out inside the job property when it is inherited; that felt more true to how the original classic pipeline is built up.

Microsoft did include the display name, whereas AzDoAPITools uses the display name as the unique job name. I guess adding the actual display name as a property is something for my own tool's ToDo list.

Microsoft did include the demands, both the pipeline and the job demands. In AzDoAPITools this is a feature which is still on the ToDo list :) I did a small test which is not in this blog and added a second job without job-specific demands, to see whether the pipeline demands would still transfer over to that job, and they do.

Overall, no big shockers in this example. On to the more complex example.

Example Pipeline 2

In the second example pipeline we see more differences. I will skip the differences which were already mentioned in Example Pipeline 1 and are common to both (names & pool).

What shocks me immediately is that Microsoft did not implement any form of conditions or dependencies. Job 1 had a custom condition and Job 3 the succeededOrFailed() condition; both are non-existent in Microsoft's generated code. The dependency between Job 2 and Job 1 (Job 2 being dependent on Job 1) is also missing from the generated code. Since there was a comment about multi-agent configuration being a factor, I regenerated the pipeline without the multi-agent setting to rule it out as a cause, but that did not change things, unfortunately.

So where AzDoAPITools falls short on demands (for now), it does a better job at picking up dependencies and conditions. On the other hand, I have a new item on the ToDo list: inclusion of the agentless (server) job type.

Both tools fall short on the multi-agent configuration option. Personally I have never worked with this functionality, so it is hard for me to implement properly.

I truly hope Microsoft will fix the dependency/condition shortcomings, because if you ran this generated pipeline your jobs would run under very strange circumstances and might just mess up the deployment to your production environment! If you rely heavily on these job concepts for splitting up jobs, and use specific conditions for when jobs should and should not run, do not use the Microsoft option (at the time of writing).

Task Groups / Steps

Microsoft wrote in their blog post that they would include the unrolling (as they call it) of used Task Groups, so I will compare Microsoft's functionality as if I had flagged the -ExpandNestedTaskGroup option in AzDoAPITools. This option also unrolls any nested Task Groups instead of calling them as YAML templates.
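
To clarify the difference between the two modes, here is a hand-written sketch; the template filename is my assumption:

```yaml
# Mode 1: call the converted Task Group as a YAML template.
steps:
  - template: testtaskgroup.yml   # filename is an assumption
---
# Mode 2: unroll/expand the Task Group's steps inline
# (what -ExpandNestedTaskGroup does, and what Microsoft promised).
steps:
  - task: PowerShell@2
    displayName: PowerShell Script - Task Group Step
    inputs:
      targetType: inline
      script: Write-Host "Hello World"
```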

Let’s dive in :)

The pipeline(s)

Again this is split into two pipelines, because in Example Pipeline 1 I initially only considered one job with a nested Task Group. I included a non-nested Task Group after the initial results I got from Microsoft's tool, to make sure I ruled out nesting as a factor.

Example Pipeline 1

In this first pipeline I have a single job. The content of the steps is default inline scripting with a Hello World. I created this example pipeline to see how Task Groups are handled. In more complex environments, Task Groups are used in multiple layers to re-use components which occur in multiple pipelines, and to apply some kind of versioning/maintainability to Task Group changes.

  • Job 1 Steps:
    • Nested Task Group (testtaskgroup)
    • PowerShell Script – Build Definition step
  • Task Group (testtaskgroup)
    • PowerShell Script – Task Group Step
    • Nested Task Group (nestedtaskgroup)
  • Task Group (nestedtaskgroup)
    • PowerShell Script – Nested Task Group Task

The order of these tasks when unrolled/expanded is important. I expect the following order:

  • PowerShell Script – Task Group Step
  • PowerShell Script – Nested Task Group Task
  • PowerShell Script – Build Definition step

This is because the first called Task Group itself contains another Task Group, which comes after the native step in the parent Task Group. I expect all PowerShell tasks to be version 2.
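
In YAML, the expected unrolled result would look something like this hand-written sketch (display names taken from the lists above; the inline scripts are assumptions):

```yaml
# Expected unrolled step order (sketch, not tool output):
steps:
  - task: PowerShell@2
    displayName: PowerShell Script - Task Group Step
    inputs:
      targetType: inline
      script: Write-Host "Hello World"   # assumed script content
  - task: PowerShell@2
    displayName: PowerShell Script - Nested Task Group Task
    inputs:
      targetType: inline
      script: Write-Host "Hello World"
  - task: PowerShell@2
    displayName: PowerShell Script - Build Definition step
    inputs:
      targetType: inline
      script: Write-Host "Hello World"
```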

Example Pipeline 2

This pipeline has multiple jobs and uses custom tasks (from the Marketplace) as well as non-default versioning. I also included an empty job (Job 3) and a single Task Group which itself is not nested.

  • Job 1
    • Contains a Single Task Group (nestedtaskgroup)
  • Job 2
    • Powershell Script (V1)
    • Replace Tokens (V3 marketplace task)
  • Job 3
    • Contains no steps

I am not expecting any weird ordering here, other than the defined versions.

The produced YAML code

Example Pipeline 1
Microsoft “Export to YAML”
AzDoAPITools
Example Pipeline 2
Microsoft “Export to YAML”
AzDoAPITools

The comparison

Example Pipeline 1

Let's go over the minor differences first. There are some things syntax-wise which Microsoft chose to include, such as:

  • always using "- checkout: self" as the first step
  • using ">" for multi-line scripts in YAML

Nothing too big there. Checkout self is the default option for pipeline jobs, which is why I chose to omit it, and most of the time Microsoft will omit default values to not clutter the output, which I can understand. The same goes for using ">" for multi-line scripts: in the case of AzDoAPITools the different style is probably down to the way "ConvertTo-YAML" handles multi-line values.
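
As a small illustration of those two choices (hand-written, not tool output):

```yaml
steps:
  - checkout: self             # the default behavior; Microsoft emits it explicitly
  - powershell: >              # folded block scalar for multi-line scripts
      Write-Host "Hello World"
    displayName: PowerShell Script
```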

But wait… why am I not seeing the Task Group tasks in the Microsoft-generated code??? This must be a mistake, right?! I knew from the "View YAML" button that Task Groups were not supported, but Microsoft stated in their blog that unrolling of Task Groups would be a supported feature… Did they ship an incomplete product, and is this a bug? Is it intended (I hope not)? Is it because I use nested Task Groups and it cannot see past the first Task Group? Let's at least exclude that factor in Example Pipeline 2!

Example Pipeline 2

I had hoped that the unrolling of Task Groups did not work because I used nested Task Groups. Unfortunately, I was wrong…

WARNING!!!

At the time of writing, PLEASE BE CAUTIOUS about using the Microsoft Export to YAML feature until this is fixed. Not unrolling Task Groups is NOT OK! All your pipelines which use Task Groups WILL break because of this. Be very, very cautious! Microsoft should disable this functionality until it is fixed!

So even in the scenario where I call a single Task Group which itself contains no further Task Groups but just regular steps, it is not unrolled. We can see that clearly in Job 1, which has no tasks in Microsoft's generated code, while the tasks are clearly visible in the GUI and in the code generated by AzDoAPITools.

On Job 2 Microsoft did a good job. Both the V1 of the PowerShell task and the custom Replace Tokens task were reflected correctly. I see some small syntax differences compared to AzDoAPITools, but nothing too serious. Previously, external tasks could not be called by their shortcut name and needed to include the publisher's name and extension name. It appears Microsoft has done some plumbing on their end to make that work, which is great; I can simplify my scripts for grabbing tasks with that change :)
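
For illustration, both of the following should now resolve to the same Marketplace task; the fully qualified spelling is my assumption of how the Replace Tokens extension used to be referenced (publisher.extension.contribution.task@version), and the input value is hypothetical:

```yaml
steps:
  # Shortcut name (what can be used now):
  - task: replacetokens@3
    inputs:
      targetFiles: '**/*.config'   # hypothetical input value
  # Previously required fully qualified form (assumed spelling):
  - task: qetza.replacetokens.replacetokens-task.replacetokens@3
    inputs:
      targetFiles: '**/*.config'
```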

I am not sure where the "persistentcredentials" trait comes from, though. It seems to be related to the V1 usage of the PowerShell task, but I could not find it as a property in the API…

Job 3 is empty in both scenarios, although I have to give Microsoft credit for handling this better with an empty array. Something for my ToDo list :)
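
That is, something along these lines (a sketch; the job identifier is an assumption):

```yaml
# An empty agentless job with an explicit empty steps array:
- job: Job3        # job identifier is an assumption
  pool: server     # agentless job
  steps: []
```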

Other properties

I can be very short on this one. The only things I could find right off the bat were the default pipeline build number format (if left empty in the UI), which is omitted by Microsoft whereas I include it in the output, and the pipeline pool, which is always omitted by Microsoft. Very minor details.
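
For reference, the classic default build number format maps to the YAML name property; this is the single line I include and Microsoft omits:

```yaml
# The classic default build number format expressed in YAML:
name: $(Date:yyyyMMdd)$(Rev:.r)
```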

I have not checked any third-party sources such as GitHub/GitLab etc., which is something I will start doing when I implement them in my own tool.

The conclusion

I really, really, really wanted to say Microsoft did a good job creating this tool, based on my previous blog…

I had really hoped it would just be the schedules that were not converted, plus some interpretation of how variables are converted, and that it would otherwise be on par with my own AzDoAPITools.

Unfortunately I can't. Not at the time of writing. Getting even the fundamentals of their own platform wrong, being Task Groups, dependencies and conditions, means I can only say:

DO NOT USE THIS TOOL RIGHT NOW!

It will only leave you with half-baked YAML pipelines which need so much manual work that it defeats the purpose of an export tool, and it might even cause unexpected behavior or damage your production environment (if you are really reckless and put the output into production straight after exporting).

I have really high hopes of Microsoft fixing these concepts ASAP, and I will be the first to mention it here when they do. I will also file bugs on Microsoft's backlog so they are aware of these defects.

I really wish I had better news; it seems they shipped an incomplete feature. Feel free to try my tool AzDoAPITools instead of, or alongside, Microsoft's tool to get a better result in the meantime.

Thanks for reading this very long post!


Author

I am a senior data engineer who specializes in implementing and optimizing DevOps strategies in data projects, empowering development teams to achieve their true potential.
December 8, 2020
