Converting Classic Azure DevOps Pipelines to YAML

Premier Developer


In this post, App Dev Manager Guru Satish Piduru and Premier Field Engineer Aaron Perry explain how to convert classic Azure DevOps Pipelines to the new YAML pipeline format.


YAML-based Pipelines are a great feature of Azure DevOps that enables you to configure your CI/CD strategy as code, where the pipeline definition lives alongside your code. This enables your DevOps teams to take advantage of pull requests, code reviews, history, branching, templates, and much more. If you are completely new to Azure DevOps and want to start with YAML, please see our Azure Pipelines documentation for great getting-started guides and examples. However, if you are familiar with the Classic Pipeline GUI and are looking to get started with YAML and convert some of your existing Pipelines, this blog should help you get familiar with the concepts, patterns, and syntax that will enable you to convert any Pipeline.
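To give a taste of pipeline-as-code before we start converting, here is a minimal azure-pipelines.yml sketch; the branch name and script step are placeholders, not part of the lab's pipeline:

```yaml
# Minimal illustrative pipeline (not the PartsUnlimited pipeline)
trigger:
- master                       # run on every push to master

pool:
  vmImage: 'ubuntu-latest'     # a Microsoft-hosted agent

steps:
- script: echo "Hello from YAML"
  displayName: 'Run a one-line script'
```

Because this file lives in the repo, every change to it is reviewable through the same pull-request flow as application code.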

Note: Converting complex builds will require additional work, which is documented well in our YAML schema documentation. New features such as Multi-Stage Pipelines are also being developed rapidly for Azure DevOps and YAML.
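For reference, a Multi-Stage Pipeline extends the same schema with a stages section. The sketch below is illustrative only; the stage, job, and environment names are hypothetical:

```yaml
# Hypothetical multi-stage sketch: build, then deploy
stages:
- stage: Build
  jobs:
  - job: BuildJob
    steps:
    - script: echo "Building"

- stage: Deploy
  dependsOn: Build             # runs only after Build succeeds
  jobs:
  - deployment: DeployWeb
    environment: 'staging'     # hypothetical environment name
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo "Deploying"
```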

Getting started

We will be using the Azure DevOps Lab for Continuous Integration to set up a continuous integration (CI) pipeline using the classic editor that can then be converted to YAML. If you would like to follow along, please complete the steps listed in the lab to create a classic pipeline that can be converted to YAML. Be sure to follow the steps in the prerequisite instructions that use the Azure DevOps Demo Generator to automatically import all of the required items into a new project in your Azure DevOps organization. If you do not already own an organization, you can create one for free directly in the steps listed above. Once you have completed the short lab, you can follow the steps below to continue with this guide. If you want to use your own separate pipeline, the steps and patterns should be the same but may require additional changes based on your specific requirements.

Once the lab is complete, your pipelines “recent” tab should look something like the image below.

Start the conversion

The conversion process involves two steps. Step 1 is to create a new YAML pipeline from scratch, and Step 2 is to copy the configuration from the classic pipeline into the new YAML pipeline and then make the edits needed to get it working.

Step 1: Create a new YAML Pipeline

Click the New Pipeline button in the top right corner of the page. You should see a creation wizard that starts on the ‘Where is your code?’ segment like the below image.

Where is your code?

If you are following the steps from the lab, you should select Azure Repos Git. If you are using your own repo, you will need to select and connect to that repo. On the Select tab, select the appropriate repo; in our case, it will be the PartsUnlimited repo. On the Configure tab, if you are creating a new Pipeline you should select the appropriate template for it. For the purposes of this blog, we will be replacing the contents of this file in the next step, so the selection is largely irrelevant except for the trigger and possibly the comments at the top. On the next step, you should see a text editor that looks like the image below.

YAML Editor

For simplicity, we will edit the YAML file in the text editor directly before saving. If you prefer, you can use your favorite editor and save the file to the repo.

Step 2: Open the existing Classic pipeline for editing

From the left navigation, select Pipelines, then right-click and choose Open in a new tab to open your Pipelines view in a new tab. We can either open the CI pipeline and hit the Edit button, or hover over the pipeline from this view, click the ellipsis (…), and select Edit from the context menu. From here, you should see your pipeline settings and steps. For this walkthrough, we are going to select the Agent job 1 step to view the agent job settings; you should be looking at a window like the image below.

Note: The PartsUnlimited Classic pipeline has “Enable Continuous Integration” checked under the “Triggers” tab. Any change to the repo, including adding the YAML file, will trigger and run the CI pipeline. To prevent this, uncheck “Enable Continuous Integration”.

You could also choose to create a new branch after you select Save and Run from the YAML editor so that you can make changes to the file without triggering your old Classic Pipeline.

Typically, after a conversion I rename, disable, and remove permissions from the old Pipeline to effectively archive it or just delete it completely.

Edit Pipeline

From this window, select the View YAML link in the top right section of the Agent job 1 step. When you click the link, a window should pop up with the YAML of the agent job, as shown in the image below.

View YAML popup

Click the “Copy to Clipboard” button to copy the YAML. Go back to your original browser tab that has the previously created YAML file. Highlight everything starting at line 9, where it says ‘pool:’, and paste to replace the rest of the contents of the file with the YAML copied from the classic Pipeline. Your open editor should now look like the image below.

Converted YAML

You will probably notice some warnings about the variables used in our code. This is due to the way the YAML is generated by the View YAML link. Every converted Pipeline will require some editing to work correctly. In this case, we must declare some variables in the code that were previously either set at runtime or declared elsewhere in the GUI of the classic editor. I will explain the changes here and why they are made, but if you wish to just copy the fully converted classic Pipeline YAML contents, you can find them here:

# Script 1
# ASP.NET
# Build and test ASP.NET projects.
# Add steps that publish symbols, save build artifacts, deploy, and more:
# https://docs.microsoft.com/azure/devops/pipelines/apps/aspnet/build-aspnet-4

trigger:
- master

pool:
  name: Azure Pipelines
  vmImage: 'vs2017-win2016'
  demands:
  - msbuild
  - visualstudio
  - vstest

variables:
  Parameters.solution: '**/*.sln'
  BuildPlatform: 'Any CPU'
  buildConfiguration: 'Release'
  Parameters.ArtifactName: 'Drop'

steps:
- task: NuGetToolInstaller@0
  displayName: 'Use NuGet 4.4.1'
  inputs:
    versionSpec: 4.4.1

- task: NuGetCommand@2
  displayName: 'NuGet restore'
  inputs:
    restoreSolution: '$(Parameters.solution)'

- task: VSBuild@1
  displayName: 'Build solution'
  inputs:
    solution: '$(Parameters.solution)'
    msbuildArgs: '/p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:SkipInvalidConfigurations=true /p:PackageLocation="$(build.artifactstagingdirectory)\\"'
    platform: '$(BuildPlatform)'
    configuration: '$(BuildConfiguration)'

- task: VSTest@2
  displayName: 'Test Assemblies'
  inputs:
    testAssemblyVer2: |
     **\$(BuildConfiguration)\*test*.dll
     !**\obj\**
    platform: '$(BuildPlatform)'
    configuration: '$(BuildConfiguration)'

- task: PublishSymbols@2
  displayName: 'Publish symbols path'
  inputs:
    SearchPattern: '**\bin\**\*.pdb'
    PublishSymbols: false
  continueOnError: true

- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact'
  inputs:
    PathtoPublish: '$(build.artifactstagingdirectory)'
    ArtifactName: '$(Parameters.ArtifactName)'
  condition: succeededOrFailed()

Code changes made

The first change we need to make is to fix the variable issues that are listed in the comments copied from the View YAML link. You could hard-code the variables’ values in each required task, but the recommended practice is to create a variables section containing the values that can be used throughout the Pipeline and easily changed between different pipelines. If you want more information on variables, please view our Defining Variables and Predefined Variables documentation. Do not place secret variables in your YAML file. See the Secret Variables section of the Defining Variables documentation for instructions on how to use them and other variable types. I created a variables section with variables for each of the listed issues, pasted them between the “pool” and “steps” sections, and removed the error statements.

variables:
  Parameters.solution: '**/*.sln'
  BuildPlatform: 'Any CPU'
  buildConfiguration: 'Release'
  Parameters.ArtifactName: 'Drop'
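If any of these values ever need to be secret, one option is to keep them out of the YAML file entirely by referencing a variable group defined under Pipelines > Library. This is a sketch only; the group name below is hypothetical, and note that mixing in a group requires switching the variables section to list syntax:

```yaml
variables:
- group: partsunlimited-secrets   # hypothetical variable group holding secret values
- name: buildConfiguration        # plain variables use name/value pairs in list form
  value: 'Release'
```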

The next change that needs to be made is to the pool. This solution requires a specific Microsoft-hosted agent that has Visual Studio 2017 installed. In the classic editor these settings lived in the Pipeline settings, so they did not get converted by the Agent job View YAML link. Since the classic editor required the vs2017-win2016 agent, we need to add the code shown below to the pool segment to ensure we have the correct agent. You can find more about the hosted agents and the YAML syntax required for each agent in our Hosted Agents documentation.

pool:
  name: Azure Pipelines
  vmImage: 'vs2017-win2016'
  demands:
  - msbuild
  - visualstudio
  - vstest

Your YAML file should now be complete. It is always a good practice to document information about the Pipeline at the top of the YAML file. In this case, I just added that it was converted from a previous Classic Pipeline. The trigger portion of the YAML file is what sets our Pipeline to be a CI Pipeline that will run when any change is made to the ‘master’ branch. Please see our Triggers documentation for additional methods of setting triggers.
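As a sketch of the trigger options mentioned above, the shorthand list form can be expanded to add branch and path filters; the branch and path patterns below are placeholders, not part of the converted pipeline:

```yaml
# Expanded trigger sketch: CI runs on master and release branches,
# but not for docs-only changes (patterns are placeholders)
trigger:
  branches:
    include:
    - master
    - releases/*
  paths:
    exclude:
    - docs/*
```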

Note: if you need help adding a new task or are having trouble with the format or syntax of a task you are converting, take a look at the “Show assistant” button in the top right of the editor frame. This opens a tasks GUI that enables you to search for and add tasks similar to the classic editor. Here, you can select a task, enter the values, and then add it to your YAML file. This is very helpful for people familiar with the Classic editor but new to YAML.

We are now ready to check in our file and test our Pipeline. Click Save and Run in the top right of the window to commit your file to your repo and run your first YAML pipeline.

Remember: as stated in Step 2 of this blog, if you have not previously unchecked “Enable Continuous Integration” under the “Triggers” tab of the old Classic Pipeline, checking in the new YAML file will trigger a build on the repo.

After it runs you should see the following:

Completed Run

Your CI build should now be set up to trigger on any change to your repo, including edits to the YAML file itself. To test, open and edit any file in the repo and commit the change. I simply navigated to the index.cshtml file in the repo as shown in the image below, edited the ViewBag.Title with some extra text, and committed the change to the repo.

Edit page

After you hit commit, navigate back to the Pipelines view to verify that your CI trigger worked and your Pipeline is running. If so, congratulations: you have a working converted CI Pipeline!

In Conclusion

Hopefully, this basic walkthrough enables you to begin converting and creating new Pipelines in YAML to take advantage of all its features. The Microsoft Pipelines documentation has many more examples, scenarios, and getting-started guides. You can also find additional documentation and samples in the azure-pipelines-yaml GitHub repo.

6 comments


  • Gintautas Petkevicius

    Hey Guys,
    I’m really missing the ability to test YAML pipelines locally before committing. I wonder if you have the same problem too? Converting complex pipelines to YAML can be a very tedious job, and you want to test it first before the commit. Could you please share your thoughts and experience on this topic?

    • Aaron Perry (Microsoft)

      Hi Gintautas,

      There is no great way to test locally before making a commit. Currently there is too much logic that cannot happen client side only.

      Typically what I do is create a branch based on the work you want to change, then create a new pipeline pointing at that branch and its YAML files. This enables you to work on changes to your YAML pipeline without team disruption, but there is no great way to fully test locally before committing. Make sure CI/CD is selected or deselected depending on whether you want changes made directly in the web editor to kick off your pipeline. You can find a request in our community here that suggests similar experiences to my own:

      https://developercommunity.visualstudio.com/content/idea/366517/ability-to-test-yaml-builds-locally.html

      There you can find information about a server side API that will check syntax and assist with other helpful changes. Hopefully this helps you through the conversion process as you move forward with YAML.

  • mike schellenberger

    I already find the Classic UI difficult to work with when you don’t do DevOps full-time. There are so many options to choose from and things to select, BUT at least you get prompted and are typically provided lists to choose from.

    I see no way in hell that I will be able to do this from a text file where I have to know all the choices and options and tasks and jobs and sources and outputs that are available.

  • Taylor Bogle

    The ability to generate YAML from an existing Classic pipeline only works for the build half of the process. It is nice and easy(ish) to convert classic build pipelines to YAML, but why is there not an equivalent on the release side? From what I can tell, although Microsoft has built out deployment capability through a YAML pipeline, the lion’s share of the focus has been on building, not releasing. As a DevOps engineer, one of my primary responsibilities is to oversee how code gets deployed to real environments. We need to have the right controls in place: approvals, permissions, visibility, checks, etc. I love the idea of being able to make this easily repeatable across the enterprise by creating a standard template and using it across all applications for deployment, but it just seems like the product isn’t there yet.