Azure DevOps Pipelines: Practices for Scaling Templates

Developer Support

In this blog series, John Folberth explores what are considered "good practices" when leveraging one repository of YAML templates.


This article is part of a larger series on leveraging YAML templating in Azure DevOps Pipelines. Throughout the series, I have received numerous requests and questions about what is considered "good practice" when consolidating YAML templates into one repository.



The objective is to create a structure in which we can define a task, job, stage, or variable template once and let all of our pipelines reuse that code. The intent is to achieve the following outcomes:

  • Reduce pipeline creation time
  • Ease maintenance of pipelines
  • Provide an extra layer of security on who can create pipelines
  • Consistently be able to deploy various apps in a similar fashion


Template Expansion

To best understand and architect a YAML pipeline via templates, we have to understand how the template expansion process works. When we run our Azure DevOps pipeline, it collects all of the YAML templates and associated files, variables, and so on, and expands them into ONE file. This is the azure-pipeline-expanded.yml file, which can be downloaded after a job completes. From an architecture and troubleshooting perspective, it is important to understand that this single file is what is submitted to Azure DevOps. All parameters being passed in must be available at pipeline compilation time; runtime variables, by contrast, are retrieved at pipeline execution.
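As a minimal sketch of this distinction (the parameter name and scripts here are hypothetical), compile-time template expressions are replaced during expansion, while macro syntax survives into the expanded file and is resolved when the step runs:

```yaml
# Hypothetical snippet illustrating compile-time vs. runtime resolution
parameters:
  - name: environment        # must be known at compilation (expansion) time
    type: string
    default: dev

steps:
  # ${{ }} template expressions are replaced during expansion,
  # before the single expanded file is submitted to Azure DevOps
  - script: echo "Target environment is ${{ parameters.environment }}"

  # $( ) macro syntax remains in the expanded file and is resolved
  # at runtime, when the step actually executes on the agent
  - script: echo "Build number is $(Build.BuildNumber)"
```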


One Repository

In order to help achieve these goals, it is recommended to store all of the templates in a dedicated Git repository. Azure DevOps Pipelines supports this functionality via the resources block. By doing this, all of our pipelines can retrieve their templates from one consolidated location. In addition, by putting the templates in a separate repository, we can introduce another layer of security around who can update this repo.
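A minimal sketch of wiring this up (the project, repository, and template file names here are hypothetical):

```yaml
# azure-pipelines.yml in an application repo, consuming templates
# from a dedicated (hypothetical) "pipeline-templates" repository
resources:
  repositories:
    - repository: templates              # alias used in template references
      type: git                          # Azure Repos Git
      name: MyProject/pipeline-templates
      ref: refs/heads/main               # pin to a branch (or tag) for stability

stages:
  # resolved from the templates repo at pipeline compilation time
  - template: stages/build.yml@templates
```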

Continue to John’s full post here.





  • Michael Taylor

    We use a dedicated templates repo for our builds, but one big caveat to this approach is that your build paths change. With a single checkout, the build agent puts the "code" repo into the base sources directory (or something like that). However, when you check out multiple repos, it cannot do that anymore, so it puts them into nested folders. This messes up any code that assumes the source is in the base directory. Your tasks have to be multi-repo aware, otherwise they won't find the stuff they are looking for. This can complicate things and is annoying to debug.
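    For context, the path change described above can be sketched like this (the second repository name is hypothetical):

```yaml
resources:
  repositories:
    - repository: tools        # hypothetical second repository
      type: git
      name: MyProject/build-tools

steps:
  # With a single implicit checkout, 'self' lands directly in
  # $(Build.SourcesDirectory), i.e. $(Pipeline.Workspace)/s
  #
  # With multiple checkout steps, each repo is nested instead:
  - checkout: self     # $(Pipeline.Workspace)/s/<self-repo-name>
  - checkout: tools    # $(Pipeline.Workspace)/s/tools
```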

    • John Folberth

      Are you publishing your code as artifacts as part of your pipeline? In the scenario above, there is one repository containing all of the YAML code, which is consumed by the pipeline, not published.

      The build should create a reusable artifact as part of the pipeline run. The YAML tasks are consumed when the job is expanded.

      For your given scenario, I usually run separate publish tasks for each folder. That loads the artifacts into separate folders that will always have the same path relative to the agent.
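      A rough sketch of that approach (the folder and artifact names here are hypothetical):

```yaml
steps:
  # One publish task per folder, so each artifact keeps a stable,
  # predictable path relative to the agent when downloaded later
  - task: PublishPipelineArtifact@1
    inputs:
      targetPath: '$(Build.SourcesDirectory)/src/WebApp/output'
      artifact: 'webapp'

  - task: PublishPipelineArtifact@1
    inputs:
      targetPath: '$(Build.SourcesDirectory)/src/FunctionApp/output'
      artifact: 'functionapp'
```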
