Introduction
Continuous Integration and Continuous Deployment (CI/CD) pipelines are critical for efficient and reliable code deployment. As projects grow, these pipelines tend to become increasingly complex, making them difficult to manage and extend. In a recent engagement, our team crafted reusable, extensible pipelines designed to improve consistency, readability, and developer productivity. In this blog, we share those insights to help you streamline your own pipelines.
By adopting the following three approaches, you can streamline pipeline management, lower operational overhead, and boost productivity.
- Modularize CI/CD Pipelines Using Reusable Workflows
- Convert Bicep Outputs to JSON for Better Integration
- Utilize Bicep Outputs in Post-Infrastructure Deployment Steps
Let’s dive into each of these approaches with detailed explanations and code examples.
1. Modularize CI/CD Pipelines Using Reusable Workflow Pattern
GitHub Actions supports reusable workflows, allowing one workflow to call another. By modularizing your CI/CD pipelines using this pattern, you enhance maintainability and readability across your workflows.
Benefits of Modularization
- Maintainability: Update individual components without affecting the entire pipeline.
- Reusability: Share common workflows across multiple projects or repositories.
- Clarity: Simplify complex pipelines by breaking them into smaller, manageable parts.
a. Modularized Continuous Integration (CI) Pipeline
Here’s how to create a modularized CI pipeline using GitHub Actions to validate code on pull requests or merges to protected branches.
name: Continuous Integration
on:
  workflow_call: # allow this workflow to be called from other workflows
  workflow_dispatch: # allow triggering manually
  pull_request: # on any PR
jobs:
  linting:
    uses: ./.github/workflows/linting.yaml
  test_dotnet:
    uses: ./.github/workflows/test_dotnet.yaml
  test_python:
    uses: ./.github/workflows/test_python.yaml
Each job references a separate reusable workflow:
- linting.yaml: Contains steps for code linting.
- test_dotnet.yaml: Contains steps for .NET unit tests.
- test_python.yaml: Contains steps for Python unit tests.
Example Linting Workflow (linting.yaml)
name: Linting
on:
  workflow_call:
jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run Linting
        run: |
          # Add linting commands here
          echo "Linting code..."
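The placeholder step above is where your linters go. As a sketch, for a mixed .NET/Python repository like the one implied by the test jobs, it might look like the following (the specific tool choices here are illustrative assumptions, not part of the original pipeline):

```yaml
- name: Run Linting
  run: |
    # Hypothetical linters for a mixed .NET/Python repository
    dotnet format --verify-no-changes  # fail if C# code is unformatted
    pip install ruff
    ruff check .                       # lint Python sources
```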
b. Modularized Continuous Deployment (CD) Pipeline
Similarly, create a modularized CD pipeline to deploy the application. This workflow is divided into several jobs, each responsible for a specific deployment task such as parsing environment information, deploying infrastructure, or pushing Docker images.
This modular approach ensures that each deployment step in the CD pipeline is self-contained and can be reused or triggered manually for a partial deployment if required.
name: Continuous Deployment
on:
  workflow_dispatch: # allow triggering manually
    inputs:
      # manually triggered runs have GITHUB_REF_NAME as a branch, not a tag so
      # we need to manually map to an environment and suffix
      environment_name:
        description: "GitHub deployment environment to pull variables and secrets from"
        required: true
        type: environment
  push:
    branches:
      - releases/**
jobs:
  ci:
    uses: ./.github/workflows/ci.yaml
  detect_env:
    needs: [ci]
    runs-on: ubuntu-latest
    outputs:
      environment_name: ${{ steps.parse_ref.outputs.environment_name }}
      build_version: ${{ steps.parse_ref.outputs.build_version }}
    steps:
      - id: parse_ref
        name: Parse Environment from Git Reference
        run: |
          build_version="${{ github.sha }}"
          if [ -n "${{ inputs.environment_name }}" ]; then
            environment_name="${{ inputs.environment_name }}"
            echo "Using specified environment: $environment_name"
          else
            environment_name="${{ github.ref_name }}"
            echo "Derived environment from branch: $environment_name"
          fi
          echo "environment_name=$environment_name" >> $GITHUB_OUTPUT
          echo "build_version=$build_version" >> $GITHUB_OUTPUT
  deploy_infra:
    needs: [detect_env]
    uses: ./.github/workflows/deploy_infra.yaml
    secrets: inherit
    with:
      environment_name: ${{ needs.detect_env.outputs.environment_name }}
      build_version: ${{ needs.detect_env.outputs.build_version }}
  docker_build_and_push:
    permissions:
      contents: read
      packages: write
      id-token: write
    # detect_env must be listed in needs for its outputs to be available below
    needs: [detect_env, deploy_infra]
    uses: ./.github/workflows/docker_build_and_push.yaml
    secrets: inherit
    with:
      environment_name: ${{ needs.detect_env.outputs.environment_name }}
      build_version: ${{ needs.detect_env.outputs.build_version }}
  deploy_app:
    needs: [detect_env, docker_build_and_push]
    uses: ./.github/workflows/deploy_app.yaml
    secrets: inherit
    with:
      environment_name: ${{ needs.detect_env.outputs.environment_name }}
Each job in the CD pipeline focuses on a specific task:
- ci: Runs the CI workflow to ensure code quality before deployment.
- detect_env: Determines the deployment environment.
- deploy_infra: Deploys infrastructure using Bicep templates.
- docker_build_and_push: Builds and pushes Docker images.
- deploy_app: Deploys the application to the target environment.
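One subtlety in detect_env: on a push, github.ref_name is the full branch name (for example releases/prod), so your GitHub deployment environments would need to be named to match. If you would rather name environments after the suffix alone, a small mapping step works; this sketch assumes the releases/ branch convention used above, and the branch name is hypothetical:

```shell
# Derive an environment name from a release branch (hypothetical branch name)
ref_name="releases/prod"
environment_name="${ref_name#releases/}"  # strip the "releases/" prefix
echo "environment_name=$environment_name"
```

The same parameter expansion can run inside the parse_ref step before writing to $GITHUB_OUTPUT.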
2. Convert Bicep Outputs to JSON for Better Integration
When deploying Azure infrastructure with Bicep, you often need to pass outputs (e.g., resource names or endpoint URIs) to subsequent steps or workflows that configure the newly deployed infrastructure.
Highlight: Storing Bicep Outputs for Better Integration
By capturing Bicep outputs and storing them as JSON artifacts, you can easily pass infrastructure details to other jobs or workflows. This method will streamline your CD pipeline.
The workflow below deploys infrastructure with Bicep and captures its outputs so that later jobs and workflows can consume them.
Infrastructure Deployment Workflow (deploy_infra.yaml)
name: Deploy infrastructure
on:
  workflow_dispatch: # allow triggering manually
    inputs:
      # manually triggered runs have GITHUB_REF_NAME as a branch, not a tag so
      # we need to manually map to an environment and suffix
      environment_name:
        description: "GitHub deployment environment to pull variables and secrets from"
        required: true
        type: environment
  push:
    branches:
      - releases/**
jobs:
  iac:
    runs-on: ubuntu-latest
    environment: ${{ inputs.environment_name }}
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
      - name: Login to Azure
        uses: azure/login@v2
        with:
          creds: ${{ secrets.AZURE_CREDENTIALS }}
      # Map in all variables that start with BICEP_ from the GitHub deployment environment as environment variables
      - name: Set env vars from vars context JSON
        id: ghenv-to-envvar
        env:
          # https://github.com/actions/runner/issues/1656#issuecomment-1030077729
          VARS_JSON: ${{ toJSON(vars) }}
        run: |
          echo "$VARS_JSON" | jq -r 'to_entries | .[] | select(.key | startswith("BICEP_")) | "\(.key)=\(.value)"' >> $GITHUB_ENV
      - id: bicep-deploy
        name: "Bicep deployment"
        uses: azure/arm-deploy@v1
        env:
          # Map in any other variables that aren't directly set in the GitHub environment
          BICEP_ENVIRONMENT_NAME: ${{ inputs.environment_name }}
          BICEP_RESOURCE_GROUP: ${{ vars.RESOURCE_GROUP }}
          BICEP_DOCKER_REGISTRY_USER: ${{ secrets.REPO_PAT_USER }}
          BICEP_DOCKER_REGISTRY_PASSWORD: ${{ secrets.REPO_PAT }}
        with:
          subscriptionId: ${{ vars.SUBSCRIPTION_ID }}
          resourceGroupName: ${{ vars.RESOURCE_GROUP }}
          template: deployment/iac/main.bicep
          parameters: deployment/iac/main.bicepparam
      - name: Convert Bicep outputs to env-vars.json
        env:
          ENV_VARS_JSON: ${{ toJSON(steps.bicep-deploy.outputs) }}
        run: |
          # non-string outputs are stringified by GitHub Actions - parse the values as JSON when possible
          jq -re 'to_entries | map({(.key): (.value | fromjson? // .)}) | add' <<< "$ENV_VARS_JSON" > env-vars.json
      - name: Upload env-vars
        uses: actions/upload-artifact@v4
        with:
          name: env-vars-json
          path: env-vars.json
          if-no-files-found: error
          overwrite: true
      # Make Bicep outputs available as INFRA_-prefixed env vars for later steps in this job
      # (matching the prefix used in deploy_app.yaml); in other workflows, download the artifact above first
      - name: JSON to variables
        shell: bash
        run: |
          cat env-vars.json | jq -r '[paths(scalars) as $path | {"key": $path | map(if type == "number" then tostring else . end) | join("_"), "value": getpath($path)}] | map("\(.key)=\(.value)") | .[]' | while read line; do
            echo "INFRA_${line}" >> $GITHUB_ENV
          done
      - name: Logout of Azure
        run: az logout
Explanation of Key Steps
- Set Environment Variables: Maps every GitHub environment variable whose name starts with BICEP_ into the job for use by the Bicep deployment.
- Deploy Bicep Templates: Uses the azure/arm-deploy@v1 action to deploy the infrastructure defined in the Bicep templates.
- Capture Bicep Outputs: Converts the outputs of the bicep-deploy step back into typed JSON (GitHub Actions stringifies non-string outputs) and writes them to env-vars.json.
- Upload Outputs as Artifact: Uploads env-vars.json as the env-vars-json artifact, making it accessible to downstream jobs and workflows.
- JSON to Variables: Flattens env-vars.json into prefixed environment variables for any later steps in this job.
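To see what the fromjson? // . filter in the "Convert Bicep outputs to env-vars.json" step does, consider a hypothetical outputs object in which GitHub Actions has stringified a number and a nested object:

```shell
# Hypothetical stringified outputs, as steps.bicep-deploy.outputs would expose them
ENV_VARS_JSON='{"appServiceName":"my-app","instanceCount":"3","tags":"{\"env\":\"dev\"}"}'
# Re-parse each value as JSON where possible; keep it as a string otherwise
printf '%s' "$ENV_VARS_JSON" | jq -re 'to_entries | map({(.key): (.value | fromjson? // .)}) | add' > env-vars.json
cat env-vars.json
```

In the resulting env-vars.json, instanceCount is a real number and tags a real object, while appServiceName stays a plain string because it is not valid JSON.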
3. Utilize Bicep Outputs in Post-Infrastructure Deployment Steps
Building upon the previous point, you can now use the Bicep outputs stored as JSON in subsequent deployment steps. This approach simplifies post-infrastructure deployments by providing easy access to resource details like names, connection strings, and endpoints.
Example: Deploying an Application Using Bicep Outputs
Below is an example of how to use the Bicep outputs in a deployment workflow to deploy an application to Azure App Service.
Application Deployment Workflow (deploy_app.yaml)
name: Deploy Application to Azure App Service
on:
  workflow_call:
    inputs:
      environment_name:
        required: true
        type: string
  workflow_dispatch:
    inputs:
      environment_name:
        description: "GitHub deployment environment to pull variables and secrets from"
        required: true
        type: environment
      deployment_run_id:
        description: "Run ID of the deployment to use infrastructure variables from"
        required: true
        type: number
jobs:
  deploy_app:
    runs-on: ubuntu-latest
    environment: ${{ inputs.environment_name }}
    steps:
      # Download the Bicep outputs artifact
      - name: Download env-vars.json
        uses: actions/download-artifact@v4
        with:
          name: env-vars-json
          github-token: ${{ github.token }}
          run-id: ${{ inputs.deployment_run_id || github.run_id }}
      # Convert JSON outputs to environment variables with a prefix
      - name: JSON to Variables
        uses: antifree/json-to-variables@v1.2.0
        with:
          filename: env-vars.json
          prefix: INFRA
      - name: Azure CLI Login
        uses: azure/login@v2
        with:
          creds: ${{ secrets.AZURE_CREDENTIALS }}
      - name: Deploy Application to App Service
        run: |
          # Use the INFRA_* environment variables to get resource names
          APP_SERVICE_NAME="$INFRA_names_appService"
          RESOURCE_GROUP="$INFRA_resourceGroup"
          echo "Deploying to App Service: $APP_SERVICE_NAME in Resource Group: $RESOURCE_GROUP"
          # Deploy application code to the App Service
          az webapp deploy \
            --name "$APP_SERVICE_NAME" \
            --resource-group "$RESOURCE_GROUP" \
            --src-path ./app \
            --type zip
      - name: Azure CLI Logout
        run: az logout
Explanation of Key Steps
- Download env-vars.json: Downloads the env-vars.json artifact containing the Bicep outputs from the infrastructure deployment.
- JSON to Variables: Converts the JSON file into environment variables prefixed with INFRA_ for easy access.
- Azure CLI Login: Authenticates with Azure using provided credentials.
- Deploy Application to App Service:
- Retrieves the App Service name and resource group from the environment variables set from Bicep outputs.
- Uses the Azure CLI to deploy the application code to the specified App Service.
- Azure CLI Logout: Logs out of the Azure session.
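The json-to-variables action flattens nested JSON by joining path segments with underscores and applying the prefix, which is how names.appService becomes $INFRA_names_appService above. The same transformation can be reproduced with jq; the file contents here are hypothetical:

```shell
# Hypothetical env-vars.json as produced by the infrastructure deployment
cat > env-vars.json <<'EOF'
{"names":{"appService":"my-app-dev"},"resourceGroup":"rg-dev"}
EOF
# Flatten nested keys with "_" and add the INFRA_ prefix
jq -r '[paths(scalars) as $path | {key: $path | map(tostring) | join("_"), value: getpath($path)}] | map("INFRA_\(.key)=\(.value)") | .[]' env-vars.json
```

This prints INFRA_names_appService=my-app-dev and INFRA_resourceGroup=rg-dev, one assignment per line, ready to append to $GITHUB_ENV.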
Conclusion
Modularizing pipelines with reusable workflows and converting Bicep outputs to JSON enables you to create developer-friendly CI/CD pipelines on Azure. Utilizing these outputs in subsequent deployment steps not only simplifies the process but also enhances the overall efficiency of your pipeline.
Implementing these strategies enables your development team to:
- Focus on Code Quality: Spend more time improving the application rather than managing complex pipelines.
- Increase Productivity: Reduce the overhead associated with pipeline maintenance.
- Promote Consistency: Ensure consistent deployment processes across different environments and projects.
Start adopting these practices today to streamline your CI/CD pipelines and accelerate your development workflow.
*The featured image was generated using ChatGPT’s DALL-E tool.