{"id":15452,"date":"2024-06-07T09:21:33","date_gmt":"2024-06-07T16:21:33","guid":{"rendered":"https:\/\/devblogs.microsoft.com\/ise\/?p=15452"},"modified":"2024-07-18T11:57:44","modified_gmt":"2024-07-18T18:57:44","slug":"7-essential-tips-accelerate-prompt-flow-development","status":"publish","type":"post","link":"https:\/\/devblogs.microsoft.com\/ise\/7-essential-tips-accelerate-prompt-flow-development\/","title":{"rendered":"7 Essential Tips: Accelerate Prompt Flow Development"},"content":{"rendered":"<h2>Introduction<\/h2>\n<p>In our recent engagement, we developed a Large Language Model (LLM) service using Azure Prompt Flow. Our objective was not only to fulfill the service requirements but also to enhance the developer experience. This article outlines the seven strategies we incorporated to achieve this goal.<\/p>\n<h2>1. Maintain Dev Container<\/h2>\n<p>Dev Containers offer a solution to the &#8220;it works on my machine&#8221; problem by creating a consistent and replicable development environment across different machines. This containerization of the development environment facilitates easy sharing and replication within teams, reducing bugs and ensuring smoother deployments. 
Visual Studio Code enhances this experience with Dev Container support, allowing environments to be defined in a <code>devcontainer.json<\/code> file and run within a Docker container.<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/ise\/wp-content\/uploads\/sites\/55\/2024\/06\/maintain-devcontainer.png\" alt=\"Alt Text\" \/><\/p>\n<p><em>Image generated by OpenAI&#8217;s DALL-E model.<\/em><\/p>\n<p>The <code>devcontainer.json<\/code> file can install the following tools to streamline your development:<\/p>\n<ul>\n<li>\n<p>VS Code Extensions for enhanced productivity:<\/p>\n<ul>\n<li><strong>prompt-flow.prompt-flow:<\/strong> Integration with Prompt Flow workflows.<\/li>\n<li><strong>ms-python.python:<\/strong> Support for Python development.<\/li>\n<li><strong>GitHub.copilot:<\/strong> AI-powered coding assistance.<\/li>\n<\/ul>\n<\/li>\n<li>\n<p>Features to equip the container with necessary tools:<\/p>\n<ul>\n<li><strong>Azure CLI with ML extension:<\/strong> For interacting with Azure services.<\/li>\n<li><strong>Git:<\/strong> Essential version control, provided by the OS.<\/li>\n<li><strong>GitHub CLI:<\/strong> For GitHub operations directly from the terminal.<\/li>\n<li><strong>Node:<\/strong> For npm package installation.<\/li>\n<li><strong>Conda:<\/strong> A package and environment manager for any language.<\/li>\n<\/ul>\n<\/li>\n<li>\n<p>Shell Scripts for environment preparation and connection setup:<\/p>\n<ul>\n<li><strong>post-create.sh:<\/strong> Initializes the conda environment and installs Python packages before the container starts.<\/li>\n<li><strong>post-start.sh:<\/strong> Activates the conda environment and establishes local connections for testing after the container starts.<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n<p>Here is an example of a <code>devcontainer.json<\/code> file:<\/p>\n<pre><code class=\"language-json\">{\r\n  \"name\": \"Ubuntu\",\r\n  \"image\": \"mcr.microsoft.com\/devcontainers\/base:ubuntu\",\r\n  
\"hostRequirements\": {\r\n    \"cpus\": 2\r\n  },\r\n  \"mounts\": [\r\n    \"source=${localWorkspaceFolder},target=\/workspace,type=bind,consistency=cached\"\r\n  ],\r\n  \"containerEnv\": {\r\n    \"CONDA_ENV_NAME\": \"conda_env\"\r\n  },\r\n  \"customizations\": {\r\n    \"vscode\": {\r\n      \"extensions\": [\r\n        \"prompt-flow.prompt-flow\",\r\n        \"ms-python.python\",\r\n        \"GitHub.copilot\"\r\n      ],\r\n      \"settings\": {\r\n        \"python.terminal.activateEnvironment\": true,\r\n        \"python.condaPath\": \"\/opt\/conda\/bin\/conda\",\r\n        \"python.defaultInterpreterPath\": \"\/opt\/conda\/envs\/${containerEnv:CONDA_ENV_NAME}\",\r\n        \"python.editor.formatOnSave\": true,\r\n        \"python.editor.codeActionsOnSave\": {\r\n          \"source.organizeImports\": true\r\n        },\r\n        \"notebook.formatOnSave.enabled\": true,\r\n        \"notebook.codeActionsOnSave\": {\r\n          \"source.organizeImports\": true\r\n        }\r\n      }\r\n    }\r\n  },\r\n  \"features\": {\r\n    \"ghcr.io\/devcontainers\/features\/azure-cli\": {\r\n      \"version\": \"latest\",\r\n      \"extensions\": \"ml\"\r\n    },\r\n    \"ghcr.io\/devcontainers\/features\/git\": \"os-provided\",\r\n    \"ghcr.io\/devcontainers\/features\/github-cli\": \"latest\",\r\n    \"ghcr.io\/devcontainers\/features\/node\": \"latest\",\r\n    \"ghcr.io\/devcontainers\/features\/conda\": \"latest\"\r\n  },\r\n  \"onCreateCommand\": \"\/bin\/bash .devcontainer\/post-create.sh\",\r\n  \"postStartCommand\": \"\/bin\/bash .devcontainer\/post-start.sh\"\r\n}<\/code><\/pre>\n<p><em>Learn more about how to setup Dev Container by following <a href=\"https:\/\/code.visualstudio.com\/docs\/devcontainers\/create-dev-container\">official documentation<\/a>.<\/em><\/p>\n<h2>2. Leverage Azure Key Vault<\/h2>\n<p>Automating the retrieval of secrets like API keys and base URLs into new development environments through keyvault streamlines the setup processes. 
This method eliminates the repetitive manual task of entering sensitive data into a .env file.<\/p>\n<p>The <code>common.sh<\/code> script showcases an effective application of this approach, providing reusable functions for Key Vault secret extraction and environment variable setup. These capabilities enable smooth integration of Prompt Flow with services such as OpenAI or Azure AI Search, enhancing the efficiency and security of development environment configurations.<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/ise\/wp-content\/uploads\/sites\/55\/2024\/06\/leverage-keyvault.png\" alt=\"Alt Text\" \/><\/p>\n<p><em>Image generated by OpenAI&#8217;s DALL-E model.<\/em><\/p>\n<p>Here&#8217;s a closer look at how the <code>common.sh<\/code> script can be utilized to streamline development environment setup:<\/p>\n<pre><code class=\"language-shell\">\r\n# Obtains a secret from keyvault using the provided secret name\r\nget_secret() {\r\n  secret_name=\"$1\"\r\n  echo \"Retrieving $secret_name from key vault...\" &gt;&amp;2\r\n  secret_value=$(az keyvault secret show --subscription \"$AZURE_SUBSCRIPTION_ID\" -n \"$secret_name\" --vault-name \"$KEYVAULT_NAME\" --query value -o tsv)\r\n\r\n  if [ $? 
-ne 0 ]; then\r\n    echo \"Failed to retrieve $secret_name from keyvault\" &gt;&amp;2\r\n    exit 1\r\n  fi\r\n\r\n  echo $secret_value\r\n}\r\n\r\n# Populates the configuration entries required to create PromptFlow connections\r\nsetup_pf_connection_variables() {\r\n  # Use the dev keyvault unless otherwise specified\r\n  AZURE_SUBSCRIPTION_ID=\"${AZURE_SUBSCRIPTION_ID:-\"&lt;subscription_id&gt;\"}\"\r\n  KEYVAULT_NAME=\"${KEYVAULT_NAME:-\"&lt;keyvault_name&gt;\"}\"\r\n  # Retrieve from key vault via get_secret() function, if not exported locally\r\n  AZURE_SEARCH_KEY=\"${AZURE_SEARCH_KEY:-$(get_secret \"azure-search-key\")}\"\r\n  AZURE_SEARCH_ENDPOINT=\"${AZURE_SEARCH_ENDPOINT:-$(get_secret \"azure-search-endpoint\")}\"\r\n  AZURE_OPENAI_API_KEY=\"${AZURE_OPENAI_API_KEY:-$(get_secret \"openai-api-key\")}\"\r\n  AZURE_OPENAI_API_BASE=\"${AZURE_OPENAI_API_BASE:-$(get_secret \"openai-endpoint\")}\"\r\n}<\/code><\/pre>\n<p>You can access the full <code>common.sh<\/code> script below.<\/p>\n<pre><code class=\"language-shell\">#!\/bin\/bash\r\n# common.sh: common functions and helpers for other scripts\r\nset -euE\r\n\r\nfunction error_trap {\r\n  echo\r\n  echo \"*** script failed to execute to completion ***\"\r\n  echo \"An error occurred on line $1 of the script.\"\r\n  echo \"Exit status of the last command was $2\"\r\n}\r\n\r\ntrap 'error_trap $LINENO $?' ERR\r\n\r\n# Load environment variables from specified file (or .env), if exists\r\nload_env_file() {\r\n  env_file=\"${1:-.env}\"\r\n  if [ -f \"$env_file\" ]; then\r\n    export $(cat \"$env_file\" | sed 's\/#.*\/\/g' | xargs)\r\n  fi\r\n}\r\n\r\n# Obtains a secret from keyvault using the provided secret name\r\nget_secret() {\r\n  secret_name=\"$1\"\r\n  echo \"Retrieving $secret_name from key vault...\" &gt;&amp;2\r\n  secret_value=$(az keyvault secret show --subscription \"$AZURE_SUBSCRIPTION_ID\" -n \"$secret_name\" --vault-name \"$KEYVAULT_NAME\" --query value -o tsv)\r\n\r\n  if [ $? 
-ne 0 ]; then\r\n    echo \"Failed to retrieve $secret_name from keyvault\" &gt;&amp;2\r\n    exit 1\r\n  fi\r\n\r\n  echo $secret_value\r\n}\r\n\r\n# Populates the configuration entries required to create PromptFlow connections\r\nsetup_pf_connection_variables() {\r\n  # Use the dev keyvault unless otherwise specified\r\n  AZURE_SUBSCRIPTION_ID=\"${AZURE_SUBSCRIPTION_ID:-\"&lt;subscription_id&gt;\"}\"\r\n  KEYVAULT_NAME=\"${KEYVAULT_NAME:-\"&lt;keyvault_name&gt;\"}\"\r\n  # Retrieve from key vault via get_secret() function, if not exported locally\r\n  AZURE_SEARCH_KEY=\"${AZURE_SEARCH_KEY:-$(get_secret \"azure-search-key\")}\"\r\n  AZURE_SEARCH_ENDPOINT=\"${AZURE_SEARCH_ENDPOINT:-$(get_secret \"azure-search-endpoint\")}\"\r\n  AZURE_OPENAI_API_KEY=\"${AZURE_OPENAI_API_KEY:-$(get_secret \"openai-api-key\")}\"\r\n  AZURE_OPENAI_API_BASE=\"${AZURE_OPENAI_API_BASE:-$(get_secret \"openai-endpoint\")}\"\r\n}<\/code><\/pre>\n<p><em>Discover more Azure CLI Key Vault commands by following <a href=\"https:\/\/learn.microsoft.com\/en-us\/cli\/azure\/keyvault?view=azure-cli-latest\">official documentation<\/a>.<\/em><\/p>\n<h2>3. Automate Local Developer Setup<\/h2>\n<p>When working with Prompt Flow locally, establishing a connection to services such as OpenAI or Azure AI Search is necessary. 
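The `${VAR:-$(get_secret …)}` pattern in `setup_pf_connection_variables` above only reaches out to Key Vault when a variable is not already exported, so locally exported values always win. Below is a self-contained sketch of that behavior in which `get_secret` is a hypothetical stub (not the real `az keyvault secret show` call), so it runs without Azure access:

```shell
#!/bin/bash
set -eu

# Hypothetical stub standing in for the real Key Vault lookup
# (az keyvault secret show ...), so the pattern runs offline.
get_secret() {
  echo "from-keyvault-$1"
}

# Variable not exported: the ${VAR:-...} default calls get_secret
unset AZURE_OPENAI_API_KEY || true
first="${AZURE_OPENAI_API_KEY:-$(get_secret "openai-api-key")}"

# Variable already exported: the fallback never runs, Key Vault is not contacted
AZURE_OPENAI_API_KEY="exported-locally"
second="${AZURE_OPENAI_API_KEY:-$(get_secret "openai-api-key")}"

echo "$first"
echo "$second"
```

Because the exported value takes precedence, a developer can override an individual secret in their shell or `.env` file while everything else still flows from Key Vault.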
This can be done manually via the Prompt Flow VS Code extension, through CLI commands, or more efficiently with shell scripts.<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/ise\/wp-content\/uploads\/sites\/55\/2024\/06\/automate-local-env-setup.png\" alt=\"Alt Text\" \/><\/p>\n<p><em>Image generated by OpenAI&#8217;s DALL-E model.<\/em><\/p>\n<p>Here is the script <code>local-create-connection.sh<\/code> that shows how you can rapidly create local connections to OpenAI and Azure AI Search.<\/p>\n<pre><code class=\"language-shell\">#!\/bin\/bash\r\nset -euE\r\n\r\n# Set the SCRIPT_PATH variable to the directory where the current script is located.\r\n# \"$0\" is a reference to the current script, \"readlink -f\" resolves its absolute file path,\r\n# and \"dirname\" extracts the directory part of this path.\r\nSCRIPT_PATH=\"$(dirname \"$(readlink -f \"$0\")\")\"\r\n\r\n# Source the common.sh script located in the same directory as this script.\r\n# Sourcing a script allows us to use its functions and variables in the current script.\r\n. 
$SCRIPT_PATH\/common.sh\r\n\r\n# Load any declared variables from .env file at repo root\r\nload_env_file \"$SCRIPT_PATH\/..\/.env\"\r\n\r\n# Setup the variables used for PromptFlow connections below\r\n# Call to setup_pf_connection_variables function in common.sh\r\nsetup_pf_connection_variables\r\n\r\n# create connection to openai\r\npf connection create -f \"${SCRIPT_PATH}\/..\/src\/connections\/azure_openai.template.yaml\" \\\r\n  --set api_key=\"${AZURE_OPENAI_API_KEY}\" \\\r\n  --set api_base=\"${AZURE_OPENAI_API_BASE}\"\r\n\r\n# create connection to azure search\r\npf connection create -f \"${SCRIPT_PATH}\/..\/src\/connections\/azure_ai_search.template.yaml\" \\\r\n  --set api_key=\"${AZURE_SEARCH_KEY}\" \\\r\n  --set api_base=\"${AZURE_SEARCH_ENDPOINT}\"<\/code><\/pre>\n<p><em>Learn more about managing local <code>connections<\/code> from <a href=\"https:\/\/microsoft.github.io\/promptflow\/how-to-guides\/manage-connections.html\">official prompt flow documentation<\/a>.<\/em><\/p>\n<h2>4. 
Automate Cloud Infrastructure Setup<\/h2>\n<p>Below, the <code>azure-create-connection.sh<\/code> script illustrates how to rapidly set up connections to OpenAI and Azure AI Search on a cloud instance of Prompt Flow.<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/ise\/wp-content\/uploads\/sites\/55\/2024\/06\/automate-cloud-infrastructure.png\" alt=\"Alt Text\" \/><\/p>\n<p><em>Image generated by OpenAI&#8217;s DALL-E model.<\/em><\/p>\n<p>Dive deep into <code>azure-create-connection.sh<\/code>.<\/p>\n<pre><code class=\"language-shell\">\r\n#!\/bin\/bash\r\nset -euE\r\n\r\n# Set the SCRIPT_PATH variable to the directory where the current script is located.\r\n# \"$0\" is a reference to the current script, \"readlink -f\" resolves its absolute file path,\r\n# and \"dirname\" extracts the directory part of this path.\r\nSCRIPT_PATH=\"$(dirname \"$(readlink -f \"$0\")\")\"\r\n\r\n# Source the common.sh script located in the same directory as this script.\r\n# Sourcing a script allows us to use its functions and variables in the current script.\r\n. 
$SCRIPT_PATH\/common.sh\r\n\r\n# Load any declared variables from .env file at repo root\r\nload_env_file \"$SCRIPT_PATH\/..\/.env\"\r\n\r\n# Setup the variables used for PromptFlow connections below\r\n# Call to setup_pf_connection_variables function in common.sh\r\nsetup_pf_connection_variables\r\n\r\n# Get Azure access token for POST requests\r\nACCESS_TOKEN=\"$(az account get-access-token --query accessToken -o tsv)\"\r\nurl_base_create_connection=\"https:\/\/ml.azure.com\/api\/${LOCATION}\/flow\/api\/subscriptions\/${AZURE_SUBSCRIPTION_ID}\/resourceGroups\/${AZURE_RESOURCE_GROUP_NAME}\/providers\/Microsoft.MachineLearningServices\/workspaces\/${WORKSPACE_NAME}\/Connections\"\r\n\r\n# Create Azure OpenAI connection\r\nurl_create_azure_open_ai_connection=\"${url_base_create_connection}\/${AZURE_OPENAI_CONNECTION_NAME}?asyncCall=true\"\r\n\r\necho \"Creating Azure OpenAI connection with name: $AZURE_OPENAI_CONNECTION_NAME\"\r\n\r\ncurl -s --request POST --fail \\\r\n  --url \"$url_create_azure_open_ai_connection\" \\\r\n  --header \"Authorization: Bearer $ACCESS_TOKEN\" \\\r\n  --header 'Content-Type: application\/json' \\\r\n  -d @- &lt;&lt;EOF\r\n{\r\n  \"connectionType\": \"AzureOpenAI\",\r\n  \"configs\": {\r\n    \"api_key\": \"$AZURE_OPENAI_API_KEY\",\r\n    \"api_base\": \"$AZURE_OPENAI_API_BASE\",\r\n    \"api_type\": \"azure\",\r\n    \"api_version\": \"$API_VERSION\",\r\n    \"resource_id\": \"\/subscriptions\/${AZURE_SUBSCRIPTION_ID}\/resourceGroups\/${AZURE_RESOURCE_GROUP_NAME}\/providers\/Microsoft.CognitiveServices\/accounts\/${AZURE_OPEN_AI_RESOURCE_NAME}\"\r\n  }\r\n}\r\nEOF\r\necho -e \"\\n\"\r\n\r\n# Create Azure AI Search connection\r\nurl_create_azure_ai_search_connection=\"${url_base_create_connection}\/${AZURE_AI_SEARCH_CONNECTION_NAME}?asyncCall=true\"\r\n\r\necho \"Creating Azure AI Search connection with name: $AZURE_AI_SEARCH_CONNECTION_NAME\"\r\n\r\ncurl -s --request POST --fail \\\r\n  --url \"$url_create_azure_ai_search_connection\" 
\\\r\n  --header \"Authorization: Bearer $ACCESS_TOKEN\" \\\r\n  --header 'Content-Type: application\/json' \\\r\n  -d @- &lt;&lt;EOF\r\n{\r\n  \"connectionType\": \"CognitiveSearch\",\r\n  \"configs\": {\r\n    \"api_key\": \"$AZURE_SEARCH_KEY\",\r\n    \"api_base\": \"$AZURE_SEARCH_ENDPOINT\",\r\n    \"api_version\": \"2023-07-01-Preview\"\r\n  }\r\n}\r\nEOF\r\necho -e \"\\n\"\r\n\r\necho \"Done!\"<\/code><\/pre>\n<p><em>Note: Keep in mind that future updates may introduce new CLI commands, eliminating the need for direct curl requests. For the latest pfazure commands and how to interact with Prompt Flow via CLI, consult the <a href=\"https:\/\/microsoft.github.io\/promptflow\/reference\/pfazure-command-reference.html?highlight=pfazure+connection#pfazure\">official documentation<\/a>.<\/em><\/p>\n<h2>5. Combine Automation Scripts and Dev Container<\/h2>\n<p>After developing scripts from <a href=\"#2-leverage-azure-key-vault\">steps 2<\/a> and <a href=\"#3-automate-local-developer-setup\">step 3<\/a>, incorporate them into your Dev Container to streamline development environment setup. This procedure kicks off with the creation of a conda environment and an Azure login, followed by the execution of a script (<code>local-create-connection.sh<\/code>) from <a href=\"#3-automate-local-developer-setup\">step 3<\/a> that creates connections to OpenAI and Azure AI search through the retrieval of API keys and base URLs from the keyvault, utilizing the Azure session for authentication. 
This setup paves the way for local testing within your Prompt Flow VS Code extension, promoting an efficient and streamlined development process.<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/ise\/wp-content\/uploads\/sites\/55\/2024\/06\/combine-devcontainer-automation-scripts.png\" alt=\"Alt Text\" \/><\/p>\n<p><em>Image generated by OpenAI&#8217;s DALL-E model.<\/em><\/p>\n<p>Below is the script (<code>post-start.sh<\/code>) that encapsulates this workflow, ensuring your Dev Container is primed for productive development right from the start:<\/p>\n<pre><code class=\"language-shell\">#!\/usr\/bin\/env bash\r\nset -eux\r\n\r\n# Check if already initialized\r\nINITIALIZED_MARKER=~\/.devcontainer-initialized\r\n[ -f \"$INITIALIZED_MARKER\" ] &amp;&amp; exit 0\r\n\r\n# Don't activate 'base' environment\r\n# VS Code activates its selected conda environment automatically\r\nconda config --set auto_activate_base false\r\n\r\n# Setup PromptFlow connections\r\naz login\r\nconda run -n \"$CONDA_ENV_NAME\" --live-stream .\/local-create-connection.sh\r\n\r\n# Mark this container as initialized\r\ntouch \"$INITIALIZED_MARKER\"<\/code><\/pre>\n<h2>6. Automate E2E Flow Run Locally<\/h2>\n<p>Execution of local flow runs, particularly when dealing with a sequence of standard and multiple evaluation flows, can become cumbersome and inefficient if done manually. The complexity increases when each evaluation flow depends on the output of the preceding flow. The script presented here addresses this challenge by automating the process, where the output from the <code>standard_flow_01<\/code> is passed as input to the <code>evaluation_flow_01<\/code>, and similarly, the output of the first evaluation flow serves as input for the <code>evaluation_flow_02<\/code>. 
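The `.current_experiment.json` file written at the end of the local run script is what lets follow-up tooling locate the latest run names. Here is a minimal, self-contained sketch of that write plus a simple `sed`-based read-back; the run name and timestamp are placeholders, and a JSON-aware tool such as `jq` would be more robust than `sed` in practice:

```shell
#!/bin/bash
set -eu

# Placeholder run name following the story-id_experiment_timestamp scheme
standard_flow_run_name="STORY-ID-NNNNN_local-run_20240607091500-standard_flow"

# Write the experiment file the same way the run script does (tee + heredoc)
experiment_file="$(mktemp)"
tee "$experiment_file" > /dev/null <<EOF
{
  "name": "local-run",
  "flows": {
    "standard_flow_01": "${standard_flow_run_name}"
  }
}
EOF

# Pull the standard flow's run name back out for a follow-up script
read_back="$(sed -n 's/.*"standard_flow_01": "\(.*\)".*/\1/p' "$experiment_file")"
echo "$read_back"
rm -f "$experiment_file"
```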
This automation significantly speeds up the execution of these interconnected flows.<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/ise\/wp-content\/uploads\/sites\/55\/2024\/06\/automate-e2e-flow-run-locally.png\" alt=\"Alt Text\" \/><\/p>\n<p><em>Image generated by OpenAI&#8217;s DALL-E model.<\/em><\/p>\n<p>The script (<code>local-run-with-evaluations.sh<\/code>) below outlines the entire automated process, starting from setting up the environment and loading variables, to executing the standard and evaluation flows in sequence, and finally, logging the run names for reference. This method not only saves time but also reduces manual errors, ensuring a smooth and efficient end-to-end flow execution.<\/p>\n<pre><code class=\"language-shell\">#!\/bin\/bash\r\n\r\nset -euE\r\n\r\n# Set the SCRIPT_PATH variable to the directory where the current script is located.\r\n# \"$0\" is a reference to the current script, \"readlink -f\" resolves its absolute file path,\r\n# and \"dirname\" extracts the directory part of this path.\r\nSCRIPT_PATH=\"$(dirname \"$(readlink -f \"$0\")\")\"\r\n\r\n# Source the common.sh script located in the same directory as this script.\r\n# Sourcing a script allows us to use its functions and variables in the current script.\r\n. 
$SCRIPT_PATH\/common.sh\r\n\r\n# Load any declared variables from .env file at repo root\r\nload_env_file \"$SCRIPT_PATH\/..\/.env\"\r\n\r\n# Default values\r\nexperiment_name=\"local-run\"\r\nstory_id=\"STORY-ID-NNNNN\"\r\n\r\n# Flow locations\r\nstandard_flow_01=\"${SCRIPT_PATH}\/..\/src\/flows\/standard_flow_01\"\r\nevaluation_flow_01=\"${SCRIPT_PATH}\/..\/src\/flows\/evaluation_flow_01\"\r\nevaluation_flow_02=\"${SCRIPT_PATH}\/..\/src\/flows\/evaluation_flow_02\"\r\n\r\n# Path to the dataset, relative to the flow's run.yml\r\ndata_path=\"..\/..\/..\/..\/data\/tests\/app-reference-dev-prompt.jsonl\"\r\n\r\n# If experiment_name is not provided as an argument, use the current git branch\r\nif [ -z \"$experiment_name\" ]; then\r\n  experiment_name=$(git rev-parse --abbrev-ref HEAD)\r\nfi\r\n\r\n# Run names\r\ncurrent_time=\"$(date +%Y%m%d%H%M%S)\"\r\nrun_name_prefix=\"${story_id}_${experiment_name}_${current_time}\"\r\n\r\nstandard_flow_run_name=\"${run_name_prefix}-standard_flow\"\r\nevaluation_flow_1_run_name=\"${run_name_prefix}-evaluation_flow_01\"\r\nevaluation_flow_2_run_name=\"${run_name_prefix}-evaluation_flow_02\"\r\n\r\n# Step 1: Create a run for Standard Flow\r\necho \"Step 1: Standard Flow - $standard_flow_run_name\"\r\npf run create --stream -n \"$standard_flow_run_name\" \\\r\n  -f \"${standard_flow_01}\/run.yml\" \\\r\n  --data \"$data_path\"\r\n\r\n# Step 2: Create a run for Evaluation Flow 1\r\n# The --run flag references the standard flow run from step 1\r\necho \"Step 2: Evaluation Flow 1 - $evaluation_flow_1_run_name\"\r\npf run create --stream -n \"$evaluation_flow_1_run_name\" \\\r\n  -f \"${evaluation_flow_01}\/run.yml\" \\\r\n  --data \"$data_path\" \\\r\n  --run \"$standard_flow_run_name\"\r\n\r\n# Step 3: Create a run for Evaluation Flow 2\r\n# The --run flag references the evaluation flow run from step 2\r\necho \"Step 3: Evaluation Flow 2 - $evaluation_flow_2_run_name\"\r\npf run create --stream -n \"$evaluation_flow_2_run_name\" \\\r\n  -f \"${evaluation_flow_02}\/run.yml\" \\\r\n  --data \"$data_path\" \\\r\n   
--run \"$evaluation_flow_1_run_name\"\r\n\r\n# Write run names to current experiment file\r\ntee \"${SCRIPT_PATH}\/..\/src\/.current_experiment.json\" &lt;&lt;EOF\r\n{\r\n  \"name\": \"${experiment_name}\",\r\n  \"flows\": {\r\n    \"standard_flow_01\": \"${standard_flow_run_name}\",\r\n    \"evaluation_flow_01\": \"${evaluation_flow_1_run_name}\",\r\n    \"evaluation_flow_02\": \"${evaluation_flow_2_run_name}\"\r\n  }\r\n}\r\nEOF\r\n\r\necho \"Done\"<\/code><\/pre>\n<p>Learn more about running flows <code>locally<\/code> by following the <a href=\"https:\/\/microsoft.github.io\/promptflow\/reference\/pf-command-reference.html#pf-run\">official Prompt Flow documentation<\/a>.<\/p>\n<h2>7. Automate E2E Flow Run on Azure<\/h2>\n<p>This step extends the automation to Azure, allowing you to sequence flows similarly to <a href=\"#6-automate-e2e-flow-run-locally\">step 6<\/a>, but within the Azure environment. The script below mirrors the local automation, adapting it for Azure&#8217;s infrastructure, ensuring a seamless transition of your workflows to the cloud.<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/ise\/wp-content\/uploads\/sites\/55\/2024\/06\/automate-e2e-flow-run-azure.png\" alt=\"Alt Text\" \/><\/p>\n<p><em>Image generated by OpenAI&#8217;s DALL-E model.<\/em><\/p>\n<p>Dive deep into <code>azure-run-flows-with-evaluations.sh<\/code>.<\/p>\n<pre><code class=\"language-shell\">#!\/bin\/bash\r\nset -euE\r\n\r\n# Set the SCRIPT_PATH variable to the directory where the current script is located.\r\n# \"$0\" is a reference to the current script, \"readlink -f\" resolves its absolute file path,\r\n# and \"dirname\" extracts the directory part of this path.\r\nSCRIPT_PATH=\"$(dirname \"$(readlink -f \"$0\")\")\"\r\n\r\n# Source the common.sh script located in the same directory as this script.\r\n# Sourcing a script allows us to use its functions and variables in the current script.\r\n. 
$SCRIPT_PATH\/common.sh\r\n\r\n# Load any declared variables from .env file at repo root\r\nload_env_file \"$SCRIPT_PATH\/..\/.env\"\r\n\r\n# Default values\r\nexperiment_name=\"azure-run\"\r\nstory_id=\"STORY-ID-NNNNN\"\r\n\r\n# Flow locations\r\nstandard_flow_01=\"${SCRIPT_PATH}\/..\/src\/flows\/standard_flow_01\"\r\nevaluation_flow_01=\"${SCRIPT_PATH}\/..\/src\/flows\/evaluation_flow_01\"\r\nevaluation_flow_02=\"${SCRIPT_PATH}\/..\/src\/flows\/evaluation_flow_02\"\r\n\r\n# Path to the dataset, relative to the flow's run.yml\r\ndata_path=\"..\/..\/..\/..\/data\/tests\/app-reference-dev-prompt.jsonl\"\r\n\r\n# If experiment_name is not provided as an argument, use the current git branch\r\nif [ -z \"$experiment_name\" ]; then\r\n  experiment_name=$(git rev-parse --abbrev-ref HEAD)\r\nfi\r\n\r\n# Run names\r\ncurrent_time=\"$(date +%Y%m%d%H%M%S)\"\r\nrun_name_prefix=\"${story_id}_${experiment_name}_${current_time}\"\r\n\r\nstandard_flow_run_name=\"${run_name_prefix}-standard_flow\"\r\nevaluation_flow_1_run_name=\"${run_name_prefix}-evaluation_flow_01\"\r\nevaluation_flow_2_run_name=\"${run_name_prefix}-evaluation_flow_02\"\r\n\r\n# Step 1: Create a run for Standard Flow\r\necho \"Step 1: Standard Flow - $standard_flow_run_name\"\r\npfazure run create --stream -n \"$standard_flow_run_name\" \\\r\n  --subscription \"$AZURE_SUBSCRIPTION_ID\" \\\r\n  -g \"$AZURE_RESOURCE_GROUP_NAME\" \\\r\n  -w \"$WORKSPACE_NAME\" \\\r\n  -f \"${standard_flow_01}\/run.yml\" \\\r\n  --data \"$data_path\"\r\n\r\n# Step 2: Create a run for Evaluation Flow 01\r\n# The --run flag references the standard flow run from step 1\r\necho \"Step 2: Evaluation Flow 01 - $evaluation_flow_1_run_name\"\r\npfazure run create --stream -n \"$evaluation_flow_1_run_name\" \\\r\n  --subscription \"$AZURE_SUBSCRIPTION_ID\" \\\r\n  -g \"$AZURE_RESOURCE_GROUP_NAME\" \\\r\n  -w \"$WORKSPACE_NAME\" \\\r\n  -f \"${evaluation_flow_01}\/run.yml\" \\\r\n  --data \"$data_path\" \\\r\n  --run 
\"$standard_flow_run_name\"\r\n\r\n# Step 3: Create a run for Evaluation Flow 02\r\necho \"Step 3: Evaluation Flow 02 - $evaluation_flow_2_run_name\"\r\npfazure run create --stream -n \"$evaluation_flow_2_run_name\" \\\r\n  --subscription \"$AZURE_SUBSCRIPTION_ID\" \\\r\n  -g \"$AZURE_RESOURCE_GROUP_NAME\" \\\r\n  -w \"$WORKSPACE_NAME\" \\\r\n  -f \"${evaluation_flow_02}\/run.yml\" \\\r\n  --data \"$data_path\" \\\r\n  # Reference to the evaluation flow from step 2\r\n  --run \"$evaluation_flow_1_run_name\"\r\n\r\n# Write run names to current experiment file\r\ntee \"${SCRIPT_PATH}\/..\/src\/.current_experiment.json\" &lt;&lt;EOF\r\n{\r\n  \"name\": \"${experiment_name}\",\r\n  \"flows\": {\r\n    \"standard_flow_01\": \"${standard_flow_run_name}\",\r\n    \"evaluation_flow_01\": \"${evaluation_flow_1_run_name}\",\r\n    \"evaluation_flow_02\": \"${evaluation_flow_2_run_name}\"\r\n  }\r\n}\r\nEOF\r\n\r\necho \"Done\"<\/code><\/pre>\n<p><em>Learn more about running flows on <code>Azure<\/code> by following <a href=\"https:\/\/microsoft.github.io\/promptflow\/reference\/pf-command-reference.html#pf-run\">official Prompt Flow documentation<\/a>.<\/em><\/p>\n<h2>Conclusion<\/h2>\n<p>This blog post showcases efficiency in Prompt Flow development by highlighting key strategies like leveraging Dev Containers for consistent environments, utilizing Azure Key Vault for secure secret management, and incorporating automation scripts to streamline setup and execution processes. 
Together, these reusable patterns form a comprehensive blueprint, empowering developers to refine and elevate their team&#8217;s workflow.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>This post focuses on tips and tricks to accelerate Prompt Flow development through the use of Dev Containers &amp; shell scripts.<\/p>\n","protected":false},"author":128246,"featured_media":15453,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[1,3451],"tags":[3530,3529],"class_list":["post-15452","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-cse","category-ise","tag-co-pilot","tag-promptflow"],"acf":[],"blog_post_summary":"<p>This post focuses on tips and tricks to accelerate Prompt Flow development through the use of Dev Containers &amp; shell scripts.<\/p>\n","_links":{"self":[{"href":"https:\/\/devblogs.microsoft.com\/ise\/wp-json\/wp\/v2\/posts\/15452","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/devblogs.microsoft.com\/ise\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/devblogs.microsoft.com\/ise\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/ise\/wp-json\/wp\/v2\/users\/128246"}],"replies":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/ise\/wp-json\/wp\/v2\/comments?post=15452"}],"version-history":[{"count":0,"href":"https:\/\/devblogs.microsoft.com\/ise\/wp-json\/wp\/v2\/posts\/15452\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/ise\/wp-json\/wp\/v2\/media\/15453"}],"wp:attachment":[{"href":"https:\/\/devblogs.microsoft.com\/ise\/wp-json\/wp\/v2\/media?parent=15452"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/ise\/wp-json\/wp\/v2\/categories?post=15452"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/devblogs.microsof
t.com\/ise\/wp-json\/wp\/v2\/tags?post=15452"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}