Connect Logic Apps’ 1,400 connectors to Semantic Kernel

Matthew Bolanos

At the heart of every company are custom APIs built to solve specific business problems. These APIs are often built with Logic Apps, a powerful way to create APIs without writing code. With Logic Apps' more than 1,400 enterprise connectors, you can connect to your enterprise systems, transform data, and automate processes to meet your business needs.

We’ve seen customers build Logic Apps to help…

  • Automate development processes
  • Connect to legacy systems
  • Integrate with third-party services
  • And more!

Often, however, using these APIs is still a manual process. End users need to visit websites, fill out forms, and wait for responses. What if, instead, AI agents could interact with these APIs on behalf of end users? This is where Semantic Kernel comes in.

Semantic Kernel’s plugins

Within Semantic Kernel, it's possible to create plugins that interact with any logic, whether it's native code or an API. To make it even easier to import APIs as plugins, we've provided a simple import method (across all three SDKs) that takes any API described by an OpenAPI specification and converts it into a plugin.

C#

await kernel.ImportPluginFromOpenApiAsync(
    pluginName: "openapi_plugin",
    uri: new Uri("https://githubworkflowsplugin.azurewebsites.net/swagger.json"),
    executionParameters: new OpenApiFunctionExecutionParameters()
    {
        // Optionally override the server URL defined in the OpenAPI document
        ServerUrlOverride = new Uri("https://githubworkflowsplugin.azurewebsites.net"),
        EnablePayloadNamespacing = true
    }
);

Python

await kernel.add_plugin_from_openapi(
    plugin_name="openapi_plugin",
    plugin_url="https://githubworkflowsplugin.azurewebsites.net/swagger.json",
    execution_settings=OpenAPIFunctionExecutionParameters(
        http_client=http_client,
        enable_payload_namespacing=True,
    ),
)

Java

// Available as a sample at https://github.com/microsoft/semantic-kernel/tree/java-v1/java/samples/semantickernel-openapi-plugin

KernelPlugin openAPIPlugin = SemanticKernelOpenAPIImporter
    .builder()
    .withPluginName("light_plugin")
    .withServer("https://githubworkflowsplugin.azurewebsites.net")
    .withSchema(swaggerSchema)
    .build();

Kernel kernel = Kernel.builder()
    .withAIService(ChatCompletionService.class, chat)
    .withPlugin(openAPIPlugin)
    .build();

This means you only need to author an API once and then you can use it in any language.

You can now import Logic Apps as plugins

Because Logic Apps now supports OpenAPI specifications and the OpenAI plugin manifest, you can easily take any of the APIs you have previously built and import them as plugins. To enable this functionality, you simply need to update your host.json file to expose two new unauthenticated endpoints:

  • /.well-known/ai-plugin.json
  • /swagger.json

Afterwards, you can apply Easy Auth to your individual APIs to secure them so that they can be used with Semantic Kernel.

Building a GitHub Copilot with Logic Apps and Semantic Kernel

Many customers today already have Logic Apps that help them manage their GitHub repositories and build automation processes. With Semantic Kernel, you can now take these same Logic Apps and give them to your AI agents as tools. This means your AI agents can help you identify PRs that need attention, explain why a build failed, and even help you write code.

As part of our push to v1 for each of our SDKs, we created a Logic App that would help us identify differences between our implementations. This included building the following workflows:

[Image: screenshot of the Logic App workflows]

Each workflow helps automate a different part of the development process:

  • Search files – to find code snippets that are relevant to a given problem
  • Get file – to retrieve the contents of a file in the GitHub repository
  • Get PR details – to retrieve the details of a PR (e.g., the PR title, description, and author)
  • Get PR files – to retrieve the files that were changed in a PR
  • Get build and test failures – to retrieve the build and test failures for a given GitHub action run
  • Get log file – to retrieve the log file for a given GitHub action run

The most complicated of these Logic Apps (Get build and test failures) would traditionally have required a developer to write hundreds of lines of code, but with Logic Apps, it's as simple as dragging and dropping connectors to retrieve the necessary information, parse it, and return the specific problems identified in the build. This increases the speed at which you can build these APIs for your users and, as of today, the speed at which you can augment the capabilities of your AI agents.

After importing these APIs into Semantic Kernel, our agent can now use them to 1) explain concepts in Semantic Kernel, 2) identify issues in PRs, and 3) help write code. Below is a real transcript of a conversation between a developer and the AI agent:

User

I’m reviewing https://github.com/microsoft/semantic-kernel/pull/6318, can you quickly summarize what’s changing?

ToolCall

GitHub_GetFilesInPR_API_invoke({"pr":"6318"})

Assistant

The PR #6318 in the Semantic Kernel repository has several changes primarily focusing on integration tests and some enhancement to plugins and settings. Here’s a summary of what’s changing:

Updated Files

  1. Integration Test Data Updates and Additions:
    • Various JSON data files for expected results in tests (`prompt_with_chat_roles_expected.json`, `light_bulb_api.json`, `prompt_simple_expected.json`, etc.)
    • YAML test configuration files (`simple_prompt_test.yaml`, `prompt_with_chat_roles_test_j2.yaml`, etc.)
  2. Test Scripts:
    • Updated the main testing script for cross-language integration tests (`test_cross_language.py`).
  3. Connector and Plugin Enhancements:
    • Enhanced the Open AI prompt execution settings module (`open_ai_prompt_execution_settings.py`).
    • Provided a new plugin sample using OpenAI with Azure Key Vault integration (`openai_plugin_azure_key_vault.py`).

This PR appears to focus on enhancing testing infrastructure and slight modifications in connectors and plugin configurations to support broader or more specific test scenarios and security features.

User

Can you tell me how the execution settings are changing compared to main?

ToolCall

GitHub_GetFile_API_invoke({"file_path":"python/semantic_kernel/connectors/ai/open_ai/prompt_execution_settings/open_ai_prompt_execution_settings.py","ref":"heads/main","repo":"microsoft/semantic-kernel"})

ToolCall

GitHub_GetFile_API_invoke({"file_path":"python/semantic_kernel/connectors/ai/open_ai/prompt_execution_settings/open_ai_prompt_execution_settings.py","ref":"pull/6318/head","repo":"microsoft/semantic-kernel"})

Assistant

The changes in the `OpenAIPromptExecutionSettings` in the PR relative to the main branch involve several flexibility enhancements and adjustments to default values, providing a more dynamic and optionally specified configuration. Here are the primary changes:

  1. Type Adjustments and Optional Fields:

    • Many fields such as `ai_model_id`, `frequency_penalty`, `logit_bias`, `max_tokens`, `number_of_responses`, `presence_penalty`, `seed`, `temperature`, and `top_p` have been changed to either `Optional` or introduced type unions allowing for `None` values. This permits these parameters to be unspecified, enhancing default handling and dynamic configuration.
  2. Removal of Fixed Default Values:
    • Default values for fields like `frequency_penalty`, `temperature`, `top_p`, and others have been removed, suggesting that these settings will either be inherited from a broader context or will need explicit specification when used.
  3. Consistency and Validation Logic:
    • The internal validation within the settings class remains consistent, ensuring logical correctness when both `best_of` and `number_of_responses` settings are used together.

These changes suggest a move towards more adaptable and potentially context-dependent execution settings for OpenAI integrations in Semantic Kernel, likely aiming to accommodate a wider range of scenarios or simpler integration without cumbersome mandatory settings.

User

Can you compare this with the .NET implementation? Are the behaviors the same?

ToolCall

GitHub_SearchFiles_API_invoke({"language":"c#","query":"OpenAIPromptExecutionSettings","ref":"heads/main"})

ToolCall

GitHub_GetFile_API_invoke({"file_path":"dotnet/src/Connectors/Connectors.OpenAI/OpenAIPromptExecutionSettings.cs","ref":"heads/main","repo":"microsoft/semantic-kernel"})

Assistant

The .NET implementation of `OpenAIPromptExecutionSettings` in the Semantic Kernel significantly aligns with the Python version in functionality but exhibits some differences in structure and additional behaviors:

Similarities:

  1. Parametrization: Both implementations provide parameters like `Temperature`, `TopP`, `PresencePenalty`, `FrequencyPenalty`, `MaxTokens`, and setup for `StopSequences`.
  2. Dynamism: They allow flexibility with optional or nullable parameters, enabling dynamic configuration based on varying conditions.
  3. Validation and Adaptation: Both are designed to ensure logical and functional validation for parameter interactions, keeping the behavior similar across executions.

….
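For reference, the conversation above relies on Semantic Kernel's automatic function calling: the model decides which of the imported Logic App functions to call, and the kernel invokes them and feeds the results back to the model. Below is a minimal C# sketch of that wiring; it assumes the kernel already has a chat completion service and the GitHub Logic App plugin imported (the import itself is covered in the Getting started section).

using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;

// Let the model pick Logic App-backed functions and invoke them automatically.
var settings = new OpenAIPromptExecutionSettings
{
    ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions
};

var chatService = kernel.GetRequiredService<IChatCompletionService>();
var history = new ChatHistory();
history.AddUserMessage(
    "I'm reviewing https://github.com/microsoft/semantic-kernel/pull/6318, " +
    "can you quickly summarize what's changing?");

// Behind the scenes, the model calls functions such as the "Get PR files" workflow.
var reply = await chatService.GetChatMessageContentAsync(history, settings, kernel);
Console.WriteLine(reply);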

Getting started

To get started, you can follow the steps below to import your Logic Apps as plugins into Semantic Kernel:

  1. Enable metadata endpoints for Logic App
  2. Configure your application for Easy Auth
  3. Use Logic Apps with Semantic Kernel as a plugin

Enable metadata endpoints for Logic App

For the easiest setup, you can enable unauthenticated access to the metadata endpoints for your Logic App. This will allow you to import your Logic App as a plugin into Semantic Kernel without needing to create a custom HTTP client to handle authentication for the initial import.

The host.json file below will create two unauthenticated endpoints. You can make this change in the Azure portal by opening the Kudu console and editing the host.json file located at C:\home\site\wwwroot\host.json.

{ 
  "version": "2.0", 
  "extensionBundle": { 
    "id": "Microsoft.Azure.Functions.ExtensionBundle.Workflows", 
    "version": "[1.*, 2.0.0)" 
  }, 
  "extensions": { 
    "http": { 
      "routePrefix": "" 
    }, 
    "workflow": { 
      "MetadataEndpoints": { 
        "plugin": { 
          "enable": true, 
          "Authentication":{ 
              "Type":"Anonymous" 
          } 
        }, 
        "openapi": { 
          "enable": true, 
          "Authentication":{ 
              "Type":"Anonymous" 
          } 
        } 
      }, 
      "Settings": { 
        "Runtime.Triggers.RequestTriggerDefaultApiVersion": "2020-05-01-preview" 
      } 
    } 
  } 
} 
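Before moving on, you can quickly confirm that the two endpoints respond without credentials. Below is a small, hedged C# check; [LOGIC_APP_NAME] is a placeholder for your Logic App's host name.

using System.Net.Http;

// Fetch both metadata endpoints anonymously to confirm they are enabled.
using var client = new HttpClient();

string swagger = await client.GetStringAsync(
    "https://[LOGIC_APP_NAME].azurewebsites.net/swagger.json");
string manifest = await client.GetStringAsync(
    "https://[LOGIC_APP_NAME].azurewebsites.net/.well-known/ai-plugin.json");

Console.WriteLine($"swagger.json: {swagger.Length} bytes, ai-plugin.json: {manifest.Length} bytes");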

Configure your application for Easy Auth

You now want to secure your Logic App workflows so only authorized users can access them. You can do this by enabling Easy Auth on your Logic App. This will allow you to use the same authentication mechanism as your other Azure services, making it easier to manage your security policies.

For an in-depth walkthrough on setting up Easy Auth, refer to this great tutorial by Arjun Chiddarwar: "Trigger workflows in Standard logic apps with Easy Auth."

For those already familiar with Easy Auth (and who already have an Entra client app you want to use), this is the configuration you'll want to send to the Azure Management API.

PUT https://management.azure.com/subscriptions/<subscription_id>/resourceGroups/<resource_group>/providers/Microsoft.Web/sites/<app_name>/config/authsettingsV2?api-version=2022-03-01 HTTP/1.1
content-type: application/json
Authorization: Bearer <ARM_Token>

{
    "properties": {
        "platform": {
            "enabled": true,
            "runtimeVersion": "~1"
        },
        "globalValidation": {
            "requireAuthentication": true,
            "unauthenticatedClientAction": "AllowAnonymous"
        },
        "identityProviders": {
            "azureActiveDirectory": {
                "enabled": true,
                "registration": {
                    "openIdIssuer": "https://sts.windows.net/[TENANT_ID]/",
                    "clientId": "[AAD_CLIENT_ID]"
                },
                "validation": {
                    "jwtClaimChecks": {},
                    "allowedAudiences": [
                        "api://[AAD_CLIENT_ID]"
                    ],
                    "defaultAuthorizationPolicy": {
                        "allowedPrincipals": {
                            "identities": [
                                "[OBJECT_ID_FOR_USER1]",
                                "[OBJECT_ID_FOR_USER2]",
                                "[OBJECT_ID_FOR_APP1]"
                            ]
                        }
                    }
                }
            },
            "facebook": {
                "enabled": false,
                "registration": {},
                "login": {}
            },
            "gitHub": {
                "enabled": false,
                "registration": {},
                "login": {}
            },
            "google": {
                "enabled": false,
                "registration": {},
                "login": {},
                "validation": {}
            },
            "twitter": {
                "enabled": false,
                "registration": {}
            },
            "legacyMicrosoftAccount": {
                "enabled": false,
                "registration": {},
                "login": {},
                "validation": {}
            },
            "apple": {
                "enabled": false,
                "registration": {},
                "login": {}
            }
        }
    }
}
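If you'd rather apply this configuration from code than craft the raw request by hand, a minimal C# sketch follows. It assumes you've saved the JSON payload above to authsettings.json and already have an ARM bearer token (for example, from the Azure CLI); the subscription, resource group, and app name are placeholders.

using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;

// Placeholders: supply your own ARM token, subscription, resource group, and Logic App name.
string armToken = "<ARM_Token>";
string url =
    "https://management.azure.com/subscriptions/<subscription_id>/resourceGroups/<resource_group>" +
    "/providers/Microsoft.Web/sites/<app_name>/config/authsettingsV2?api-version=2022-03-01";

using var http = new HttpClient();
http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", armToken);

// The request body is the authsettingsV2 payload shown above, saved locally as authsettings.json.
var body = new StringContent(File.ReadAllText("authsettings.json"), Encoding.UTF8, "application/json");
HttpResponseMessage response = await http.PutAsync(url, body);
Console.WriteLine($"Easy Auth configuration returned {(int)response.StatusCode}");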

Use Logic Apps with Semantic Kernel as a plugin

Now that you have your Logic App secured and the metadata endpoints enabled, you’ve finished all the hard parts. You can now import your Logic App as a plugin into Semantic Kernel using the OpenAPI import method.

When you create your plugin, you'll want to provide a custom HTTP client that can handle authentication for your Logic App. This allows you to use the plugin in your AI agents without needing to worry about authentication.

Below is an example in C# that uses interactive authentication to acquire a token and authenticate the user for the Logic App.

string ClientId = "[AAD_CLIENT_ID]";
string TenantId = "[TENANT_ID]";
string Authority = $"https://login.microsoftonline.com/{TenantId}";
string[] Scopes = new string[] { "api://[AAD_CLIENT_ID]/SKLogicApp" };

var app = PublicClientApplicationBuilder.Create(ClientId)
            .WithAuthority(Authority)
            .WithDefaultRedirectUri() // Uses http://localhost for a console app
            .Build();

AuthenticationResult authResult = null;
try
{
    authResult = await app.AcquireTokenInteractive(Scopes).ExecuteAsync();
}
catch (MsalException ex)
{
    Console.WriteLine("An error occurred acquiring the token: " + ex.Message);
}

// Add the Logic App plugin to the kernel with the OpenAI import method
kernel.Plugins.Add(await kernel.CreatePluginFromOpenAIAsync(
    pluginName: "[NAME_OF_PLUGIN]",
    uri: new Uri("https://[LOGIC_APP_NAME].azurewebsites.net/.well-known/ai-plugin.json"),
    executionParameters: new OpenAIFunctionExecutionParameters()
    {
        HttpClient = new HttpClient()
        {
            DefaultRequestHeaders =
            {
                Authorization = new AuthenticationHeaderValue("Bearer", authResult.AccessToken)
            }
        },
    }
));

// ... or create the plugin from the OpenAPI document instead
kernel.Plugins.Add(await kernel.CreatePluginFromOpenApiAsync(
    pluginName: "[NAME_OF_PLUGIN]",
    uri: new Uri("https://[LOGIC_APP_NAME].azurewebsites.net/swagger.json"),
    executionParameters: new OpenApiFunctionExecutionParameters()
    {
        HttpClient = new HttpClient()
        {
            DefaultRequestHeaders =
            {
                Authorization = new AuthenticationHeaderValue("Bearer", authResult.AccessToken)
            }
        },
    }
));
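Once the import succeeds, each workflow in your Logic App is exposed as a function on the plugin, and you can invoke one directly as a quick sanity check. The function and argument names below are illustrative; use the operation names your Logic App exposes in its swagger.json.

// Illustrative names only: match them to the operations in your Logic App's OpenAPI description.
var result = await kernel.InvokeAsync(
    pluginName: "[NAME_OF_PLUGIN]",
    functionName: "GetPRDetails",
    arguments: new KernelArguments { ["pr"] = "6318" });

Console.WriteLine(result);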

Congratulations!

You’ve now successfully imported your Logic App as a plugin into Semantic Kernel. You can now use your Logic App in your AI agents to help automate your development processes, connect to legacy systems, integrate with third-party services, and more!

Let us know how you're using Logic Apps with Semantic Kernel in the comments below. We plan to deepen the connection between Semantic Kernel and Logic Apps even further, so we'd love to hear about your experiences and how we can help you build even more powerful AI agents.
