How to use plugins with Semantic Kernel

Nilesh Acharya


Have you wondered how you can use plugins with Semantic Kernel? Or maybe you want to use plugins in your own Copilot Chat app, just like our sample app?

Well, I have answers for you. I will guide you through an example that uses Semantic Kernel with a GitHub OpenAPI plugin to build a console chat experience using the action planner and a chat completion skill.

To get the best out of this blog, go ahead and clone the Semantic Kernel repo and hit that star button!

Let’s get started. Once you have cloned the repository, navigate to ‘semantic-kernel/samples/dotnet/openapi-skills’. The example requires a small amount of setup, all defined in the appsettings.json file. You need to update the AIService and GitHub plugin parameters. These define whether you will be using Azure OpenAI or OpenAI, which GitHub repo you want to use, and the GitHub access token. If you don’t have an access token, you can create one in your GitHub developer settings.

"AIService": {
    "Type": "AzureOpenAI", // "AzureOpenAI" or "OpenAI"
    "Endpoint": "", // ignored when AIService is "OpenAI"
    "CompletionTokenLimit": 4096,
    // "Key": ""
    "Models": {
        "Completion": "gpt-35-turbo" // For OpenAI, change to 'gpt-3.5-turbo' (with a period).
    }
},
"GitHubSkill": {
    "Owner": "microsoft",
    "Repository": "semantic-kernel"
    // "Key": ""
},

If you are using Visual Studio (2019 or newer), right-click on the OpenApiSkillsExample project, select “Set as Startup Project”, then press F5 to run and debug the application. Alternatively, open a terminal window, change directory to the OpenApiSkillsExample project, then run ‘dotnet run’.

If you are interested in learning more about how these settings are used, refer to the Program.cs file, where the ‘magic’ happens. Let’s step through what is going on.

The top of the file loads assemblies and dependencies, followed by loading the configuration from appsettings.json and setting up a logger for debugging.
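Roughly, that bootstrap looks something like the sketch below. This is a simplified illustration, not the sample’s exact code: the options class name (AIServiceOptions) and its PropertyName constant follow the conventions visible later in this post, but the file list and logger setup are assumptions.

```csharp
// Sketch: load configuration and a console logger (names assumed from the sample's conventions).
IConfigurationRoot configuration = new ConfigurationBuilder()
    .AddJsonFile("appsettings.json", optional: false)
    .AddUserSecrets<Program>(optional: true)
    .Build();

using ILoggerFactory loggerFactory = LoggerFactory.Create(builder => builder.AddConsole());
ILogger logger = loggerFactory.CreateLogger<Program>();

// Bind and validate the AIService section shown above.
AIServiceOptions aiOptions = configuration.GetRequiredSection(AIServiceOptions.PropertyName).Get<AIServiceOptions>()
    ?? throw new InvalidOperationException($"Missing configuration for {AIServiceOptions.PropertyName}.");
```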

Then the Semantic Kernel is initialized using the configured AIService, with some validation.

        IKernel kernel = Kernel.Builder
            .WithLogger(loggerFactory.CreateLogger<IKernel>())
            .Configure(c =>
            {
                switch (aiOptions.Type)
                {
                    case AIServiceOptions.AIServiceType.AzureOpenAI:
                        c.AddAzureChatCompletionService(aiOptions.Models.Completion, aiOptions.Endpoint, aiOptions.Key);
                        break;
                    case AIServiceOptions.AIServiceType.OpenAI:
                        c.AddOpenAIChatCompletionService(aiOptions.Models.Completion, aiOptions.Key);
                        break;
                    default:
                        throw new InvalidOperationException($"Unhandled AI service type {aiOptions.Type}");
                }
            })
            .Build();

Next we register the GitHub plugin using an OpenAPI definition; for this example we are referencing the pull request GET operations. Feel free to experiment by adding other GitHub operations to get more familiar with this concept.

        GitHubSkillOptions gitHubOptions = configuration.GetRequiredSection(GitHubSkillOptions.PropertyName).Get<GitHubSkillOptions>()
            ?? throw new InvalidOperationException($"Missing configuration for {GitHubSkillOptions.PropertyName}.");

        BearerAuthenticationProvider authenticationProvider = new(() => Task.FromResult(gitHubOptions.Key));

        await kernel.ImportOpenApiSkillFromFileAsync(
            skillName: "GitHubSkill",
            filePath: Path.Combine(Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location)!, "GitHubSkill/openapi.json"),
            authCallback: authenticationProvider.AuthenticateRequestAsync);
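Once imported, each operation in the OpenAPI definition becomes a callable Semantic Kernel function. As an illustration, you could invoke one directly — note that the operation name "PullsList" here is hypothetical and must match an operationId from GitHubSkill/openapi.json:

```csharp
// Hypothetical direct call to one of the imported operations.
ISKFunction listPulls = kernel.Skills.GetFunction("GitHubSkill", "PullsList");

ContextVariables variables = new();
variables.Set("owner", gitHubOptions.Owner);
variables.Set("repo", gitHubOptions.Repository);

SKContext result = await kernel.RunAsync(variables, listPulls);
Console.WriteLine(result.Result);
```

The planner does exactly this under the hood — it just decides *which* imported function to call based on the user’s input.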

Now that we have our plugin defined, we can set up the Semantic Kernel planner to invoke the GitHub plugin. We will use the action planner, which is great for plans with zero or one steps. If you need chaining or want more steps in your plan, the sequential planner would be a great alternative.

We create a new chat instance and provide a prompt to define the experience we want from the bot.

ActionPlanner planner = new(kernel, logger: logger);

// Chat loop
IChatCompletion chatGPT = kernel.GetService<IChatCompletion>();
OpenAIChatHistory chatHistory = (OpenAIChatHistory)chatGPT.CreateNewChat("You are a helpful, friendly, intelligent assistant that is good at conversation.");
while (true)
{
    Console.WriteLine("----------------");
    Console.Write("Input: ");
    string? input = Console.ReadLine();

    if (string.IsNullOrWhiteSpace(input))
    {
        continue;
    }

    // Add GitHub's response, if any, to the chat history.
    int planResultTokenAllowance = (int)(aiOptions.TokenLimit * 0.25); // Allow up to 25% of our token limit to be from GitHub.
    string planResult = await PlanGitHubSkill(gitHubOptions, planner, chatHistory, input, planResultTokenAllowance);
    if (!string.IsNullOrWhiteSpace(planResult))
    {
        chatHistory.AddUserMessage(planResult);
    }

    // Add the user's input to the chat history.
    chatHistory.AddUserMessage(input);

    // ... the assistant's reply is then generated from chatHistory and printed.
}
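The loop delegates the planning work to a PlanGitHubSkill helper defined further down in Program.cs. A sketch of its shape is below — the signature mirrors the call site above, but the body (in particular the result-trimming heuristic) is an assumption, not the sample’s exact implementation:

```csharp
// Sketch: ask the action planner for a zero-or-one-step plan for the user's
// input, run it against the configured repo, and trim the result so GitHub's
// response stays within its token allowance.
static async Task<string> PlanGitHubSkill(
    GitHubSkillOptions gitHubOptions,
    ActionPlanner planner,
    OpenAIChatHistory chatHistory, // kept to match the call site; unused in this sketch
    string input,
    int tokenAllowance)
{
    Plan plan = await planner.CreatePlanAsync(input);
    if (plan.Steps.Count == 0)
    {
        return string.Empty; // The planner found no relevant GitHub operation.
    }

    plan.State.Set("owner", gitHubOptions.Owner);
    plan.State.Set("repo", gitHubOptions.Repository);

    SKContext result = await plan.InvokeAsync();

    // Rough chars-per-token heuristic (assumed); the sample's trimming may differ.
    int maxChars = tokenAllowance * 4;
    string planResult = result.Result;
    return planResult.Length > maxChars ? planResult.Substring(0, maxChars) : planResult;
}
```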

That’s it! We now have a console chat bot using an OpenAPI plugin with Semantic Kernel! Enjoy!

Please share in the comments what you think and what plugins you tried!

Next Steps:

Try the Copilot Chat sample application and explore other plugins such as Jira, Klarna Shopping, Microsoft Graph, and GitHub.

Join the community and let us know what you think: https://aka.ms/sk/discord
