The time is finally here, Semantic Kernel’s Agent framework is now Generally Available! Available today as part of Semantic Kernel 1.45 (.NET) and 1.27 (Python), the Semantic Kernel Agent framework makes it easier for agents to coordinate and dramatically reduces the code developers need to write to build amazing AI applications.
What does Generally Available mean? When we mark an API as Generally Available, it means we have high confidence in the quality of the API surface for building AI applications and that we can support and maintain it going forward. We know that a stable and supported API is important for everyone building enterprise AI and agent applications on Semantic Kernel.
Creating agents with Semantic Kernel is easy (this example is in Python, full sample here for Python and C#):
import asyncio

from semantic_kernel.agents import ChatCompletionAgent
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion


async def main():
    # Initialize a chat agent with basic instructions
    agent = ChatCompletionAgent(
        service=AzureChatCompletion(),
        name="SK-Assistant",
        instructions="You are a helpful assistant.",
    )

    # Get a response to a user message
    response = await agent.get_response(messages="Write a haiku about Semantic Kernel.")
    print(response.content)

asyncio.run(main())
Enhance your agent with custom tools (plugins) and structured output (this example is in C#, full sample here for C# and Python):
using System.ComponentModel;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Agents;
using Microsoft.SemanticKernel.ChatCompletion;

var builder = Kernel.CreateBuilder();
builder.AddAzureOpenAIChatCompletion(
    Environment.GetEnvironmentVariable("AZURE_OPENAI_DEPLOYMENT"),
    Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT"),
    Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY")
);

var kernel = builder.Build();
kernel.Plugins.Add(KernelPluginFactory.CreateFromType<MenuPlugin>());

ChatCompletionAgent agent =
    new()
    {
        Name = "SK-Assistant",
        Instructions = "You are a helpful assistant.",
        Kernel = kernel,
        Arguments = new KernelArguments(new PromptExecutionSettings() { FunctionChoiceBehavior = FunctionChoiceBehavior.Auto() })
    };

ChatMessageContent message = new(AuthorRole.User, "What is the price of the soup special?");
await foreach (AgentResponseItem<ChatMessageContent> response in agent.InvokeAsync(message))
{
    Console.WriteLine(response.Message);
    // The price of the Clam Chowder, which is the soup special, is $9.99.
}

sealed class MenuPlugin
{
    [KernelFunction, Description("Provides a list of specials from the menu.")]
    public string GetSpecials() =>
        """
        Special Soup: Clam Chowder
        Special Salad: Cobb Salad
        Special Drink: Chai Tea
        """;

    [KernelFunction, Description("Provides the price of the requested menu item.")]
    public string GetItemPrice(
        [Description("The name of the menu item.")]
        string menuItem) =>
        "$9.99";
}
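The C# example above covers plugins; for structured output, the general idea is to pass a response format through the agent's execution settings. Here is a minimal Python sketch of that pattern using the `response_format` field on the OpenAI execution settings, assuming a model deployment that supports structured outputs; the `MenuSpecial` schema and the exact wiring are illustrative, not taken from the linked sample:

```python
import asyncio

from pydantic import BaseModel

from semantic_kernel.agents import ChatCompletionAgent
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion, OpenAIChatPromptExecutionSettings
from semantic_kernel.functions import KernelArguments


# Illustrative schema the model's reply should conform to
class MenuSpecial(BaseModel):
    item: str
    price: str


async def main():
    # Ask the service to return JSON that matches the MenuSpecial schema
    settings = OpenAIChatPromptExecutionSettings(response_format=MenuSpecial)

    agent = ChatCompletionAgent(
        service=AzureChatCompletion(),
        name="SK-Assistant",
        instructions="You are a helpful assistant.",
        arguments=KernelArguments(settings=settings),
    )

    response = await agent.get_response(messages="What is the soup special and its price?")

    # The content is a JSON string; validate it into the Pydantic model
    special = MenuSpecial.model_validate_json(str(response.content))
    print(special)

asyncio.run(main())
```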
Just because we’ve declared GA for the core Agent framework doesn’t mean we’re going to stop adding features. For example, you can now use agents as plugins for other agents, so a triage agent can route requests to specialist agents and fold their answers into its reply:
import asyncio

from semantic_kernel.agents import ChatCompletionAgent, ChatHistoryAgentThread
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion, OpenAIChatCompletion

billing_agent = ChatCompletionAgent(
    service=AzureChatCompletion(),
    name="BillingAgent",
    instructions="You handle billing issues like charges, payment methods, cycles, fees, discrepancies, and payment failures.",
)

refund_agent = ChatCompletionAgent(
    service=AzureChatCompletion(),
    name="RefundAgent",
    instructions="Assist users with refund inquiries, including eligibility, policies, processing, and status updates.",
)

triage_agent = ChatCompletionAgent(
    service=OpenAIChatCompletion(),
    name="TriageAgent",
    instructions="Evaluate user requests and forward them to BillingAgent or RefundAgent for targeted assistance."
    " Provide the full answer to the user containing any information from the agents",
    plugins=[billing_agent, refund_agent],
)


async def main() -> None:
    # The thread holds the conversation history across turns
    thread: ChatHistoryAgentThread = None

    print("Welcome to the chat bot!\n Type 'exit' to exit.\n Try to get some billing or refund help.")
    while True:
        user_input = input("User:> ")
        if user_input.lower().strip() == "exit":
            print("\n\nExiting chat...")
            return
        response = await triage_agent.get_response(
            messages=user_input,
            thread=thread,
        )
        if response:
            print(f"Agent :> {response}")
            # Carry the thread forward so the next turn sees the full context
            thread = response.thread

asyncio.run(main())
Or, connect to other managed agent platforms (a short sketch for one of these follows the list):
- Azure AI Agent Service (C#, Python)
- AutoGen (Python)
- AWS Bedrock (C#, Python)
- Crew AI (C#, Python)
- OpenAI Assistants (C#, Python)
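As one illustration of the pattern, here is a minimal Python sketch of hosting an agent on Azure AI Agent Service through the `AzureAIAgent` connector. It assumes your Azure AI Foundry project endpoint and model deployment are configured via environment variables (picked up by `AzureAIAgentSettings`), and the exact client setup may vary by Semantic Kernel version:

```python
import asyncio

from azure.identity.aio import DefaultAzureCredential
from semantic_kernel.agents import AzureAIAgent, AzureAIAgentSettings


async def main():
    # Assumes project/deployment configuration is provided via environment variables
    settings = AzureAIAgentSettings()

    async with (
        DefaultAzureCredential() as creds,
        AzureAIAgent.create_client(credential=creds) as client,
    ):
        # Create the agent definition on the Azure AI Agent Service side
        definition = await client.agents.create_agent(
            model=settings.model_deployment_name,
            name="SK-Assistant",
            instructions="You are a helpful assistant.",
        )

        # Wrap the remote definition in a Semantic Kernel agent
        agent = AzureAIAgent(client=client, definition=definition)

        response = await agent.get_response(messages="Write a haiku about Semantic Kernel.")
        print(response.content)

        # Clean up the hosted agent when done
        await client.agents.delete_agent(definition.id)

asyncio.run(main())
```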
Try out all of our ‘Getting Started with Agents’ samples for C# and Python, and check out all of the Agent framework documentation and samples (C#, Python).
Today’s Semantic Kernel announcement is part of a broader set of AI announcements we’re making as part of Microsoft’s 50th anniversary. You can find out more about AI Red Teaming and the AI Foundry Visual Studio Code extension in Asha’s blog post. Happy birthday Microsoft!
Thanks Shawn,
Definitely exciting times!!
Some questions, after reading this and the RC1 and RC2 announcements...
- On the "Agent as Plugin",
- When we add it, once it is invoked as a tool/plugin, will it be invoked with the main thread, so it takes part in the conversation/chat loop? :) - IMHO it should be like this.
- In .NET/C#, why not add it like in Python? ;)
At the moment in C# we have to do this:
<code>
While in Python it is just plugins=[billing_agent, refund_agent],
I'd like to do the same ;)...
I do not see the agent wrapped in the plugin function being invoked with the main agent/chat thread…
see https://github.com/microsoft/semantic-kernel/blob/0ff97c7fe73c5175df61ae627271341a32983c07/dotnet/src/Agents/Core/Functions/AgentKernelFunctionFactory.cs#L55
IMHO having this would be very convenient, so it can participate in and read the conversation thread… otherwise we have to manually pass all the context to it for it to function properly.
and here is this 🙂
https://github.com/microsoft/semantic-kernel/pull/11443