Three weeks ago we released Release the Agents! SK Agents Framework RC1, and we've been thrilled to see the momentum grow. Thank you to everyone who has shared feedback, filed issues, and started building with agents in Semantic Kernel—we're seeing more developers try agents than ever before.
Today, we’re declaring build 1.43 (.NET) and 1.26.1 (Python) as Release Candidate 2 of the Semantic Kernel Agent Framework.
With this release, we're introducing a small but impactful change to how agents handle chat message threads—one that sets the stage for powerful new capabilities coming soon.
From ChatMessageContent to AgentResponseItem<T>
In RC1, you would create a ChatHistory, add a ChatMessageContent, and when you called InvokeAsync() on an agent with the ChatHistory, you received a ChatMessageContent in return:
ChatHistory chat = [];
ChatMessageContent message = new(AuthorRole.User, "What is the special soup and how much does it cost?");
chat.Add(message);

await foreach (ChatMessageContent response in agent.InvokeAsync(chat))
{
    chat.Add(response);
    Console.WriteLine(response.Content);
}
In RC2, InvokeAsync() now returns an AgentResponseItem<ChatMessageContent> instead, which has an AgentThread associated with it:
await foreach (AgentResponseItem<ChatMessageContent> response in agent.InvokeAsync(message, agentThread)) // agentThread is optional
{
    agentThread = response.Thread;
    Console.WriteLine(response.Content);
}
In Python, this looks like:
response = await agent.get_response(
    messages=user_input,
    thread=thread,  # optional
)
print(f"# {response.name}: {response}")
thread = response.thread
The new common Invoke methods that we are introducing allow you to provide the message(s) you want to pass to the agent along with an optional AgentThread. If an AgentThread is provided, the conversation continues on that thread. If no AgentThread is provided, a new default thread is created and returned as part of the response. We've made this change across all of our agent types: ChatCompletion, AzureAIAgent, Bedrock, and OpenAIAssistant.
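To make that flow concrete, here is a minimal C# sketch of the pattern described above. It assumes an agent variable (for example, a ChatCompletionAgent) has already been configured, and the question strings are just placeholders. The first call passes no thread, so the agent creates a default one; the second call passes that thread back to continue the same conversation:

AgentThread? agentThread = null;

// First turn: no thread is supplied, so the agent creates a default one.
await foreach (AgentResponseItem<ChatMessageContent> response in agent.InvokeAsync(
    new ChatMessageContent(AuthorRole.User, "What is the special soup?"), agentThread))
{
    agentThread = response.Thread; // capture the thread created for this conversation
    Console.WriteLine(response.Content);
}

// Follow-up turn: pass the captured thread to continue the same conversation.
await foreach (AgentResponseItem<ChatMessageContent> response in agent.InvokeAsync(
    new ChatMessageContent(AuthorRole.User, "And how much does it cost?"), agentThread))
{
    Console.WriteLine(response.Content);
}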
You can find many samples using the new API pattern in our sample repositories (C#, Python).
Why Make This Change Now?
We take platform stability seriously on the Semantic Kernel team. As we approach general availability, we carefully evaluated whether to introduce this breaking change in RC2. Ultimately, we decided that doing it now—before locking down the API—is the right call to ensure long-term flexibility and developer clarity.
We’ve designed this change to be easy to adopt, and we’re here to help you migrate.
What’s Next?
We don’t expect RC2 to stick around long—we’ll be evaluating this update over the next few days and listening to your feedback. Please let us know what you think on GitHub Discussions!
The General Availability (GA) release of the Agent Framework is just around the corner. In the meantime, try out RC2, explore the new return type, and stay tuned for announcements around pluggable thread implementations, multi-agent orchestration, and more.
Let’s build great agents together.