Today on the Semantic Kernel blog we’re excited to welcome a group of guest authors from Microsoft. We’ll turn it over to Riccardo Chiodaroli, Samer El Housseini, Daniel Labbe and Fabrizio Ruocco to dive into their use cases with Semantic Kernel and Copilot Studio.
In today’s fast-paced digital economy, intelligent automation is no longer optional—it’s an essential capability for organizations striving to remain competitive and agile. Modern business success depends not merely on adopting advanced technologies, but on seamlessly integrating them into existing operations to enhance productivity, improve customer experiences, and drive faster, data-driven decision making.
In pursuit of these capabilities, enterprises are increasingly turning towards flexible AI-powered frameworks like Semantic Kernel. Designed for versatility, scalability, and ease of integration, Semantic Kernel empowers businesses to streamline their complex workflows, democratize data access, and enhance their overall digital transformation processes.
In this post series, we will examine some advanced usage scenarios powered by Semantic Kernel, highlighting the practical business benefits and foundational technical approaches for each, enabling organizations to align these capabilities directly with their strategic objectives.
Low-Code Meets Pro-Code
Today’s enterprises rely on intelligent agents to streamline processes, increase efficiency, and improve user interactions.
Microsoft Copilot Studio has quickly emerged as an innovative, low-code tool for building intelligent agents that leverage the vast Microsoft 365 ecosystem. It enables users with varying skillsets—from citizen developers to business analysts—to rapidly deploy intelligent AI agents and automate workflows.
Still, real-world business scenarios often require highly specific logic, advanced integrations with existing systems, or complex processing that standard connectors simply can't provide. A pro-code extension, with its rich customization options, allows companies to:
- Address business-critical requirements with bespoke logic
- Incorporate advanced AI techniques and tailored data processing
- Enhance customer service, employee productivity, or operational efficiency using specialized APIs
- Ensure continuous scalability and flexibility in response to rapidly evolving business needs
Semantic Kernel brings sophisticated AI processing and natural language understanding to these custom integrations, resulting in intelligent, contextually aware interactions with users.
Technical Overview
To effectively bridge the gap between Copilot Studio’s easy-to-use, low-code UI and advanced custom logic, the provided sample leverages a structured architecture:
- Microsoft Copilot Studio acts as the no-code/low-code frontend for creating, managing, and orchestrating AI skills.
- Azure Bot Service serves as the main API entry point, transferring requests and responses between Copilot Studio and your custom API.
- Semantic Kernel API (Host Application), built on Microsoft’s Semantic Kernel framework, resides in Azure Container Apps. It processes requests, performs intelligent computation (using generative AI via Azure OpenAI services), and returns responses through Azure Bot Service (a minimal sketch of its /api/messages endpoint follows this list).
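To make the hosting side concrete, here is a minimal sketch of an /api/messages entry point, assuming the Bot Framework Python SDK (botbuilder) with its aiohttp integration. The actual sample wires its host application differently, so treat this purely as an illustration of where the incoming activities land.

# A minimal illustration only; credentials and bot logic are placeholders.
from aiohttp import web
from botbuilder.core import ActivityHandler, TurnContext
from botbuilder.integration.aiohttp import (
    CloudAdapter,
    ConfigurationBotFrameworkAuthentication,
)

class BotConfig:
    # Placeholder credentials; in practice these come from app settings.
    APP_ID = ""
    APP_PASSWORD = ""
    APP_TYPE = "MultiTenant"
    APP_TENANTID = ""

class SkillBot(ActivityHandler):
    async def on_message_activity(self, turn_context: TurnContext):
        # The Semantic Kernel logic shown later in this post would run here;
        # this placeholder just echoes the incoming text.
        await turn_context.send_activity(f"Echo: {turn_context.activity.text}")

ADAPTER = CloudAdapter(ConfigurationBotFrameworkAuthentication(BotConfig()))
BOT = SkillBot()

async def messages(req: web.Request) -> web.Response:
    # Azure Bot Service POSTs activities here; the adapter authenticates the
    # request and dispatches the activity to the bot.
    return await ADAPTER.process(req, BOT)

APP = web.Application()
APP.router.add_post("/api/messages", messages)

if __name__ == "__main__":
    web.run_app(APP, port=3978)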
Here’s the high-level workflow:
- Copilot Studio initiates a request—such as querying complex data or triggering advanced processing.
- Azure Bot Service forwards these requests securely to a custom API running within Azure Container Apps.
- The Semantic Kernel-powered custom API executes tasks (e.g., sophisticated reasoning, generative AI interactions, integrations with your data sources), processes the information, and returns intelligent responses.
- Azure Bot Service coordinates responses back to Copilot Studio, providing users instant access to powerful, custom functionality.
An initial “handshake” registers the custom API as a Copilot Studio skill: Copilot Studio automatically fetches the required manifest data from the Semantic Kernel application, ensuring seamless communication.
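During that handshake, Copilot Studio reads a standard Bot Framework skill manifest. As a rough sketch with placeholder values (the real sample assembles this from its own configuration and serves it as JSON over HTTP), the manifest data exposed by the host application looks like this:

# Placeholder values for illustration; the real sample builds this from configuration.
SKILL_MANIFEST = {
    "$schema": "<Bot Framework skill manifest schema URL for the version in use>",
    "$id": "SemanticKernelSkill",
    "name": "Semantic Kernel Skill",
    "version": "1.0",
    "description": "Custom Semantic Kernel logic exposed as a Copilot Studio skill",
    "publisherName": "Contoso",
    "endpoints": [
        {
            "name": "default",
            "protocol": "BotFrameworkV3",
            "description": "Default endpoint",
            "endpointUrl": "https://<your-container-app>/api/messages",
            "msAppId": "<bot-app-registration-id>",
        }
    ],
}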
The end result is shown below: Copilot Studio lets you add skills to the agent and then invoke them in the topic flows:
The following code shows the core of the interaction: when a new message Activity is received, the ChatHistory can be restored from the conversation state (TurnState) and sent to a ChatCompletionAgent (or any other kind of agent) for processing.
# `bot` (the hosting application), `state` (its TurnState) and `agent` (the
# Semantic Kernel agent) are created and imported elsewhere in the full sample.
@bot.activity("message")
async def on_message(context: TurnContext, state: TurnState):
    user_message = context.activity.text
    # Get the chat_history from the conversation state
    chat_history: ChatHistory = state.conversation.get("chat_history")
    # Add the new user message
    chat_history.add_user_message(user_message)
    # Get the response from the Semantic Kernel agent (v1.22.0 and later)
    sk_response = await agent.get_response(history=chat_history, user_input=user_message)
    # Store the updated chat_history back into conversation state
    state.conversation["chat_history"] = chat_history
    # Send the response back to the user
    # NOTE: in the context of a Copilot Studio skill, the response is sent
    # as a Response from the /api/messages endpoint
    await context.send_activity(MessageFactory.text(sk_response, input_hint=InputHints.ignoring_input))
    # Skills must send an EndOfConversation activity to indicate the conversation is complete
    # NOTE: this is a simple example; in a real skill you would likely send this
    # only when the user has completed their task
    end = Activity.create_end_of_conversation_activity()
    end.code = EndOfConversationCodes.completed_successfully
    await context.send_activity(end)
    return True
The full code sample can be found in the Semantic Kernel official demos. Also refer to the Microsoft Copilot Studio documentation to learn more about skills.
Two-way synergy: Semantic Kernel and Copilot Studio agents
Similarly, even pro-code solutions can benefit from a low-code approach. Defining an agent visually in a graphical design tool lets power users develop and manage agents autonomously, according to their own needs and at their own pace. But how can such an agent then be integrated into a code-based solution?
Thankfully, Semantic Kernel features a future-proof, extensible design that makes it easy to plug in additional kinds of agents. In this case, Copilot Studio offers a default publishing channel over the Direct Line API, which we can use to connect seamlessly from a new DirectLineAgent class.
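Under the hood, such an agent boils down to a Direct Line 3.0 REST round trip: start a conversation, post the user's message as an activity, and read back the agent's reply activities. The standalone sketch below illustrates that exchange in isolation; it is not the sample's actual DirectLineAgent implementation, and the polling is deliberately simplified.

# Simplified, standalone illustration of the Direct Line 3.0 exchange that a
# DirectLineAgent-style class wraps; not the sample's actual implementation.
import time
import requests

# The global Direct Line endpoint; the sample's bot_endpoint setting can point elsewhere.
DIRECT_LINE = "https://directline.botframework.com/v3/directline"

def ask_copilot_studio(bot_secret: str, text: str) -> list[str]:
    headers = {"Authorization": f"Bearer {bot_secret}"}
    # 1. Start a Direct Line conversation
    conversation = requests.post(f"{DIRECT_LINE}/conversations", headers=headers).json()
    conversation_id = conversation["conversationId"]
    # 2. Send the user's message as an activity
    requests.post(
        f"{DIRECT_LINE}/conversations/{conversation_id}/activities",
        headers=headers,
        json={"type": "message", "from": {"id": "user"}, "text": text},
    )
    # 3. Poll for the agent's reply activities (a real implementation would
    # track the watermark and keep polling until a reply actually arrives)
    time.sleep(2)
    activities = requests.get(
        f"{DIRECT_LINE}/conversations/{conversation_id}/activities", headers=headers
    ).json()["activities"]
    return [a["text"] for a in activities if a["from"]["id"] != "user" and a.get("text")]

The sample's DirectLineAgent hides this exchange behind the standard Semantic Kernel agent interface, so on the calling side it is simply configured and invoked like any other agent: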
import logging
import os

import chainlit as cl
from semantic_kernel.contents import ChatHistory

# DirectLineAgent is the custom agent class introduced above (defined in the full sample)
logger = logging.getLogger(__name__)

agent = DirectLineAgent(
    id="copilot_studio",
    name="copilot_studio",
    description="copilot_studio",
    bot_secret=os.getenv("BOT_SECRET"),
    bot_endpoint=os.getenv("BOT_ENDPOINT"),
)

@cl.on_chat_start
async def on_chat_start():
    # Each Chainlit session starts with an empty chat history
    cl.user_session.set("chat_history", ChatHistory())

@cl.on_message
async def on_message(message: cl.Message):
    chat_history: ChatHistory = cl.user_session.get("chat_history")
    chat_history.add_user_message(message.content)
    response = await agent.get_response(history=chat_history)
    cl.user_session.set("chat_history", chat_history)
    logger.info(f"Response: {response}")
    await cl.Message(content=response.content, author=agent.name).send()
Here you can see a Chainlit chat app talking directly to the Copilot Studio agent, providing answers from the configured knowledge base via the default experience:
This approach then lets us use Semantic Kernel's orchestration capabilities across multiple types of agents, regardless of whether they run locally or remotely: you can mix this agent with Azure AI Agent or ChatCompletionAgent instances in an AgentGroupChat, as sketched below.
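As a rough illustration of that mix (the agent name, instructions and service configuration here are illustrative assumptions, not taken from the sample), an AgentGroupChat pairing the Copilot Studio-backed agent with a regular ChatCompletionAgent could look like this:

import asyncio

from semantic_kernel.agents import AgentGroupChat, ChatCompletionAgent
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion

# A regular chat-completion agent that reviews what the Copilot Studio agent returns.
# On older Semantic Kernel versions, pass kernel= and service_id= instead of service=.
reviewer = ChatCompletionAgent(
    service=AzureChatCompletion(),  # Azure OpenAI settings are read from the environment
    name="reviewer",
    instructions="Review and refine the answer produced by the other agent.",
)

# `agent` is the DirectLineAgent created earlier
group_chat = AgentGroupChat(agents=[agent, reviewer])

async def run() -> None:
    # Older versions expect a ChatMessageContent here instead of a plain string
    await group_chat.add_chat_message("What is our travel reimbursement policy?")
    async for message in group_chat.invoke():
        print(f"[{message.name}]: {message.content}")

asyncio.run(run())

A real setup would also configure selection and termination strategies on the AgentGroupChat to control how the agents take turns and when the conversation ends.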
From the Semantic Kernel team, we’d like to thank Riccardo Chiodaroli, Samer El Housseini, Daniel Labbe and Fabrizio Ruocco for their time and all of their great work. Please reach out if you have any questions or feedback through our Semantic Kernel GitHub Discussions channel. We look forward to hearing from you!