Build 2024 Recap: Bridging the chasm between your ML and app devs

Matthew Bolanos

Last week, Semantic Kernel hit a huge milestone during Microsoft Build: both its Python and Java libraries reached V1.0 status, which guarantees no breaking changes to non-experimental features going forward. This is a big deal for our customers, who can now confidently build applications on top of our libraries without worrying about future compatibility issues.

More importantly, all three SDKs (Python, C#, and Java) now share the same core functionality. This is especially valuable for customers whose teams work in different languages: they can switch between SDKs and share code without compatibility concerns.

For most customers, development starts with AI researchers and data scientists building POCs in Python. When it comes time to deploy those models into production, enterprise app developers take over. This is where the V1.0 status of our libraries really shines: app developers can take the prompts and plugins built by AI researchers and deploy them into production using an enterprise-grade AI SDK.

To demonstrate just how easy it is to bridge the gap between AI researchers and app developers, we recorded the live demo we did at Microsoft Build.

In this demo, we show how an AI researcher can build an AI agent with a set of OpenAPI-defined plugins in Python, and then hand those exact same plugins to Java and .NET developers so they can recreate the same agent in a production environment.
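What makes this hand-off possible is that the plugins are described by OpenAPI documents rather than by language-specific code: the same spec file drives the plugin in Python, C#, and Java. As a rough, plain-Python illustration of why that is language-neutral (this is not the Semantic Kernel API, and the spec below is a made-up example), a plugin loader only needs to enumerate the operations declared in the document:

```python
import json

# A hypothetical OpenAPI document describing one plugin with two operations.
# In practice, this same spec file would be shared across Python, C#, and Java.
OPENAPI_SPEC = json.loads("""
{
  "openapi": "3.0.0",
  "info": {"title": "LightsPlugin", "version": "1.0"},
  "paths": {
    "/lights": {
      "get": {"operationId": "get_lights", "summary": "List all lights"}
    },
    "/lights/{id}": {
      "post": {"operationId": "set_light", "summary": "Turn a light on or off"}
    }
  }
}
""")

def list_operations(spec: dict) -> list[tuple[str, str, str]]:
    """Enumerate (operationId, HTTP method, path) triples from an OpenAPI spec."""
    ops = []
    for path, methods in spec.get("paths", {}).items():
        for method, details in methods.items():
            ops.append((details["operationId"], method.upper(), path))
    return ops

for op_id, method, path in list_operations(OPENAPI_SPEC):
    print(f"{op_id}: {method} {path}")
```

Because the agent's capabilities live entirely in the spec, each language's SDK can turn these operations into callable plugin functions without any code being ported by hand.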


Semantic Kernel Community Video

Thank you again to our amazing Semantic Kernel community for sending in your videos! If you missed the video at Microsoft Build, watch it here.


But that’s not all…

In addition to launching V1.0 of our Python and Java libraries, we also launched a host of additional features that make it even easier to build enterprise-grade AI applications. These include:

  • Logic Apps as plugins: Easily let your AI agents interact with existing systems and services in your organization using no-code Logic Apps.
  • OpenTelemetry semantic convention support: Integrate telemetry from your AI agents with your existing observability tools, such as .NET Aspire.
  • Hooks and filters: Control the lifecycle of your AI agents with hooks and filters so you can implement approvals, semantic caching, and more.
  • Azure Container Apps Dynamic Sessions: Recreate the Code Interpreter experience from the Assistants API in your own AI agents with plugins that can run their own isolated Python sessions.
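To make the hooks-and-filters bullet concrete, here is a minimal sketch of the general function-invocation filter pattern in plain Python. This is not Semantic Kernel's actual filter API; the names (`FilterContext`, `caching_filter`, `invoke`) are invented for illustration. The idea is that filters wrap each function call and can inspect, modify, or short-circuit it, which is how semantic caching and approval gates are typically built:

```python
from typing import Callable

class FilterContext:
    """Carries the function call's name, arguments, and (eventual) result."""
    def __init__(self, function_name: str, arguments: dict):
        self.function_name = function_name
        self.arguments = arguments
        self.result = None  # a filter may set this to skip the real call

def caching_filter(cache: dict):
    """A filter that serves repeated calls from a cache instead of re-invoking."""
    def filter_fn(context: FilterContext, next_step: Callable):
        key = (context.function_name, tuple(sorted(context.arguments.items())))
        if key in cache:
            context.result = cache[key]  # short-circuit: serve from cache
            return
        next_step(context)               # proceed to the next filter / the function
        cache[key] = context.result
    return filter_fn

def invoke(function: Callable, arguments: dict, filters: list):
    """Run a function through a chain of filters."""
    context = FilterContext(function.__name__, arguments)
    def terminal(ctx):
        ctx.result = function(**ctx.arguments)
    # Chain the filters so each one decides whether to call the next.
    pipeline = terminal
    for f in reversed(filters):
        pipeline = (lambda nxt, flt: lambda ctx: flt(ctx, nxt))(pipeline, f)
    pipeline(context)
    return context.result

calls = []
def get_weather(city):
    calls.append(city)
    return f"Sunny in {city}"

cache = {}
print(invoke(get_weather, {"city": "Seattle"}, [caching_filter(cache)]))
print(invoke(get_weather, {"city": "Seattle"}, [caching_filter(cache)]))
print(len(calls))  # prints 1: the second call was served from the cache
```

The same shape supports approval filters (pause and ask a human before `next_step`) or logging filters, simply by adding more entries to the filter list.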


On-Demand Microsoft Build Sessions

We recommend checking out these Microsoft Build sessions that featured Semantic Kernel to learn more about getting started and building production AI:


We’re excited to see what you build with these new features. If you have any questions or need help getting started, feel free to reach out to us on our GitHub discussions page.

