Customer Case Study: DataStax and Semantic Kernel

Sophia Lagerkrans-Pandey

Greg Stachnick

Today we’ll dive into a customer case study from DataStax, covering their recent press release and announcement about the DataStax and Microsoft collaboration on RAG capabilities in DataStax Astra DB. Thanks again to the DataStax team for their amazing partnership!

Microsoft and DataStax Simplify Building AI Agents with Legacy Apps and Data

In the ever-evolving landscape of artificial intelligence (AI) development, bridging the gap between legacy applications and cutting-edge AI technologies is a challenge for many enterprises. Companies often have hundreds or even thousands of existing applications that they want to bring into the AI world. Recognizing this challenge, Microsoft and DataStax have joined forces to simplify the process of building AI agents with legacy apps and data. Their latest partnership announcement streamlines AI development by enabling seamless integration of DataStax Astra DB with Microsoft’s Semantic Kernel.

Microsoft’s Semantic Kernel is an open-source SDK that helps solve this challenge by making it easy to build generative AI agents that can call existing code. We’re excited to announce the new integration of Semantic Kernel and DataStax Astra DB, which enables developers to build upon their current codebase more easily, vectorize their data, and build production-grade GenAI apps and AI agents that benefit from the relevance and precision provided by retrieval-augmented generation (RAG).

What’s so cool about Semantic Kernel – shared by DataStax

Semantic Kernel is a GenAI/RAG application and agent orchestration framework in Microsoft’s stack of AI copilots and models. In many ways, it’s similar to LangChain and LlamaIndex, but with more focus on enabling intelligent agents. Semantic Kernel provides capabilities for managing contextual conversations, including previous chats, prompt history, and ongoing conversations. It also offers planners for multi-step functions and connections (plug-ins) for third-party APIs, enabling RAG grounded in enterprise data (learn more about why RAG is critical to generating responses that aren’t only contextually accurate but also information-rich here).
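To make the grounding idea concrete, here is a framework-independent sketch of the retrieval step that RAG relies on: embed documents, embed the query, and rank documents by cosine similarity. The three-dimensional "embeddings" and document names below are toy stand-ins for what a real embedding model and enterprise corpus would provide; in a production app, Semantic Kernel and a vector database handle this at scale.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot product of the vectors divided by the
    # product of their magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" standing in for vectors an embedding model would produce.
documents = {
    "refund-policy": [0.9, 0.1, 0.0],
    "shipping-times": [0.1, 0.8, 0.2],
    "privacy-notice": [0.0, 0.2, 0.9],
}

def retrieve(query_embedding, top_k=1):
    # Rank all documents by similarity to the query and keep the top_k.
    ranked = sorted(
        documents.items(),
        key=lambda item: cosine_similarity(query_embedding, item[1]),
        reverse=True,
    )
    return [doc_id for doc_id, _ in ranked[:top_k]]

# A query embedding close to the "refund-policy" vector retrieves it first.
print(retrieve([0.8, 0.2, 0.1]))  # ['refund-policy']
```

The retrieved documents are then injected into the prompt so the model's answer is grounded in the enterprise data rather than the model's parametric knowledge alone.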

Another cool thing about Semantic Kernel is that prompts written for a Python version during app iteration can be used by the C# version for much faster execution at runtime. Semantic Kernel is also proven on Microsoft Azure for Copilot and has reference frameworks for developers to build their own scalable copilots with Azure.
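This portability works because Semantic Kernel prompts are plain text templates with `{{$variable}}` placeholders, rather than code tied to one SDK. As an illustrative (hypothetical) example, a summarization prompt authored while iterating in Python might look like the following, and the same template file could then be loaded unchanged by the C# version:

```
Summarize the following support ticket in one sentence.

Ticket: {{$input}}

Summary:
```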

Introducing the Astra DB Connector

DataStax has contributed the Astra DB connector in Python. This connector enables Astra DB to function as a vector database within Semantic Kernel. It’s a game-changer for developers building RAG applications that want to use Semantic Kernel’s unique framework features for contextual conversations or intelligent agents, or for those targeting the Microsoft AI and Azure ecosystem. The integration allows for the storage of embeddings and the performance of semantic searches with unprecedented ease.

By combining Semantic Kernel with Astra DB, developers can build powerful RAG applications with extended contextual conversation capabilities (such as managing chat and prompt histories) and multi-function or planner capabilities, on a globally scalable vector database proven to give more relevant and faster query responses.
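As a framework-neutral sketch of the contextual-conversation bookkeeping described above, the toy class below keeps a rolling chat history and folds the most recent turns into each new prompt. Semantic Kernel and Astra DB handle this at production scale with persistence and semantic search; this version only illustrates the idea, and all names here are illustrative.

```python
class ChatHistory:
    """Toy rolling chat history that folds recent turns into each prompt."""

    def __init__(self, max_turns=3):
        self.max_turns = max_turns   # how many past turns to keep in context
        self.turns = []              # list of (role, message) pairs

    def add(self, role, message):
        self.turns.append((role, message))

    def build_prompt(self, user_message):
        # Include only the most recent turns, mimicking a context window.
        recent = self.turns[-self.max_turns:]
        lines = [f"{role}: {msg}" for role, msg in recent]
        lines.append(f"user: {user_message}")
        return "\n".join(lines)

history = ChatHistory(max_turns=2)
history.add("user", "What is Astra DB?")
history.add("assistant", "A serverless vector database from DataStax.")
print(history.build_prompt("Does it integrate with Semantic Kernel?"))
```

A vector-backed memory improves on this sketch by retrieving past turns by semantic relevance rather than pure recency, which is exactly what registering Astra DB as the memory store enables.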

A performance booster for Python developers

While this release will benefit a broad swath of the GenAI developer community, it’s of particular interest to those who work in the Microsoft/Azure ecosystem. By integrating Astra DB directly into Semantic Kernel, developers can now leverage Astra DB as a data source in their existing applications, streamlining the development process and enhancing application performance.

To add Astra DB support to a Semantic Kernel application, simply import the module and register the memory store:

# import the Astra DB connector
import semantic_kernel as sk
from semantic_kernel.connectors.memory.astradb import AstraDBMemoryStore
from semantic_kernel.memory.semantic_text_memory import SemanticTextMemory

# create the Astra memory store (token, database id, region, keyspace,
# embedding dimension, and similarity metric come from your Astra DB setup)
store = AstraDBMemoryStore(astra_token, astra_db_id, astra_region, keyspace, 1536, "cosine")

# register the Astra memory store in Semantic Kernel memory
# (`kernel` is a previously configured sk.Kernel with an embedding service)
memory = SemanticTextMemory(storage=store, embeddings_generator=kernel.get_service("text_embedding"))


The integration of Semantic Kernel and Astra DB extends beyond technical enhancements, paving the way for a range of business use cases from personalized customer service to intelligent product recommendations and beyond. It’s not just about making development easier; it’s about enabling the creation of more intelligent, responsive, and personalized AI applications that can transform industries.

For more information about this collaboration, visit the following links from DataStax:

Please reach out if you have any questions or feedback through our Semantic Kernel GitHub Discussion Channel. We look forward to hearing from you! We would also love your support: if you’ve enjoyed using Semantic Kernel, give us a star on GitHub.

