January 14th, 2025

Understanding Semantic Kernel AI Connectors

AI Connectors in Semantic Kernel are components that facilitate communication between the Kernel’s core functionalities and various AI services. They abstract the intricate details of service-specific protocols, allowing developers to seamlessly interact with AI services for tasks like text generation, chat interactions, and more.


Using AI Connectors in Semantic Kernel

Developers use AI connectors to connect their applications to different AI services. The connectors manage the requests and responses, providing a streamlined way to leverage these services without having to handle the specific communication protocol each one requires.
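
For example, a built-in connector such as the OpenAI chat completion connector can be registered with the Kernel and used through the same chat-completion interface as any other service. The following is a minimal sketch of that flow; it assumes the OpenAI connector package is installed, that the OPENAI_API_KEY environment variable is set, and that the model name used here is available to your account:

import asyncio

from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion

async def main():
    kernel = Kernel()
    # The connector hides the OpenAI-specific protocol behind the common chat-completion interface.
    kernel.add_service(OpenAIChatCompletion(service_id="chat", ai_model_id="gpt-4o-mini"))

    # The Kernel routes the prompt to the registered chat-completion service.
    result = await kernel.invoke_prompt("Explain what an AI connector does in one sentence.")
    print(result)

asyncio.run(main())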


Creating Custom AI Connectors in Semantic Kernel

To create a custom AI connector in Semantic Kernel, one extends the provided base classes, such as ChatCompletionClientBase (which itself builds on AIServiceClientBase). Below is a guide and example for implementing a mock AI connector:


Step-by-Step Walkthrough

  1. Understand the Base Classes: The foundational classes ChatCompletionClientBase and AIServiceClientBase provide necessary methods and structures for creating chat-based AI connectors.

  2. Implement the Connector: Here’s a mock example showing how to build a connector without real service dependencies while remaining compatible with the framework’s Pydantic-based models:

from semantic_kernel.connectors.ai.chat_completion_client_base import ChatCompletionClientBase
from semantic_kernel.connectors.ai.prompt_execution_settings import PromptExecutionSettings
from semantic_kernel.contents.chat_history import ChatHistory
from semantic_kernel.contents.chat_message_content import ChatMessageContent
from semantic_kernel.contents.utils.author_role import AuthorRole

class MockAIChatCompletionService(ChatCompletionClientBase):
    def __init__(self, ai_model_id: str):
        super().__init__(ai_model_id=ai_model_id)

    async def _inner_get_chat_message_contents(
        self, chat_history: ChatHistory, settings: PromptExecutionSettings
    ) -> list[ChatMessageContent]:
        # Mock implementation: return a canned assistant message instead of calling a real service,
        # using the ChatMessageContent type the framework expects.
        return [ChatMessageContent(role=AuthorRole.ASSISTANT, content="Mock response based on your history.")]

    def service_url(self) -> str:
        return "http://mock-ai-service.com"

Usage Example

The following example demonstrates how to integrate and use the MockAIChatCompletionService in an application:

import asyncio

from semantic_kernel.connectors.ai.prompt_execution_settings import PromptExecutionSettings
from semantic_kernel.contents.chat_history import ChatHistory

async def main():
    # Build a chat history containing a single user message.
    chat_history = ChatHistory()
    chat_history.add_user_message("Hello")

    # No service-specific settings are needed for the mock connector.
    settings = PromptExecutionSettings()

    service = MockAIChatCompletionService(ai_model_id="mock-model")

    response = await service.get_chat_message_contents(chat_history, settings)
    print(response)

# Run the main function
asyncio.run(main())
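
If everything is wired correctly, running this script should print a single-element list containing the mock assistant message produced by the connector.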

Conclusion

By following this guide and understanding the base class functionality, developers can effectively create custom connectors within Semantic Kernel. This structured approach enhances integration with various AI services while ensuring alignment with the framework’s architectural expectations. Custom connectors offer flexibility, allowing developers to adjust implementations to specific service needs, such as additional logging, authentication, or protocol-specific behavior (see the sketch below). This foundation supports more complex, service-specific extensions, promoting robust and scalable AI service integration.
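
As one illustration of that flexibility, the mock connector from earlier could be wrapped with simple request/response logging. The class below is a hypothetical sketch (not part of the walkthrough above) and assumes MockAIChatCompletionService is in scope:

import logging

logger = logging.getLogger(__name__)

class LoggingMockAIChatCompletionService(MockAIChatCompletionService):
    # Hypothetical subclass that logs around the inner mock call.
    async def _inner_get_chat_message_contents(self, chat_history, settings):
        logger.info("Sending %d message(s) to %s", len(chat_history.messages), self.service_url())
        responses = await super()._inner_get_chat_message_contents(chat_history, settings)
        logger.info("Received %d message(s)", len(responses))
        return responses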

* This blog post was created entirely by AI (including the code snippets), using Semantic Kernel’s agent framework. Multiple agents worked together to create this content by referencing the source code of SK, while ensuring correctness and incorporating user feedback.
