Unit Testing with Semantic Kernel

Sophia Lagerkrans-Pandey

Dmytro Struk

Hi all,

Testing is an integral part of the software development process. Unit testing allows you to test your functionality in isolation. This usually means that instead of performing real work (e.g. sending an HTTP request to an LLM), the work is replaced with something that only simulates it and returns a predefined result, which is then used by the class under test. As Semantic Kernel developers, we want to make sure that it’s easy to write unit tests for classes that use the main library components – Kernel, Plugins and Functions.

This article shows how to test functionality in .NET applications that use Semantic Kernel. For demonstration purposes, we will use xUnit as the test framework and Moq as the mocking library.

Let’s take a common scenario, where there is a service that does some work, and a Kernel object is a dependency of that service:

public class MyService(Kernel kernel) 
{ 
    private readonly Kernel _kernel = kernel; 
 
    public async Task<FunctionResult> DoWorkAsync(KernelFunction function) 
        => await _kernel.InvokeAsync(function); 
 
    public async Task<FunctionResult> DoWorkAsync(string pluginName, string functionName) 
        => await _kernel.InvokeAsync(pluginName, functionName); 
 
    public async Task<FunctionResult> DoWorkAsync(string prompt) 
        => await _kernel.InvokePromptAsync(prompt); 
}

In real-world applications, such services will probably contain more logic, but for simplicity, our example service will call the available Kernel methods and return their results.

It’s always possible to create a wrapper around Kernel (similar to how MyService looks now) and use it for mocking and testing purposes, but this article focuses on testing Kernel and functions directly.
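
As a rough sketch, such a wrapper could look like the following (the IKernelWrapper abstraction shown here is hypothetical and not part of Semantic Kernel; the service would then depend on the interface, which can be mocked directly in tests):

public interface IKernelWrapper
{
    Task<FunctionResult> InvokeAsync(KernelFunction function);
}

public class KernelWrapper(Kernel kernel) : IKernelWrapper
{
    // Delegates to the real Kernel; tests substitute a mock of IKernelWrapper instead.
    public Task<FunctionResult> InvokeAsync(KernelFunction function)
        => kernel.InvokeAsync(function);
}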

Kernel is a container that holds services and plugins, which means that we don’t want to mock Kernel directly. Instead, we want to mock its services and plugins. As shown in the MyService class, Kernel provides different ways to execute a function, and we will cover them in more detail. The next examples include:

  1. Plugins – mocking calls to plugin functions. 
  2. Services – mocking calls to an LLM. 
  3. Streaming – additional considerations when using streaming with both plugin functions and LLM.

InvokeAsync with KernelFunction 

When we call kernel.InvokeAsync(function), the InvokeAsync method performs function.InvokeAsync(...), which means that we are interested in mocking KernelFunction behaviour.

At the moment, a KernelFunction can be either a KernelFunctionFromMethod, which wraps a method in C# code, or a KernelFunctionFromPrompt, which is constructed from a prompt template and will perform a request to an LLM. The difference between these two types of KernelFunction defines how they need to be mocked.
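
To make the distinction concrete, here is how each kind of function is typically created (the prompt text is illustrative):

// A method-based function: wraps a .NET delegate, no LLM involved.
var methodFunction = KernelFunctionFactory.CreateFromMethod(() => "Hello from C#");

// A prompt-based function: rendered from a prompt template and sent to an LLM at invocation time.
var promptFunction = KernelFunctionFactory.CreateFromPrompt("Say hello to the user.");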

In the case when Kernel needs to invoke a KernelFunctionFromMethod, we can simply define a method in our test that returns the desired result. For this purpose, we can use KernelFunctionFactory, a static helper class for creating different types of functions. The KernelFunctionFactory.CreateFromMethod method accepts a delegate, so we can define our function with the desired result as a lambda expression or a local function, or pass an existing method as a parameter. For simplicity, our tests will use a lambda expression, but the snippet below illustrates the other form as well.
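
For illustration, the same mocked result could be provided in either of these forms:

// As a lambda expression
var fromLambda = KernelFunctionFactory.CreateFromMethod(() => "Function result value");

// As a local function (an existing static or instance method can be passed the same way)
string LocalFunction() => "Function result value";
var fromLocalFunction = KernelFunctionFactory.CreateFromMethod(LocalFunction);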

Our first method in MyService accepts a KernelFunction, so we will create our test function and pass it to the service, where it will be invoked. Here is what the test will look like:

[Fact] 
public async Task DoWorkWithFunction()
{ 
    // Arrange 
    var function = KernelFunctionFactory.CreateFromMethod(() => "Function result value"); 
    var kernel = new Kernel(); 
 
    var service = new MyService(kernel); 
 
    // Act 
    var result = await service.DoWorkAsync(function); 
 
    // Assert 
    Assert.Equal("Function result value", result.ToString()); 
}

InvokeAsync with Function and Plugin names 

Kernel holds a collection of plugins, so anytime we want to change Kernel behaviour in our tests, we need to pass it with registered plugins that don’t perform real work and contain mock behaviour instead.

Our second method in MyService doesn’t use KernelFunction; it accepts plugin and function names. This means that when we initialize our Kernel, we need to register a plugin with test behaviour that will satisfy our test case.

To create a test plugin, we can use the static KernelPluginFactory class. This class contains methods like KernelPluginFactory.CreateFromType<T> and KernelPluginFactory.CreateFromObject, which are useful when our KernelFunction is defined in a separate class. These methods use reflection to get metadata about the plugin class and create a KernelPlugin instance.
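
For example, if the test behaviour lives in a separate class, the plugin could be created like this (MyTestPlugin is an illustrative class defined for the test):

public class MyTestPlugin
{
    [KernelFunction("MyFunction")]
    public string GetResult() => "Function result value";
}

// Reflection-based plugin creation from a type or an existing instance
var plugin = KernelPluginFactory.CreateFromType<MyTestPlugin>("MyPlugin");
var pluginFromObject = KernelPluginFactory.CreateFromObject(new MyTestPlugin(), "MyPlugin");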

For testing scenarios, there is a simpler way to initialize a plugin – the KernelPluginFactory.CreateFromFunctions method. It accepts a plugin name and an enumeration of KernelFunction instances. From the previous example, we already know how to create a function with mocked behaviour using KernelFunctionFactory, which means that we can combine the two approaches, resulting in the following test:

[Fact] 
public async Task DoWorkWithFunctionName() 
{ 
    // Arrange 
    var function = KernelFunctionFactory.CreateFromMethod(() => "Function result value", "MyFunction"); 
    var plugin = KernelPluginFactory.CreateFromFunctions("MyPlugin", [function]); 
    var plugins = new KernelPluginCollection([plugin]); 
 
    var kernel = new Kernel(plugins: plugins); 
 
    var service = new MyService(kernel); 
 
    // Act 
    var result = await service.DoWorkAsync("MyPlugin", "MyFunction"); 
 
    // Assert 
    Assert.Equal("Function result value", result.ToString()); 
} 

As soon as we create our KernelPlugin, we can initialize Kernel and pass an instance of KernelPluginCollection, which will hold our test plugin with mocked behaviour. Then we inject the kernel instance into the service, and it will use the test function to return the configured result.

InvokePromptAsync 

In the previous examples, we mocked KernelFunctionFromMethod behaviour, but Kernel can also hold KernelFunctionFromPrompt functions. As already described, these functions perform requests to an LLM, which means that most probably an HttpClient will be involved, and in this case more mocking logic will be required.

When we call kernel.InvokePromptAsync(...), we just call an extension method that creates an instance of KernelFunctionFromPrompt and immediately invokes it.
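
In other words, the call is roughly equivalent to creating a prompt function manually and invoking it (simplified):

// Roughly what kernel.InvokePromptAsync("Prompt to AI") does internally
var function = KernelFunctionFactory.CreateFromPrompt("Prompt to AI");
var result = await kernel.InvokeAsync(function);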

During invocation, KernelFunctionFromPrompt renders the prompt using the configured prompt template and passes it to a service responsible for sending the request to the LLM. KernelFunctionFromPrompt will try to find an instance of IChatCompletionService or ITextGenerationService to perform the request, which means that we can mock IChatCompletionService and return the desired result. We also need to register IChatCompletionService in Kernel, because that’s where our function will look for it.
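
If your prompt function resolves ITextGenerationService instead, the mock looks very similar – a sketch with Moq, using the GetTextContentsAsync method that text generation services expose:

var mockTextGeneration = new Mock<ITextGenerationService>();
mockTextGeneration
    .Setup(x => x.GetTextContentsAsync(
        It.IsAny<string>(),
        It.IsAny<PromptExecutionSettings>(),
        It.IsAny<Kernel>(),
        It.IsAny<CancellationToken>()))
    .ReturnsAsync([new TextContent("AI response")]);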

To define test behaviour for IChatCompletionService, it’s possible to implement that interface with a fake class, but an easier way is to use the mocking library of your choice. As soon as we have the mocked behaviour in place, we need to register it in Kernel and invoke our prompt function. Here is an example of how the test could look:

[Fact] 
public async Task DoWorkWithPrompt() 
{ 
    // Arrange 
    var mockChatCompletion = new Mock<IChatCompletionService>(); 
    mockChatCompletion 
        .Setup(x => x.GetChatMessageContentsAsync( 
            It.IsAny<ChatHistory>(), 
            It.IsAny<PromptExecutionSettings>(), 
            It.IsAny<Kernel>(), 
            It.IsAny<CancellationToken>())) 
        .ReturnsAsync([new ChatMessageContent(AuthorRole.Assistant, "AI response")]); 
 
    var kernelBuilder = Kernel.CreateBuilder(); 
    kernelBuilder.Services.AddSingleton(mockChatCompletion.Object); 
 
    var kernel = kernelBuilder.Build(); 
    var service = new MyService(kernel); 
 
    // Act 
    var result = await service.DoWorkAsync("Prompt to AI"); 
 
    // Assert 
    Assert.Equal("AI response", result.ToString()); 
} 

As an alternative, if the AI connector accepts an HttpClient as a parameter, it’s also possible to define test behaviour by mocking HttpClient.
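
A minimal sketch of that approach, assuming the connector exposes an httpClient parameter (as the OpenAI connector’s AddOpenAIChatCompletion does). FakeHttpMessageHandler is a hypothetical test helper, and the response content must match the wire format the connector expects:

public class FakeHttpMessageHandler(HttpResponseMessage response) : HttpMessageHandler
{
    // Returns the predefined response instead of calling a real endpoint.
    protected override Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
        => Task.FromResult(response);
}

// Usage: pass an HttpClient built on the fake handler to the AI connector.
using var response = new HttpResponseMessage(HttpStatusCode.OK)
{
    Content = new StringContent(/* predefined LLM response body in the connector's format */ "{}")
};
using var httpClient = new HttpClient(new FakeHttpMessageHandler(response));

var kernelBuilder = Kernel.CreateBuilder();
kernelBuilder.AddOpenAIChatCompletion(modelId: "model-id", apiKey: "fake-key", httpClient: httpClient);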

Testing streaming scenarios 

In the case of streaming, the tests will look very similar to the previous examples, with small differences in how we define our test response and mock IChatCompletionService.

Let’s rewrite MyService to call Kernel streaming methods: 

public class MyService(Kernel kernel) 
{ 
    private readonly Kernel _kernel = kernel; 
 
    public IAsyncEnumerable<T> DoWorkStreaming<T>(KernelFunction function) 
        => _kernel.InvokeStreamingAsync<T>(function); 
 
    public IAsyncEnumerable<T> DoWorkStreaming<T>(string pluginName, string functionName) 
        => _kernel.InvokeStreamingAsync<T>(pluginName, functionName); 
 
    public IAsyncEnumerable<T> DoWorkStreaming<T>(string prompt) 
        => _kernel.InvokePromptStreamingAsync<T>(prompt); 
} 

Based on this service, the tests will look like the following:

[Fact] 
public async Task DoWorkStreamingWithFunction() 
{ 
    // Arrange 
    async IAsyncEnumerable<int> TestMethod() 
    { 
        yield return 1; 
        yield return 2; 
        yield return 3; 
    } 
 
    var function = KernelFunctionFactory.CreateFromMethod(TestMethod); 
 
    var kernel = new Kernel(); 
 
    var service = new MyService(kernel); 
 
    // Act 
    var result = new List<int>(); 
    await foreach (var item in service.DoWorkStreaming<int>(function)) 
    { 
        result.Add(item); 
    } 
 
    // Assert 
    Assert.Equal([1, 2, 3], result); 
} 
 
[Fact] 
public async Task DoWorkStreamingWithFunctionName()
{ 
    // Arrange 
    async IAsyncEnumerable<int> TestMethod() 
    { 
        yield return 1; 
        yield return 2; 
        yield return 3; 
    } 
 
    var function = KernelFunctionFactory.CreateFromMethod(TestMethod, "MyFunction");
    var plugin = KernelPluginFactory.CreateFromFunctions("MyPlugin", [function]); 
    var plugins = new KernelPluginCollection([plugin]); 
 
    var kernel = new Kernel(plugins: plugins); 
 
    var service = new MyService(kernel); 
 
    // Act 
    var result = new List<int>(); 
    await foreach (var item in service.DoWorkStreaming<int>("MyPlugin", "MyFunction"))
    { 
        result.Add(item); 
    } 
 
    // Assert 
    Assert.Equal([1, 2, 3], result); 
} 
 
[Fact] 
public async Task DoWorkStreamingWithPrompt()
{ 
    // Arrange 
    var mockChatCompletion = new Mock<IChatCompletionService>(); 
    mockChatCompletion 
        .Setup(x => x.GetStreamingChatMessageContentsAsync( 
            It.IsAny<ChatHistory>(), 
            It.IsAny<PromptExecutionSettings>(), 
            It.IsAny<Kernel>(), 
            It.IsAny<CancellationToken>())) 
        .Returns(new List<StreamingChatMessageContent>() { new(AuthorRole.Assistant, "AI response") }.ToAsyncEnumerable()); 
 
    var kernelBuilder = Kernel.CreateBuilder(); 
    kernelBuilder.Services.AddSingleton(mockChatCompletion.Object); 
 
    var kernel = kernelBuilder.Build(); 
    var service = new MyService(kernel); 
 
    // Act 
    var result = new List<string>(); 
    await foreach (var item in service.DoWorkStreaming<string>("Prompt to AI")) 
    { 
        result.Add(item); 
    } 
 
    // Assert 
    Assert.Equal(["AI response"], result); 
}

In the case of streaming, we still use KernelFunctionFactory and KernelPluginFactory as helper classes to initialize test plugins and functions.

The only difference is the result of the mocked functions – instead of returning a value directly, we return IAsyncEnumerable<T>. We also mock IChatCompletionService differently – instead of mocking the GetChatMessageContentsAsync method, we mock GetStreamingChatMessageContentsAsync. We also use the ToAsyncEnumerable method from the System.Linq.Async package to convert our result list to IAsyncEnumerable more easily.
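
If you prefer not to take the System.Linq.Async dependency, a small local async iterator achieves the same result (illustrative alternative):

// Local async iterator as an alternative to ToAsyncEnumerable
async IAsyncEnumerable<StreamingChatMessageContent> GetStreamingResponseAsync()
{
    await Task.CompletedTask; // keeps the compiler happy about the async modifier
    yield return new(AuthorRole.Assistant, "AI response");
}

mockChatCompletion
    .Setup(x => x.GetStreamingChatMessageContentsAsync(
        It.IsAny<ChatHistory>(),
        It.IsAny<PromptExecutionSettings>(),
        It.IsAny<Kernel>(),
        It.IsAny<CancellationToken>()))
    .Returns(GetStreamingResponseAsync());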

Summary 

The provided examples show how to write unit tests for services that use Kernel as a dependency. Although these examples focus on unit testing, similar approaches can be used for integration testing.

We’re always interested in hearing from you. If you have feedback, questions, or want to discuss further, feel free to reach out to us and the community on the discussion boards on GitHub! We would also love your support: if you’ve enjoyed using Semantic Kernel, give us a star on GitHub.
