It’s important to understand how an application behaves and to be able to override that behavior at runtime based on certain conditions. For example, we don’t want to send a malicious prompt to the LLM, and we don’t want to expose more information than necessary to end users.
A couple of months ago, we added the ability to handle such scenarios in Semantic Kernel using Filters. This feature is an improvement over the previous implementation based on event handlers, and today we want to introduce even more improvements based on feedback we received from SK users!
Let’s start with the current version of Filters to understand the existing issues. After that, we will look at how the new filters resolve those issues and walk through some usage examples.
Overview of current version
Here is an example of a function filter, which is executed before and after function invocation:
public class MyFilter : IFunctionFilter
{
    public void OnFunctionInvoking(FunctionInvokingContext context)
    {
        // Method which is executed before function invocation.
    }

    public void OnFunctionInvoked(FunctionInvokedContext context)
    {
        // Method which is executed after function invocation.
    }
}
First, the current IFunctionFilter interface does not support asynchronous methods. This is important, because a filter should be able to perform additional asynchronous operations, such as calling another kernel function, making a request to a database, or caching an LLM result.
Another limitation is that it’s not possible to handle an exception that occurred during function execution and override the result. This would be especially useful during automatic function invocation: when the LLM wants to execute several functions and one of them throws an exception, it should be possible to handle it and return some default value to the LLM instead.
While it’s good to have separate methods for each function invocation event (as in the IFunctionFilter interface), this approach has a disadvantage: the methods are not connected to each other, so any shared state has to be stored at class level. This is not necessarily a bad thing, but consider an example where we want to measure how long our function executes, starting the measurement in the OnFunctionInvoking method and stopping it (and sending the result to a telemetry tool) in the OnFunctionInvoked method. In this case, we are forced to keep a System.Diagnostics.Stopwatch instance at class level, which is not a common pattern.
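To make the issue concrete, here is a minimal sketch of such a measurement filter built on the old interface (the FunctionTimingFilter name and the Console-based reporting are just placeholders): the stopwatch has to live as a class-level field so that both methods can see it.
public class FunctionTimingFilter : IFunctionFilter
{
    // Shared state has to be stored at class level, because the two methods
    // have no other way to pass data between each other.
    private readonly System.Diagnostics.Stopwatch _stopwatch = new();

    public void OnFunctionInvoking(FunctionInvokingContext context)
    {
        this._stopwatch.Restart();
    }

    public void OnFunctionInvoked(FunctionInvokedContext context)
    {
        this._stopwatch.Stop();

        // Example: report the measurement to a telemetry tool of choice.
        Console.WriteLine($"Function executed in {this._stopwatch.ElapsedMilliseconds} ms");
    }
}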
New version
We are excited to announce that the new version of Filters resolves the problems described above.
Existing filters were renamed to use more specific names. The new naming works better with the new type of filter, which we are going to present later in this article. The new names for existing filters are the following:
• IFunctionFilter -> IFunctionInvocationFilter
• IPromptFilter -> IPromptRenderFilter
Also, the interface for function and prompt filters was changed: instead of two separate methods, there is now only one, which makes the filters easier to implement.
Function invocation filter
Here is an example of a function invocation filter:
public class MyFilter : IFunctionInvocationFilter
{
    public async Task OnFunctionInvocationAsync(FunctionInvocationContext context, Func<FunctionInvocationContext, Task> next)
    {
        // Perform some actions before function invocation
        await next(context);
        // Perform some actions after function invocation
    }
}
The method is asynchronous, which makes it easy to call other asynchronous operations using the async/await pattern.
Together with the context, there is also a next delegate, which executes the next filter in the pipeline (in case there are multiple filters registered) or the function itself. If the next delegate is not invoked, the subsequent filters and the function won’t be invoked either. This provides more control, and it is useful when there are reasons to avoid executing the function at all (e.g. a malicious prompt or suspicious function arguments).
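As an example, here is a minimal sketch of a filter that skips the function when an argument looks suspicious and, because the whole flow lives in a single method, keeps its timing state in a local variable instead of a class-level field (the "input" argument name and the validation rule are hypothetical):
public class GuardAndTimingFilter : IFunctionInvocationFilter
{
    public async Task OnFunctionInvocationAsync(FunctionInvocationContext context, Func<FunctionInvocationContext, Task> next)
    {
        // Example: skip function execution for suspicious input by not calling next.
        if (context.Arguments.TryGetValue("input", out var input) &&
            input is string text &&
            text.Contains("ignore previous instructions", StringComparison.OrdinalIgnoreCase))
        {
            context.Result = new FunctionResult(context.Result, "Request was blocked by a filter.");
            return;
        }

        // Because there is a single method, state such as a stopwatch can stay local.
        var stopwatch = System.Diagnostics.Stopwatch.StartNew();

        await next(context);

        stopwatch.Stop();
        Console.WriteLine($"{context.Function.Name} took {stopwatch.ElapsedMilliseconds} ms");
    }
}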
Another benefit of the next delegate is exception handling. With this approach, it’s possible to handle exceptions in a .NET-friendly way using a try/catch block:
public class ExceptionHandlingFilterExample(ILogger logger) : IFunctionInvocationFilter
{
    private readonly ILogger _logger = logger;

    public async Task OnFunctionInvocationAsync(FunctionInvocationContext context, Func<FunctionInvocationContext, Task> next)
    {
        try
        {
            await next(context);
        }
        catch (Exception exception)
        {
            this._logger.LogError(exception, "Something went wrong during function invocation");

            // Example: override function result value
            context.Result = new FunctionResult(context.Result, "Friendly message instead of exception");

            // Example: Rethrow another type of exception if needed
            // throw new InvalidOperationException("New exception");
        }
    }
}
The same set of features is available for streaming scenarios. Here is an example of how to override a function’s streaming result using IFunctionInvocationFilter:
public class StreamingFilterExample : IFunctionInvocationFilter
{
    public async Task OnFunctionInvocationAsync(FunctionInvocationContext context, Func<FunctionInvocationContext, Task> next)
    {
        await next(context);

        // In a streaming scenario, an async enumerable is available in the context result object.
        // To override the data: get the async enumerable from the context result, override the data,
        // and set a new async enumerable in the context result.
        var enumerable = context.Result.GetValue<IAsyncEnumerable<int>>();
        context.Result = new FunctionResult(context.Result, OverrideStreamingDataAsync(enumerable!));
    }

    private async IAsyncEnumerable<int> OverrideStreamingDataAsync(IAsyncEnumerable<int> data)
    {
        await foreach (var item in data)
        {
            // Example: override streaming data
            yield return item * 2;
        }
    }
}
Prompt render filter
Prompt render filters have a similar signature:
public class PromptFilterExample : IPromptRenderFilter
{
    public async Task OnPromptRenderAsync(PromptRenderContext context, Func<PromptRenderContext, Task> next)
    {
        // Example: get function information
        var functionName = context.Function.Name;

        await next(context);

        // Example: override rendered prompt before sending it to AI
        context.RenderedPrompt = "Safe prompt";
    }
}
This filter is executed before the prompt rendering operation, and the next delegate executes the next prompt filters in the pipeline or the prompt rendering operation itself. After the next delegate has executed, it’s possible to observe the rendered prompt and override it, for example to add even more information (e.g. in RAG scenarios) or to remove sensitive information from it.
Auto function invocation filter
This is a new type of filter for the automatic function invocation scenario (also known as function calling).
This filter is similar to IFunctionInvocationFilter, but it is executed in a different scope, which has more information about the execution. This means that the context model also contains more information, including:
- Function name and metadata.
- Chat history.
- List of all functions that should be executed.
- Request sequence index – identifies how many requests to the LLM we have already performed.
- Function sequence index – identifies how many functions we have already invoked as part of a single request.
- Function count – total number of functions to be executed as part of a single request.
Here is a full overview of the API that IAutoFunctionInvocationFilter provides:
public class AutoFunctionInvocationFilter(ILogger logger) : IAutoFunctionInvocationFilter
{
    private readonly ILogger _logger = logger;

    public async Task OnAutoFunctionInvocationAsync(AutoFunctionInvocationContext context, Func<AutoFunctionInvocationContext, Task> next)
    {
        // Example: get function information
        var functionName = context.Function.Name;

        // Example: get chat history
        var chatHistory = context.ChatHistory;

        // Example: get information about all functions which will be invoked
        var functionCalls = FunctionCallContent.GetFunctionCalls(context.ChatHistory.Last());

        // Example: get request sequence index
        this._logger.LogDebug("Request sequence index: {RequestSequenceIndex}", context.RequestSequenceIndex);

        // Example: get function sequence index
        this._logger.LogDebug("Function sequence index: {FunctionSequenceIndex}", context.FunctionSequenceIndex);

        // Example: get total number of functions which will be called
        this._logger.LogDebug("Total number of functions: {FunctionCount}", context.FunctionCount);

        // Calling the next filter in the pipeline or the function itself.
        // By skipping this call, the next filters and the function won't be invoked,
        // and the function calling loop will proceed to the next function.
        await next(context);

        // Example: get function result
        var result = context.Result;

        // Example: override function result value
        context.Result = new FunctionResult(context.Result, "Result from auto function invocation filter");

        // Example: terminate the automatic function invocation process, so that
        // remaining functions won't be invoked and the result is returned to the caller.
        context.Terminate = true;
    }
}
Summary
The provided examples show how to use function, prompt, and auto function invocation filters. With the new design, it should be possible to get more observability and have more control over function execution.
We’re always interested in hearing from you. If you have feedback or questions, or want to discuss further, feel free to reach out to us and the community on the discussion boards on GitHub! We would also love your support: if you’ve enjoyed using Semantic Kernel, give us a star on GitHub.