{"id":3078,"date":"2024-06-13T12:26:01","date_gmt":"2024-06-13T19:26:01","guid":{"rendered":"https:\/\/devblogs.microsoft.com\/azure-sdk\/?p=3078"},"modified":"2024-06-13T12:31:21","modified_gmt":"2024-06-13T19:31:21","slug":"announcing-the-public-preview-of-the-microsoft-ai-chat-protocol-library-for-javascript","status":"publish","type":"post","link":"https:\/\/devblogs.microsoft.com\/azure-sdk\/announcing-the-public-preview-of-the-microsoft-ai-chat-protocol-library-for-javascript\/","title":{"rendered":"Announcing the public preview of the Microsoft AI Chat Protocol library for JavaScript"},"content":{"rendered":"<p>We&#8217;re excited to announce the public preview of the Microsoft AI Chat Protocol library for building AI-powered frontends in JavaScript. The library and corresponding API specification are available on <a href=\"https:\/\/aka.ms\/aichat\">GitHub<\/a>.<\/p>\n<p>AI stands at the forefront of the tech industry&#8217;s latest innovations, causing a rapid spike in the number of technologies available for building AI applications. These technologies range from different Large Language Models (LLMs) and orchestration frameworks to evaluation techniques and design patterns for AI infrastructure such as Retrieval-Augmented Generation (RAG). While these solutions have their place in building a robust AI backend to serve users and handle complex tasks, we noticed that developers don&#8217;t get the same level of guidance or tooling when building applications on the frontend.<\/p>\n<p>The following diagram details these two separate workstreams for developers. It&#8217;s intended as a high-level example of how AI chat applications are designed today; any implementation-specific details are intentionally abstracted away to clearly identify the two different development workstreams.<\/p>\n<p>One side is focused on client-side AI service consumption on the frontend in the browser, and the other is focused on the AI backend. 
In between the frontend and AI backend, there&#8217;s a &#8220;middle tier&#8221; endpoint that the frontend can call to access the AI backend. The diagram uses an Azure OpenAI Service-hosted model with LangChain orchestration as an example of what an AI backend can look like.<\/p>\n<p><a href=\"https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-content\/uploads\/sites\/58\/2024\/06\/05-30-24-overview.png\"><img decoding=\"async\" class=\"alignnone wp-image-3066 size-full\" src=\"https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-content\/uploads\/sites\/58\/2024\/06\/05-30-24-overview.png\" alt=\"High-level overview of AI Chat Apps\" width=\"1809\" height=\"1008\" srcset=\"https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-content\/uploads\/sites\/58\/2024\/06\/05-30-24-overview.png 1809w, https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-content\/uploads\/sites\/58\/2024\/06\/05-30-24-overview-300x167.png 300w, https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-content\/uploads\/sites\/58\/2024\/06\/05-30-24-overview-1024x571.png 1024w, https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-content\/uploads\/sites\/58\/2024\/06\/05-30-24-overview-768x428.png 768w, https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-content\/uploads\/sites\/58\/2024\/06\/05-30-24-overview-1536x856.png 1536w\" sizes=\"(max-width: 1809px) 100vw, 1809px\" \/><\/a><\/p>\n<p>If this middle-tier endpoint adheres to the <a href=\"https:\/\/aka.ms\/chatprotocol\">AI Chat Protocol API specification<\/a>, the frontend can use the library to get all the benefits you might be familiar with in the Azure SDK client libraries via Azure Core. These benefits include authentication, retries, HTTP pipeline management, and most importantly for this library, streaming responses, which can update your interfaces in real time. 
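<\/p>
<p>For instance, a client that authenticates against a protected middle tier might be constructed as in this sketch; the credential type and IDs below are hypothetical placeholders, with <code>InteractiveBrowserCredential<\/code> coming from the <code>@azure\/identity<\/code> package:<\/p>
<pre><code class=\"language-typescript\">import { AIChatProtocolClient } from \"@microsoft\/ai-chat-protocol\";\r\nimport { InteractiveBrowserCredential } from \"@azure\/identity\";\r\n\r\n\/\/ Hypothetical placeholders; any TokenCredential implementation works here.\r\nconst credential = new InteractiveBrowserCredential({\r\n    clientId: \"&lt;your-client-id&gt;\",\r\n    tenantId: \"&lt;your-tenant-id&gt;\",\r\n});\r\n\r\nconst client = new AIChatProtocolClient(\"\/api\/chat\", credential);<\/code><\/pre>
<p>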
The consistent API surface enables the AI backend to be composed of different models and orchestration tools.<\/p>\n<p><a href=\"https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-content\/uploads\/sites\/58\/2024\/06\/05-30-24-api-example.png\"><img decoding=\"async\" class=\"alignnone size-full wp-image-3067\" src=\"https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-content\/uploads\/sites\/58\/2024\/06\/05-30-24-api-example.png\" alt=\"With the AI Chat Protocol\" width=\"1766\" height=\"970\" srcset=\"https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-content\/uploads\/sites\/58\/2024\/06\/05-30-24-api-example.png 1766w, https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-content\/uploads\/sites\/58\/2024\/06\/05-30-24-api-example-300x165.png 300w, https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-content\/uploads\/sites\/58\/2024\/06\/05-30-24-api-example-1024x562.png 1024w, https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-content\/uploads\/sites\/58\/2024\/06\/05-30-24-api-example-768x422.png 768w, https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-content\/uploads\/sites\/58\/2024\/06\/05-30-24-api-example-1536x844.png 1536w\" sizes=\"(max-width: 1766px) 100vw, 1766px\" \/><\/a><\/p>\n<p>Importantly, the AI Chat Protocol API specification allows for standardization of the AI backend consumption process. That means as long as the middle-tier endpoint adheres to this specification, you can consume any AI backend in a consistent and robust way. The specification allows for flexibility around the models, orchestration tools, and architectural design patterns used on the AI backend. Additionally, this API standardization allows for a consistent way of performing evaluations on different AI backends due to a unified consumption pattern. 
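<\/p>
<p>As a loose sketch of that flexibility, pointing the same client code at a different protocol-compliant endpoint is a one-line change. The endpoint paths below are hypothetical placeholders:<\/p>
<pre><code class=\"language-typescript\">import { AIChatProtocolClient } from \"@microsoft\/ai-chat-protocol\";\r\n\r\n\/\/ Hypothetical endpoints; each could front a different model or orchestrator,\r\n\/\/ as long as both implement the AI Chat Protocol API specification.\r\nconst openAIClient = new AIChatProtocolClient(\"\/api\/chat-openai\");\r\nconst langChainClient = new AIChatProtocolClient(\"\/api\/chat-langchain\");\r\n\r\n\/\/ The same call shape works against either backend.\r\nconst completion = await openAIClient.getCompletion([\r\n    { role: \"user\", content: \"Hello World!\" },\r\n]);<\/code><\/pre>
<p>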
With consistent consumption and evaluations, developers can spend more time building applications and fine-tuning their AI backends for customer value instead of focusing heavily on their application architecture.<\/p>\n<h2>Library Fundamentals<\/h2>\n<p>As mentioned before, because the library is built on a similar foundation to Azure Core, the process of using it should feel familiar to Azure SDK customers.<\/p>\n<h3>Installation<\/h3>\n<p>The AI Chat Protocol library is available on <a href=\"https:\/\/www.npmjs.com\/package\/@microsoft\/ai-chat-protocol\">npm<\/a> for you to download.<\/p>\n<pre><code class=\"language-bash\">npm i @microsoft\/ai-chat-protocol<\/code><\/pre>\n<h3>Usage<\/h3>\n<p>Once you have the library installed, you can start by creating a client object, just as with other Azure SDK libraries. When instantiating the client, pass in the required endpoint, which should adhere to the AI Chat Protocol API specification. If you&#8217;re using any authentication, you can also pass in your <a href=\"https:\/\/learn.microsoft.com\/javascript\/api\/@azure\/core-auth\/tokencredential?view=azure-node-latest\">TokenCredential<\/a> here.<\/p>\n<pre><code class=\"language-typescript\">import {\r\n    AIChatMessage,\r\n    AIChatProtocolClient,\r\n} from \"@microsoft\/ai-chat-protocol\";\r\n\r\nconst client = new AIChatProtocolClient(\"\/api\/chat\");<\/code><\/pre>\n<p>Once you have a client in your frontend application code, you can send a message to your middle-tier endpoint and receive output via either the <code>getCompletion<\/code> method for a synchronous response or the <code>getStreamedCompletion<\/code> method for a streamed response to your frontend. When using the <code>getStreamedCompletion<\/code> method, the response can include information about the role of the sender, the session state, or the content of the message. 
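<\/p>
<p>For a non-streamed call, the following sketch assumes the result of <code>getCompletion<\/code> exposes the final message directly; check the specification for the authoritative response shape:<\/p>
<pre><code class=\"language-typescript\">const message: AIChatMessage = {\r\n    role: \"user\",\r\n    content: \"Hello World!\",\r\n};\r\n\r\n\/\/ Waits for the complete response instead of consuming it as a stream.\r\nconst completion = await client.getCompletion([message]);\r\nconsole.log(completion.message.content);<\/code><\/pre>
<p>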
This example uses <a href=\"https:\/\/react.dev\/reference\/react\/hooks\">React Hooks<\/a> for handling <code>sessionState<\/code>, which is an <code>unknown<\/code> type passed in when calling <code>getStreamedCompletion<\/code> to give the model an updated state of the conversation.<\/p>\n<pre><code class=\"language-typescript\">import { useState } from \"react\";\r\n\r\nconst [sessionState, setSessionState] = useState&lt;unknown&gt;(undefined);\r\n\r\nconst message: AIChatMessage = {\r\n    role: \"user\",\r\n    content: \"Hello World!\",\r\n};\r\n\r\nconst result = await client.getStreamedCompletion([message], {\r\n    sessionState: sessionState,\r\n});\r\n\r\nfor await (const response of result) {\r\n    if (response.sessionState) {\r\n        \/\/ store the returned session state for the next request\r\n        setSessionState(response.sessionState);\r\n    }\r\n    if (response.delta.role) {\r\n        \/\/ do something with the information about the role\r\n    }\r\n    if (response.delta.content) {\r\n        \/\/ do something with the content of the message\r\n    }\r\n}<\/code><\/pre>\n<p>If your backend adheres to the AI Chat Protocol API specification, this starter code is all you need to get started with the AI Chat Protocol library.<\/p>\n<h2>Summary<\/h2>\n<p>The Microsoft AI Chat Protocol library for JavaScript\/TypeScript provides a way for developers to consistently and easily stream AI backend responses to their applications. The AI Chat Protocol API specification creates a consistent surface for AI backend consumption and evaluations, letting developers focus on providing value in their applications instead of getting their endpoints to play nicely with one another. The AI Chat Protocol library for JavaScript\/TypeScript is available now on npm and is open source on GitHub.<\/p>\n<p>Since the library is in public preview, any feedback would be appreciated. 
End-to-end samples that run on Azure are available in the GitHub repo.<\/p>\n<ul>\n<li><a href=\"https:\/\/aka.ms\/aichat\">AI Chat Protocol Library GitHub<\/a><\/li>\n<li><a href=\"https:\/\/aka.ms\/chatprotocol\">AI Chat Protocol API Specification<\/a><\/li>\n<\/ul>\n<p>If you&#8217;re interested in watching the AI Chat Protocol library session from Microsoft Build, check out this video!<\/p>\n<p><a href=\"https:\/\/www.youtube.com\/watch?v=8n8fSR_AFro\"><img decoding=\"async\" src=\"https:\/\/img.youtube.com\/vi\/8n8fSR_AFro\/hqdefault.jpg\" alt=\"Watch the video\" \/><\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>This post details the public preview release of the AI Chat Protocol library, allowing users to build robust, streamed AI frontends in JavaScript.<\/p>\n","protected":false},"author":98329,"featured_media":3070,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[1],"tags":[786,159,705,733],"class_list":["post-3078","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-azure-sdk","tag-ai","tag-javascript","tag-sdk","tag-typescript"],"acf":[],"blog_post_summary":"<p>This post details the public preview release of the AI Chat Protocol library, allowing users to build robust, streamed AI frontends in 
JavaScript.<\/p>\n","_links":{"self":[{"href":"https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-json\/wp\/v2\/posts\/3078","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-json\/wp\/v2\/users\/98329"}],"replies":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-json\/wp\/v2\/comments?post=3078"}],"version-history":[{"count":0,"href":"https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-json\/wp\/v2\/posts\/3078\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-json\/wp\/v2\/media\/3070"}],"wp:attachment":[{"href":"https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-json\/wp\/v2\/media?parent=3078"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-json\/wp\/v2\/categories?post=3078"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-json\/wp\/v2\/tags?post=3078"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}