Do more in Microsoft Mesh with data and AI

Rebecca Burke-Agüero

At the beginning of the year, Microsoft Mesh became generally available for powerful 3D collaboration in Teams and in the Mesh app on PC & Quest. Since then, Mesh developers have been working on enhancing their custom Mesh experiences by infusing them with live data and scaling them to more users. However, scaling Mesh events, catering to Mesh newcomers, and reasoning over large amounts of data to make informed business decisions are challenging tasks for any organization. That’s where AI can help. AI can democratize expertise by providing the right information, in the right place, at the right time, and by taking actions on your behalf. We know that organizations around the world are looking for ways to transform their businesses with AI, so we’ve started exploring what it means to unlock AI for developers building with the Mesh Toolkit.

In this blog we’ll share a first look at how we’re thinking about AI extensibility in Mesh, explain how you can get started with AI in Mesh today, and highlight an example of a partner who is already leveraging AI to elevate their custom Mesh experience.

AI is better in Mesh

The first AI scenario we’re unlocking with the Mesh Toolkit is support for custom AI guides (also known as chatbots or virtual assistants). Essentially, it’s the ability to take user input and pass it to your AI backend of choice along with your business context. Custom AI guides can address some of our customers’ biggest pain points, including:

  • Scaling Mesh experiences beyond the capacity of human-staffed events
  • Orienting Mesh newcomers to a specific space or an experience
  • Answering company or scenario-specific questions
  • Acting as a collaborator, reasoning over large amounts of data, providing additional perspectives, and democratizing expertise

You may already be familiar with some of these AI benefits from using Copilot across your M365 apps, but what makes AI in Mesh particularly powerful is Mesh’s inherent spatial context. When you place a custom AI guide in a specific location in your 3D environment, a user will automatically expect the AI interaction to be about the place or thing it is anchored to. Therefore, as a developer, you only need to pass minimal context to the AI to make it conversant in your scenario, and you can save time and money in the long run by leveraging smaller, faster, and cheaper large language models (LLMs). For example, you don’t need to feed your AI complicated signals like an avatar’s gaze or position to work out what someone is looking at when they interact with it. They’re looking at the spot where you placed it!

The ability to call an AI backend with 1) user input, 2) scenario-specific business context, and 3) the spatial context inherent to Mesh is all a developer needs in order to create a useful chatbot experience, so we’re excited to make this available to you now.
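
To make this concrete, here’s a minimal sketch of how those three ingredients could come together into a single chat request. The scenario text, location hint, and user question below are illustrative placeholders rather than values from any Mesh sample, and the call to the backend itself is covered in the Mesh 201 section that follows.

```csharp
using System;
using System.Text.Json;

// 1) user input, 2) scenario-specific business context, and 3) a short spatial hint
// standing in for where the AI guide is placed in your scene. All three values are
// made up for illustration.
var userInput       = "Which site has the strongest average wind speed?";
var businessContext = "You are a guide for a wind farm planning experience. " +
                      "Answer questions using the site data the developer provides.";
var spatialHint     = "You are standing next to the interactive map table.";

var requestBody = new
{
    messages = new object[]
    {
        new { role = "system", content = $"{businessContext}\n{spatialHint}" },
        new { role = "user",   content = userInput }
    },
    temperature = 0.2
};

// This payload is what gets sent to your AI backend of choice.
Console.WriteLine(JsonSerializer.Serialize(requestBody));
```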

Start adding AI with Mesh 201


Ready to get started building your own custom AI guide in Mesh? Last week at Build we shared an updated Mesh 201 developer sample that walks you through the basics. Building on the familiar wind farm scenario from Mesh 101, we first take a deep dive into adding live data to your scene via WebSlates and scripting – in this case, collecting and displaying information about potential wind farm locations. Then, using the cloud scripting and UI that we provide, we show you how to quickly and easily build a data-driven, AI-powered chatbot that reasons over this information and helps you decide where to build your next wind farm. Check out the full Build session below, or jump here for the part highlighting AI specifically.
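
As a rough illustration of the “live data” half of that walkthrough, the sketch below pulls candidate-site data from a web API so it can be displayed in the scene and later handed to the AI guide. The endpoint URL and JSON shape are hypothetical stand-ins, not the service used by the Mesh 201 sample.

```csharp
using System;
using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;

// Shape of one candidate site; the property names are assumptions for this sketch.
public record WindFarmSite(string Name, double AverageWindSpeed, double DepthMeters);

public static class LiveDataClient
{
    private static readonly HttpClient Http = new();

    public static async Task<WindFarmSite[]> GetSitesAsync()
    {
        // Hypothetical endpoint returning a JSON array of candidate sites.
        string json = await Http.GetStringAsync("https://example.com/api/windfarm-sites");

        var options = new JsonSerializerOptions { PropertyNameCaseInsensitive = true };
        return JsonSerializer.Deserialize<WindFarmSite[]>(json, options)
               ?? Array.Empty<WindFarmSite>();
    }
}
```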

While we’re starting to support this scenario with text input, as seen in this sample, support for voice input is coming soon. When available, we’ll provide an updated sample that handles speech-to-text for you, so you don’t have to worry about additional infrastructure setup or the cost of extra services, and your developer workflow will stay the same regardless of the input type. We’ll also provide updated UI so that the security and compliance states of the interaction (When is the AI active? Who is it listening to? Can I turn it off or opt out?) are clearly communicated and handled for you.

This sample uses an Azure OpenAI deployment of GPT-3.5 Turbo as the backend, and you’ll see how just a few lines of code are needed to pass in our live data and shape the model’s system prompt, creating a business-specific AI assistant for data-driven decision-making. However, this is just one example, and your creativity is the limit: you can use any LLM, or even leverage the power of Copilot via Copilot extensions.
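
If you want a feel for what that wiring looks like outside of the sample project, here’s a hedged sketch of calling an Azure OpenAI chat completions deployment over REST and folding the live site data into the system prompt. The resource name, deployment name, and API version are placeholders you would swap for your own, and the Mesh 201 sample does this through its cloud scripting project rather than a standalone class like this one.

```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

public static class WindFarmAssistant
{
    private static readonly HttpClient Http = new();

    public static async Task<string> AskAsync(string userInput, string siteDataJson)
    {
        // Placeholders: point these at your own Azure OpenAI resource and deployment.
        string endpoint   = "https://YOUR-RESOURCE.openai.azure.com";
        string deployment = "gpt-35-turbo";
        string apiVersion = "2024-02-01";
        string apiKey     = Environment.GetEnvironmentVariable("AZURE_OPENAI_KEY")
                            ?? throw new InvalidOperationException("Set AZURE_OPENAI_KEY.");

        // The live data is folded straight into the system prompt alongside the
        // business context, which is all a small model needs to reason over it.
        var body = new
        {
            messages = new object[]
            {
                new
                {
                    role = "system",
                    content = "You help decide where to build the next wind farm. " +
                              "Base your answers on this site data:\n" + siteDataJson
                },
                new { role = "user", content = userInput }
            },
            temperature = 0.2
        };

        using var request = new HttpRequestMessage(
            HttpMethod.Post,
            $"{endpoint}/openai/deployments/{deployment}/chat/completions?api-version={apiVersion}");
        request.Headers.Add("api-key", apiKey);
        request.Content = new StringContent(
            JsonSerializer.Serialize(body), Encoding.UTF8, "application/json");

        using var response = await Http.SendAsync(request);
        response.EnsureSuccessStatusCode();

        using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
        return doc.RootElement
                  .GetProperty("choices")[0]
                  .GetProperty("message")
                  .GetProperty("content")
                  .GetString() ?? string.Empty;
    }
}
```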

And of course, building a chatbot is just the beginning for AI extensibility in Mesh. Let’s explore how one of our partners, Sulava, is already creatively leveraging the current capabilities to enhance their Mesh experiences.

Spotlight on Sulava

Sulava is a Microsoft partner from Finland focused on building cutting-edge solutions for a variety of clients via generative AI and Copilot. As one of the first European countries to develop an official national AI strategy, Finland has been at the forefront of AI and Copilot, and now Sulava wants to bring the AI revolution into Mesh.

Using the capabilities available in the Mesh Toolkit today, Sulava has already incorporated AI into their Mesh experiences in a variety of clever ways. The first example is a teleporting assistant that uses AI to help users navigate large environments with ease. A user can say in natural language where they want to go, which breaks down language barriers and lowers the barrier to entry for new users navigating an unfamiliar space.
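
As an illustration of that pattern (this is a sketch, not Sulava’s implementation), one way to build such an assistant is to constrain the model to a fixed list of known destinations and then move the user to the matching anchor in your own scene code. The destination names and the backend delegate below are assumptions for the example.

```csharp
using System;
using System.Linq;
using System.Threading.Tasks;

public static class TeleportGuide
{
    // Hypothetical destinations that exist as teleport anchors in your environment.
    private static readonly string[] Destinations =
        { "Lobby", "Auditorium", "Demo Hall", "Rooftop Terrace" };

    // askBackend sends (systemPrompt, userInput) to whatever chat model you use
    // and returns the model's text reply.
    public static async Task<string?> ResolveDestinationAsync(
        string userRequest,
        Func<string, string, Task<string>> askBackend)
    {
        string systemPrompt =
            "Pick the single best destination for the user's request from this list: " +
            string.Join(", ", Destinations) +
            ". Reply with the destination name only, or NONE if nothing fits.";

        string reply = (await askBackend(systemPrompt, userRequest)).Trim();

        // Only accept replies that match a known destination; anything else means
        // the assistant should ask the user to rephrase.
        return Destinations.FirstOrDefault(
            d => string.Equals(d, reply, StringComparison.OrdinalIgnoreCase));
    }
}
```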

Another creative example they have built uses AI to assist with brainstorming, giving users the ability to write down their ideas and refine them into a draft project plan in seconds. The plan can then be saved to Microsoft Teams via an integration built with Microsoft Power Automate and .NET Core.
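
The hand-off to Teams can be wired up in several ways; the sketch below (again illustrative, not Sulava’s actual integration) posts the drafted plan to a Power Automate flow that starts with the “When an HTTP request is received” trigger, leaving it to the flow to save the plan into Teams. The environment variable name and payload shape are assumptions for the example.

```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

public static class PlanPublisher
{
    private static readonly HttpClient Http = new();

    public static async Task PostDraftPlanAsync(string title, string draftPlan)
    {
        // The HTTP trigger in Power Automate generates this URL for your own flow.
        string flowUrl = Environment.GetEnvironmentVariable("POWER_AUTOMATE_FLOW_URL")
                         ?? throw new InvalidOperationException("Set POWER_AUTOMATE_FLOW_URL.");

        // Payload the flow can forward into a Teams channel or store elsewhere.
        var payload = new { title, plan = draftPlan, source = "Mesh brainstorming session" };

        using var content = new StringContent(
            JsonSerializer.Serialize(payload), Encoding.UTF8, "application/json");

        var response = await Http.PostAsync(flowUrl, content);
        response.EnsureSuccessStatusCode();
    }
}
```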

Ready, set, go!

This is just the beginning for AI in Microsoft Mesh! To get started with Mesh and deep dive into the resources mentioned in this blog, start at aka.ms/MeshCreator. Stay tuned for more updates from us as we explore AI representation beyond 2D chat, unlock more sources of context, and leverage the spatial information inherent to Mesh for creating richer, AI-powered experiences.

Follow us on X (Twitter) / @Microsoft365Dev and subscribe to our YouTube channel to stay up to date on the latest developer news and announcements.
