November 19th, 2024

How Toyota uses Azure Cosmos DB to power their multi-agent AI system for enhanced productivity

This article was co-authored by Kenji Onishi, Senior Manager, Powertrain Performance Development, Toyota Motor Corporation; Kosuke Miyasaka, Azure App Innovation Specialist; and Keisuke Hatasaki, Azure Sr. Specialist, GBB.

Executive Summary: Toyota Motor Corporation, one of the world’s largest automobile manufacturers, has adopted Azure Cosmos DB in its new multi-agent AI system designed to improve development processes. This move is aimed at enhancing efficiency and reducing the time required for developing new vehicle models in a highly competitive automotive industry.

Introduction

Toyota Motor Corporation is a global leader in the automotive industry, renowned for its innovative approaches and efficient production methods. As of the fiscal year ending March 2024, Toyota employed 380,793 people and reported sales of ¥45.095 trillion. The company is known for the Toyota Production System (TPS), which ensures efficient production and high-quality management. Toyota has also been focusing on hybrid and electrification technologies, striving to develop environmentally friendly vehicles. However, rapid change and intense competition in the automotive industry have made faster, more efficient development of new vehicle models a necessity.

Challenges in Automotive Design Development

The design development process at Toyota involves a vast array of knowledge and expertise. Designers need to consider numerous components, specifications, past design documents, and the latest regulations, all of which historically took significant time to locate. To innovate and expedite the development process, we at Toyota recognized the need to streamline these efforts.

We turned to generative AI to assist our designers. By leveraging Azure OpenAI Service and implementing a Retrieval-Augmented Generation (RAG) pattern, which utilizes Azure AI Search and Toyota’s internal design data, we began building an expert AI system designed to support our design teams. This expert AI system needed to prevent hallucinations and provide accurate responses to a wide range of inquiries from designers.
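The RAG flow described above can be sketched in a few lines of plain Python. This is illustrative only: the in-memory keyword "retriever" and the document names are hypothetical stand-ins for Azure AI Search over internal design data, and the grounding-prompt shape is a common pattern rather than Toyota's actual prompt.

```python
# Illustrative RAG flow: retrieve relevant design documents, then ground
# the model's answer in them. The in-memory "index" stands in for
# Azure AI Search; the document contents are invented examples.

def retrieve(query: str, index: dict[str, str], top_k: int = 2) -> list[str]:
    """Naive keyword retriever: rank documents by query-term overlap."""
    terms = set(query.lower().split())
    scored = sorted(
        index.items(),
        key=lambda kv: len(terms & set(kv[1].lower().split())),
        reverse=True,
    )
    return [text for _, text in scored[:top_k]]

def build_grounded_prompt(query: str, index: dict[str, str]) -> str:
    """Assemble a prompt that tells the model to answer only from sources,
    which is the basic mechanism for reducing hallucinations in RAG."""
    sources = retrieve(query, index)
    context = "\n".join(f"- {s}" for s in sources)
    return (
        "Answer using ONLY the sources below. If the answer is not in the "
        "sources, say you don't know.\n"
        f"Sources:\n{context}\n\nQuestion: {query}"
    )

# Hypothetical internal design data
design_docs = {
    "doc-battery": "battery pack thermal limits and cooling specifications",
    "doc-motor": "motor torque curves for the drive unit",
    "doc-reg": "regulation updates for crash safety",
}

prompt = build_grounded_prompt("What are the battery thermal limits?", design_docs)
```

In the production system, the assembled prompt would be sent to Azure OpenAI Service; constraining the model to the retrieved sources is what lets the expert AI answer accurately instead of guessing.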

Adopting Azure Cosmos DB for the AI system

A key part of this solution was building a system that could seamlessly handle any number of simultaneous users while providing the flexibility to manage conversational data with an evolving structure. The system also needed to support vector search over every interaction between the designers and the AI agents. We adopted Azure Cosmos DB because its architecture allowed us to scale to any number of users, and its document-based, schema-less design provided the flexibility needed to manage the conversational interactions between our designers and the AI agents. This enabled Toyota to successfully develop an expert AI system that gives designers accurate and relevant responses based on conversation history. It also allowed us to control access to internal data based on designer permissions, which helped us continually improve the system's accuracy and provided a more stable and secure operational environment.
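As a sketch of the schema-less design at work, a single conversation turn could be stored as a JSON document partitioned by user, with a permissions field for access control. The field names below are hypothetical, not Toyota's actual schema.

```python
import json
import uuid
from datetime import datetime, timezone

def make_conversation_item(user_id: str, role: str, text: str,
                           allowed_groups: list[str]) -> dict:
    """Build a Cosmos DB-style item for one conversation turn.
    user_id doubles as the partition key so a designer's history lands
    in one logical partition; allowedGroups supports permission-based
    filtering at query time. All field names are illustrative."""
    return {
        "id": str(uuid.uuid4()),
        "userId": user_id,            # partition key
        "role": role,                 # "designer" or "agent"
        "text": text,
        "allowedGroups": allowed_groups,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

item = make_conversation_item(
    "designer-042", "designer",
    "What is the latest battery regulation?",
    ["powertrain-team"],
)
doc = json.dumps(item)  # what would be written via the Cosmos DB SDK
```

Because the items are schema-less, new fields (an embedding vector, agent metadata, feedback flags) can be added to later turns without migrating existing data, which is what "evolving structure" means in practice.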

Given the diverse range of design areas within Toyota, the company has expanded this expert AI system to various specialized domains. Each agent has its own architecture and RAG pattern, independently tuned to improve accuracy and performance for that agent's specific tasks. For example, for data stored in SharePoint, we used a RAG pattern built on Azure Cosmos DB and Azure AI Search. For architectures requiring operational data, vector data, and search, we used a RAG pattern built on Azure Cosmos DB and Azure OpenAI Service.

[Image: BRK117 Ignite 2024 presentation, slide 1]

Exploring multi-agent AI systems

To drive further innovation, we started to explore the use of multi-agent systems, where multiple AIs collaborate as agents. Through trials with frameworks such as AutoGen, developed by Microsoft Research, and Azure OpenAI function calling, we actively investigated the use of multi-agent systems in design development to understand how they could further enhance efficiency.

Enter the “O-Beya” system (from the Japanese for “big room”), a key multi-agent initiative born out of experimentation with AI in our design development process. This initiative involves designers from various specialized fields working closely together to leverage their expertise and resources to design a vehicle model as a cohesive unit. To support this initiative, Toyota aimed to develop a multi-agent system that integrates various specialized AI systems as expert agents. These agents would enable several use cases, including content generation, document processing, summarization, and a chat assistant.

[Image: BRK117 Ignite 2024 presentation slide]

Building the “O-Beya” System

There are multiple ways to configure a multi-agent system, but Toyota focused on the capabilities of Azure Durable Functions. The multi-agent system includes four distinct agents focused on battery, motor, regulations, and system control. Each agent has its own defined prompt, and these prompts play a crucial role when the multi-agent system compiles its responses. We had three main reasons for choosing Durable Functions.

First, it enables parallel processing across agents, enhancing performance. Second, it supports complex workflows, including error handling and retries. Third, it stores orchestration state externally, making monitoring and review simpler. In terms of implementation, we used the fan-out/fan-in pattern from Durable Functions. A user request triggers the orchestration, which launches the four agents, each implemented as a Function, in parallel with fan-out. Once all parallel processes complete, fan-in collects the results, which generative AI then compiles into the final response.
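The fan-out/fan-in shape can be sketched with plain asyncio. This is a simplified stand-in for the real Durable Functions orchestration: the agent names match the four agents above, but the agent bodies are placeholders, and the final "compile" step here is a simple join rather than a generative-AI call.

```python
import asyncio

async def run_agent(name: str, question: str) -> str:
    """Placeholder expert agent; in production each agent runs its own
    RAG pipeline against its specialized data."""
    await asyncio.sleep(0)  # simulate independent async work
    return f"{name} agent answer to: {question}"

async def orchestrate(question: str) -> str:
    agents = ["battery", "motor", "regulations", "system control"]
    # Fan-out: launch all four agents in parallel.
    tasks = [asyncio.create_task(run_agent(a, question)) for a in agents]
    # Fan-in: wait for every agent to finish, then compile one response.
    results = await asyncio.gather(*tasks)
    # In the real system, generative AI compiles the final answer;
    # here we simply join the partial answers.
    return "\n".join(results)

answer = asyncio.run(orchestrate("How do new regulations affect battery sizing?"))
```

Because the agents run concurrently and independently, adding a fifth agent is one more entry in the list, and overall latency stays close to that of the slowest agent rather than the sum of all of them.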

Each of the four agents has its own unique architecture. To improve response accuracy, we refined the RAG architecture and prompts for each agent. Additionally, by storing conversation logs in Azure Cosmos DB, the system can consider previous session data when generating responses in the next session. The main advantage of this implementation is that agents can be added easily: since the agents operate asynchronously and in parallel, adding more agents has no negative impact on response times. The main challenge was improving the accuracy of each agent's responses, which we addressed by refining each agent's RAG implementation. And because each agent is implemented individually, updates are also easy to perform.
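Carrying previous-session data into the next response can be as simple as prepending stored turns to the model's message list. The sketch below assumes a hypothetical stored-log shape (the `role`/`text` fields are illustrative); in the real system the history would be loaded from Azure Cosmos DB by user.

```python
def build_messages(history: list[dict], new_question: str,
                   max_turns: int = 6) -> list[dict]:
    """Prepend the most recent stored turns (e.g., loaded from Azure
    Cosmos DB for this designer) to the new question, in the
    chat-completions message format."""
    messages = [{"role": "system",
                 "content": "You are a design-development expert agent."}]
    # Keep only the most recent turns to bound prompt size.
    for turn in history[-max_turns:]:
        messages.append({"role": turn["role"], "content": turn["text"]})
    messages.append({"role": "user", "content": new_question})
    return messages

# Hypothetical previous-session log
stored = [
    {"role": "user", "text": "Summarize the motor torque spec."},
    {"role": "assistant", "text": "Peak torque is defined in section 3."},
]
msgs = build_messages(stored, "And what about the battery limits?")
```

Trimming to the last few turns is one simple policy; a production system might instead summarize older turns or select them by vector similarity to the new question.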

[Image: BRK117 Ignite 2024 presentation slide]

O-Beya success so far

The success of the “O-Beya” system represents a significant milestone in Toyota’s design development innovation. The system is now available to 800 users across the company and receives more than 100 requests per month. We have not yet measured exactly how much research time we’ve saved, but in several interviews, users reported that the time spent searching for information dropped dramatically. We also plan to expand the number of agents and improve the accuracy of each agent, so we’re hoping to make a big impact across the entire business.

Future innovations with Azure Cosmos DB

Toyota’s commitment to innovation in design development continues, with ongoing trials aimed at further improving our multi-agent system. These include the use of Azure Cosmos DB’s vector search capabilities for RAG improvements and leveraging Azure Cosmos DB for NoSQL to implement GraphRAG over existing data.
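For reference, vector search in Azure Cosmos DB for NoSQL is expressed with the `VectorDistance` system function. The sketch below shows the general query shape (the container and `c.embedding` field names are illustrative) alongside a local implementation of the cosine metric that such a search can rank by.

```python
import math

# Illustrative query shape for Azure Cosmos DB for NoSQL vector search:
# rank items by similarity between a stored embedding and a query vector
# supplied as the @queryVector parameter. Field names are hypothetical.
query = """
SELECT TOP 5 c.id, VectorDistance(c.embedding, @queryVector) AS score
FROM c
ORDER BY VectorDistance(c.embedding, @queryVector)
"""

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity, one of the distance metrics a vector index
    can use; identical vectors score 1.0."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

score = cosine_similarity([1.0, 0.0, 1.0], [1.0, 0.0, 1.0])
```

Running such queries directly against the conversation and design data already stored in Azure Cosmos DB is what makes it attractive as the retrieval layer for both RAG improvements and the planned GraphRAG work.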

In summary, Toyota Motor Corporation’s adoption of Azure Cosmos DB and the development of a multi-agent AI system marks a significant advancement in our design development processes. By leveraging cutting-edge technologies and innovative approaches, we’re well-positioned to maintain a competitive edge in the automotive industry and continue delivering high-quality, environmentally friendly vehicles.

Leave a review

Tell us about your Azure Cosmos DB experience! Leave a review on PeerSpot and we’ll gift you $50. Get started here.

About the authors

Kenji Onishi, Senior Manager, Powertrain Performance Development, Toyota Motor Corporation. Joining Toyota in 2006, Kenji worked on driving control and experimented with various self-made tools. He continues to focus on building a solid foundation for development and accelerating prototype development releases. In his current role, he works on various initiatives to integrate AI (LLMs) into development processes at Toyota.
Kosuke Miyasaka, Azure App Innovation Specialist. I joined Microsoft as a field engineer, where I led AI application development and the implementation of cloud-native solutions. Currently, as an App Innovation Specialist, I am focused on bringing the value of generative AI applications, generative AI-native development, and application modernization to a wide range of customers.
Keisuke Hatasaki, Azure Senior Specialist, GBB. With over 20 years of experience in IT platform R&D and solution development, I am currently part of Microsoft’s Global Black Belt team, which focuses on supporting advanced customers in implementing cloud-native solutions, including containers, application PaaS, integration services, and AI applications.

About Azure Cosmos DB

Azure Cosmos DB is a fully managed and serverless NoSQL and vector database for modern app development, including AI applications. With its SLA-backed speed and availability as well as instant dynamic scalability, it is ideal for real-time NoSQL and MongoDB applications that require high performance and distributed computing over massive volumes of NoSQL and vector data.

Try Azure Cosmos DB for free here. To stay in the loop on Azure Cosmos DB updates, follow us on X, YouTube, and LinkedIn.
