Boomi Leverages Amazon Bedrock for Faster Help Desk Responses


About Boomi

Boomi is the intelligent integration and automation leader helping organizations around the world connect applications, data, and people to accelerate digital transformation. 

Boomi has more than 20,000 customers who rely on it to connect their applications and data, and the connection possibilities are nearly endless. Boomi frequently connects Software as a Service (SaaS) applications to other SaaS applications, but it is also often used to connect Systems, Applications, and Products (SAP) Enterprise Resource Planning (ERP) systems to Customer Relationship Management (CRM) systems, and it sometimes handles connections as unusual as linking washing machines in rental units to a centralized scheduling system.

Boomi’s Challenge

“At Boomi, we see AI in two ways. It's both a demand driver for Boomi’s services and it's a tool that can enhance how our services are delivered,” said Matt McLarty.

Matt McLarty, Chief Technology Officer at Boomi, firmly believed that the rapidly advancing world of AI would increase the importance of the work he and his team do. He also knew it would open up new ways for them to do that work. With innovation moving so quickly, McLarty and his team had been weighing the many possible AI strategies available to them.

“It's like GenAI is a hammer. You’re holding it and you know it's powerful, so everything starts to look like nails. But, you’re looking for just the right nail that’s best suited for the GenAI hammer,” said McLarty.

McLarty and his team turned to Tribe AI to help with a GenAI ideation process that would consider all the possibilities and then narrow the focus to a single proof of concept (POC).

The selected use case centered on helping Boomi customers find quick answers to very precise questions by applying GenAI to Boomi’s corpus of product manuals, help documentation, community articles, and other materials.

Proposed Solution

Tribe AI proposed an Amazon Bedrock-powered GenAI and LLM deployment that would deliver an AI assistant the teams collectively referred to as the “Help Documentation Advisor.” Building on AWS services would reduce the development workload and simplify ongoing management for the customer.

“GenAI can do almost everything, but where it shines is in finding the needle in the haystack,” said McLarty.

The solution was designed as an alternative for customers who would rather not call the help desk or comb through the online FAQs when they hit an issue or needed a question answered. Users would be able to ask specific questions in natural language, and the AI assistant would draw on actual Boomi data, such as product manuals, help documentation, community articles, and other materials, to quickly provide precise answers. The AI Advisor would also give the person asking the question a link to the documentation supporting its response, a detail McLarty and his team felt was important because customers would then have a resource to learn more, deepen their understanding, or share with others. The GenAI technology powering the AI Advisor would deliver consistent responses to customer questions and even surface product insights that may not have been easy to find online through traditional research.

Below is a sampling of the client interface that Tribe AI developed for the Boomi AI Advisor:

Tech Stack

The diagram below shows the overall architecture of the intelligent AI Advisor. At a high level, the assistant is a full-stack, cloud-based application working alongside the existing Boomi environment:

In this flow, the end user asks a question through the user interface, which in turn asks the backend for a concise answer along with precise references. To provide those, the backend consults Chroma, an in-memory vector database. A vector database stores numeric embeddings of document chunks and retrieves the chunks most semantically similar to a query, so the LLM can ground its answer in the most relevant Boomi documentation rather than searching raw text. This enables faster, more accurate responses from the system.
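To make that flow concrete, here is a minimal sketch of the retrieval step using the chromadb Python client. The document chunks, URLs, and the embed() stub are hypothetical placeholders; in the deployed system the embeddings come from the Bedrock embedding model described below.

```python
# Minimal sketch of the retrieval step, assuming the chromadb client.
import chromadb


def embed(text: str) -> list[float]:
    """Stand-in for the real embedding call (see the Bedrock sketch below)."""
    # Toy character-frequency vector, just to keep the example runnable.
    return [text.lower().count(c) / max(len(text), 1) for c in "abcdefghij"]


client = chromadb.Client()  # ephemeral, in-memory Chroma instance
collection = client.get_or_create_collection(name="boomi_help_docs")

# Hypothetical document chunks and source URLs, for illustration only.
chunks = [
    ("doc-1", "How to configure a connector in an integration process.",
     "https://help.example.com/connectors"),
    ("doc-2", "Troubleshooting failed process executions and error logs.",
     "https://help.example.com/troubleshooting"),
]
collection.add(
    ids=[c[0] for c in chunks],
    documents=[c[1] for c in chunks],
    embeddings=[embed(c[1]) for c in chunks],
    metadatas=[{"source_url": c[2]} for c in chunks],
)

# At question time: embed the user's question and pull the closest chunks,
# which are then handed to the completion model along with their source links.
question = "Why did my process execution fail?"
results = collection.query(query_embeddings=[embed(question)], n_results=1)
print(results["documents"][0], results["metadatas"][0])
```

Returning the matched chunks together with their source_url metadata is what lets the Advisor attach a supporting documentation link to each answer.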

The entire architecture was deployed on Amazon Web Services (AWS) using several AWS services:

  1. Amazon Bedrock was used for the embedding and completion models (Cohere embed-multilingual-v3 and Claude 2, respectively; see the sketch after this list). Bedrock offers a variety of pretrained foundation models from leading providers, enabling the teams to quickly leverage powerful AI capabilities. It also provides developer-friendly tools and APIs that make model integration, deployment, and monitoring easier, which accelerated the project timeline and allowed for fast integration.
  2. Amazon S3 was used to store the prepared Boomi data in a cleaned and normalized representation.
  3. Amazon Elastic Compute Cloud (EC2) was used to run the frontend and backend servers as well as the Chroma database, which holds the embeddings of the customer data in memory.
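
Below is a minimal sketch of how a backend might call the two Bedrock-hosted models named above through boto3. The region, prompt wording, and parameter values are illustrative assumptions rather than the production configuration.

```python
import json

import boto3

# Assumed region and client setup; the actual configuration may differ.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")


def embed_texts(texts: list[str], input_type: str = "search_document") -> list[list[float]]:
    """Embed document chunks (or, with input_type="search_query", a user question)."""
    response = bedrock.invoke_model(
        modelId="cohere.embed-multilingual-v3",
        body=json.dumps({"texts": texts, "input_type": input_type}),
    )
    return json.loads(response["body"].read())["embeddings"]


def answer_with_context(question: str, context: str) -> str:
    """Ask Claude 2 for a concise answer grounded in the retrieved documentation."""
    prompt = (
        "\n\nHuman: Answer the question using only the documentation below.\n\n"
        f"Documentation:\n{context}\n\nQuestion: {question}\n\nAssistant:"
    )
    response = bedrock.invoke_model(
        modelId="anthropic.claude-v2",
        body=json.dumps({"prompt": prompt, "max_tokens_to_sample": 500, "temperature": 0.2}),
    )
    return json.loads(response["body"].read())["completion"]
```

In the retrieval flow, embed_texts would be used both at indexing time (on the document chunks) and at question time (on the user's query), while answer_with_context produces the concise answer returned to the interface.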

The frontend was written in React, while the backend was written in Python with Django, which makes it easy to integrate various AI/ML libraries and provides a clear path to production.
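
As a rough illustration of how these pieces could come together in the Django backend, here is a hypothetical view that embeds the question, queries Chroma, calls the completion model, and returns the answer with its supporting links. The endpoint name, module path, and helper functions are assumptions carried over from the sketches above, not Boomi's actual code.

```python
import json

from django.http import JsonResponse
from django.views.decorators.csrf import csrf_exempt
from django.views.decorators.http import require_POST

# Hypothetical module collecting the helpers from the earlier sketches.
from advisor.rag import answer_with_context, collection, embed_texts


@csrf_exempt  # simplification for the sketch; a real deployment would authenticate requests
@require_POST
def ask_advisor(request):
    """Answer a customer question with a concise response plus supporting links."""
    question = json.loads(request.body)["question"]

    # Embed the question and retrieve the most relevant documentation chunks.
    query_vector = embed_texts([question], input_type="search_query")[0]
    hits = collection.query(query_embeddings=[query_vector], n_results=3)

    # Ground the completion model in the retrieved chunks.
    context = "\n\n".join(hits["documents"][0])
    answer = answer_with_context(question, context)
    sources = [meta["source_url"] for meta in hits["metadatas"][0]]

    return JsonResponse({"answer": answer, "sources": sources})
```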

Boomi’s Experience Working with Tribe

Most companies are giving thought to the skills they need to be a player in the GenAI space. McLarty believes the most important skills in the GenAI marketplace are actually a unique combination. Technical skills, like being able to build an LLM, are certainly important, but coupling them with business acumen, like knowing how to apply GenAI at scale, moves a team to the next level.

“Tribe AI came to the table with an entire network of experts with real world experience working in the GenAI space. That combination of skills is scarce,” said McLarty.

Impact

“GenAI breaks the mold. The impact is so obvious. The outcomes are so much better than before,” said McLarty.

In just four weeks using an Amazon Bedrock-powered deployment, Tribe AI and Boomi successfully developed an AI Advisor prototype that was faster and more precise than the traditional methods customers had used to find answers to their questions. As impressive as that is on its own, McLarty feels the real impact of the POC engagement was the ability to vet the process and work of developing an AI strategy alongside Tribe AI.

“We have many customers looking for ways to innovate with GenAI, but developing a complete AI strategy is vital to successful implementation. We now have a trusted partner in this space whom we can recommend to our customers,” said McLarty.

The Future

McLarty anticipates a very agentic future for the GenAI landscape at Boomi and across the software industry at large. He expects that modular, AI-infused software components will become the de facto architecture, creating a need to help companies build and manage agents. In fact, McLarty and his team are building out an agent framework to provide a marketplace where people can find agents that add capabilities to their platform. Future endeavors may even include a registry for agents.

The first step toward this future begins shortly, with the Boomi and Tribe AI teams coming together for a whiteboarding session on how Tribe AI can help Boomi and its customers with agent building and overall AI orchestration.

“Working with Tribe AI was an awesome experience and I sincerely see a future partnership centered on our joint customers’ AI strategy development,” said McLarty.
