From April 9 to April 11, Las Vegas became the center of the tech world as Google Cloud Next ’24 took over the Mandalay Bay Convention Center, and the spotlight shone brightest on gen AI.
Alongside MongoDB’s big announcements with Google Cloud, which included an expanded collaboration to enhance building, scaling, and deploying gen AI applications using MongoDB Atlas Vector Search and Vertex AI, as well as industry sessions and customer meetings, we hosted in-booth lightning talks with leaders from four MongoDB partners: LangChain, LlamaIndex, Patronus AI, and Unstructured. These speakers shared valuable insights and best practices with developers who want to embed AI into their existing applications or build a new generation of apps powered by AI.
Developing next-generation AI applications involves several challenges, including handling complex data sources, incorporating both structured and unstructured data, and addressing the scalability and performance issues that arise when processing and analyzing that data. The lightning talks at Google Cloud Next ’24 addressed some of these critical topics and presented practical solutions.
One of the most popular sessions came from Harrison Chase, co-founder and CEO of LangChain, an open-source framework for building applications based on large language models (LLMs). Harrison offered tips on fixing retrieval-augmented generation (RAG) pipelines when they fail, walking through common failure modes such as failed fact retrieval, non-semantic components, and conflicting information. He recommended that developers use the LangChain templates for MongoDB Atlas to deploy RAG applications quickly.
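As a rough illustration of what such a pipeline looks like, here is a minimal RAG chain built with LangChain's MongoDB Atlas Vector Search integration. It is a sketch, not the template itself: the connection string, the "rag_db.docs" namespace, the "vector_index" index name, and the OpenAI models are placeholders, and it assumes the langchain-mongodb, langchain-openai, and pymongo packages are installed and an Atlas vector search index already exists on the collection.

```python
# Minimal RAG chain over MongoDB Atlas Vector Search using LangChain.
# Placeholders: connection string, "rag_db.docs" namespace, "vector_index",
# and OPENAI_API_KEY in the environment.
from pymongo import MongoClient
from langchain_mongodb import MongoDBAtlasVectorSearch
from langchain_openai import ChatOpenAI, OpenAIEmbeddings
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_core.runnables import RunnablePassthrough

collection = MongoClient("<ATLAS_CONNECTION_STRING>")["rag_db"]["docs"]

# Vector store backed by Atlas Vector Search.
vector_store = MongoDBAtlasVectorSearch(
    collection=collection,
    embedding=OpenAIEmbeddings(model="text-embedding-3-small"),
    index_name="vector_index",
)
retriever = vector_store.as_retriever(search_kwargs={"k": 4})

def format_docs(docs):
    # Concatenate retrieved chunks into a single context string.
    return "\n\n".join(doc.page_content for doc in docs)

prompt = ChatPromptTemplate.from_template(
    "Answer the question using only this context:\n{context}\n\nQuestion: {question}"
)

# Retrieve -> prompt -> LLM -> plain-text answer.
chain = (
    {"context": retriever | format_docs, "question": RunnablePassthrough()}
    | prompt
    | ChatOpenAI(model="gpt-4o-mini")
    | StrOutputParser()
)

print(chain.invoke("What are common failure modes in RAG pipelines?"))
```

Before querying, documents can be loaded into the same store with vector_store.add_documents(), which embeds and writes them to the Atlas collection.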
Meanwhile, LlamaIndex, an orchestration framework that integrates private and public data for building applications with LLMs, was represented by co-founder and CTO Simon Suo, who discussed the complexities of advanced document RAG and the importance of good data for better parsing and retrieval. He also highlighted MongoDB's partnership with LlamaIndex, which lets developers ingest data into MongoDB Atlas Vector Search and retrieve it from Atlas via LlamaParse and LlamaCloud (a minimal sketch of this flow follows below).
Harrison Chase – LangChain
Simon Suo – LlamaIndex
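As a rough sketch of the ingestion flow Simon described (without LlamaParse or LlamaCloud, which require their own API keys), the snippet below loads local files with LlamaIndex, stores their embeddings in MongoDB Atlas Vector Search, and queries them. The connection string, namespace, index name, and data directory are placeholders; it assumes the llama-index, llama-index-vector-stores-mongodb, and pymongo packages plus an OpenAI key for the default embedding and LLM settings, and note that the vector index parameter name can vary across llama-index-vector-stores-mongodb releases.

```python
# Ingest local documents into MongoDB Atlas Vector Search with LlamaIndex,
# then query the resulting index. All names below are placeholders.
import pymongo
from llama_index.core import SimpleDirectoryReader, StorageContext, VectorStoreIndex
from llama_index.vector_stores.mongodb import MongoDBAtlasVectorSearch

mongo_client = pymongo.MongoClient("<ATLAS_CONNECTION_STRING>")
vector_store = MongoDBAtlasVectorSearch(
    mongo_client,
    db_name="rag_db",
    collection_name="docs",
    vector_index_name="vector_index",  # called index_name in older releases
)
storage_context = StorageContext.from_defaults(vector_store=vector_store)

# Parse and embed the files in ./data, persisting vectors to Atlas.
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)

# Retrieve from Atlas and synthesize an answer with the default LLM settings.
query_engine = index.as_query_engine(similarity_top_k=3)
print(query_engine.query("Summarize the key findings in these documents."))
```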
Guillaume Nozière shed light on common mistakes RAG applications make (such as hallucinations) and the challenge of catching those mistakes reliably at scale. His insights come from his work as a forward-deployed engineer at Patronus AI, an automated LLM evaluation platform that helps companies deploy gen AI applications confidently. He also reiterated how critical it is for RAG systems to be built on reliable data platforms such as MongoDB Atlas.
Last but not least, Unstructured's Partnerships Manager Andrew Zane discussed the common issues that heterogeneous file types and document layouts create when getting data LLM-ready. He also explained how Unstructured, a platform that connects any type of enterprise data with LLMs, can solve these issues and even improve retrieval performance when used alongside MongoDB Atlas Vector Search (see the sketch below).
Guillaume Nozière – Patronus AI
Andrew Zane – Unstructured
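A minimal sketch of the kind of pipeline Andrew described might look like the following: Unstructured's open-source partitioning and chunking functions make heterogeneous files LLM-ready, and the resulting chunks and embeddings are written to an Atlas collection for use with a vector search index defined separately in Atlas. The file path, model name, connection string, and namespace are placeholders, and it assumes the unstructured, openai, and pymongo packages are installed.

```python
# Make heterogeneous files LLM-ready with Unstructured, then store chunks
# and embeddings in MongoDB Atlas for vector search. Names are placeholders.
from openai import OpenAI
from pymongo import MongoClient
from unstructured.partition.auto import partition
from unstructured.chunking.title import chunk_by_title

openai_client = OpenAI()  # expects OPENAI_API_KEY in the environment
collection = MongoClient("<ATLAS_CONNECTION_STRING>")["rag_db"]["docs"]

# partition() auto-detects the file type (PDF, HTML, DOCX, ...) and returns
# layout-aware elements; chunk_by_title groups them into retrieval-sized chunks.
elements = partition(filename="reports/q1_report.pdf")
chunks = chunk_by_title(elements)

texts = [chunk.text for chunk in chunks]
embeddings = openai_client.embeddings.create(
    model="text-embedding-3-small", input=texts
)

# Store text plus embedding; an Atlas Vector Search index on the "embedding"
# field (defined in Atlas) makes these chunks retrievable at query time.
collection.insert_many(
    [
        {"text": text, "embedding": item.embedding}
        for text, item in zip(texts, embeddings.data)
    ]
)
```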
Amid so many booths, activities, and competing programming, developers from across industries showed up to these sessions, where they could engage with experts, ask questions, and network in a casual setting. They also learned how MongoDB and our AI partners work together to offer complementary solutions for a seamless gen AI development experience.
We are grateful to LangChain, LlamaIndex, Patronus AI, and Unstructured for their ongoing partnership, and we look forward to expanding our collaboration to help our joint customers build the next generation of AI applications.
To learn more about building AI-powered apps with MongoDB, check out our AI Resources Hub and stop by our Partner Ecosystem Catalog to read about our integrations with these and other AI partners.