
MongoDB launches tools for developing generative AI apps

The database vendor's new capabilities include Atlas Stream Processing to enable real-time model updates and Atlas Search Nodes to handle advanced analytics workloads.

MongoDB on Thursday launched a series of new features designed to enable customers to develop generative AI models and applications, including the general availability of Atlas Stream Processing and an integration with Amazon Bedrock.

In addition, the vendor unveiled the MongoDB AI Applications Program (MAAP) to provide users with strategic advisory services and integrated technology for building and deploying generative AI models and applications.

MongoDB revealed the new features during MongoDB.local NYC, a user event in New York City.

Taken together, the new capabilities help MongoDB stay competitive with peers such as tech giants Google and Oracle as well as specialists including MariaDB and Couchbase, according to Kevin Petrie, an analyst at BARC U.S.

All are in a race to provide customers with the most up-to-date tools to develop generative AI assets.

"This is a pretty comprehensive set of announcements," Petrie said. "MongoDB is helping companies build GenAI applications, feed them real-time data and optimize processes such as retrieval-augmented generation to make GenAI language models more accurate."

Based in New York City, MongoDB is a database vendor whose NoSQL platform provides users with an alternative to traditional relational databases that sometimes struggle to handle the scale of modern enterprise data workloads.

Atlas is MongoDB's suite for developers, where the vendor has focused much of its attention over the past year as interest in developing AI models and applications has exploded.

Recent updates include the launch of Atlas Vector Search and Atlas Search Nodes in December 2023 and the November 2023 introduction of the MongoDB Partner Ecosystem Catalog, where users can access data and AI products shared by the vendor's many partners.

A MAAP for AI success

Generative AI has been the dominant trend in data management and analytics over the 18 months since OpenAI's launch of ChatGPT marked a significant improvement in large language model (LLM) capabilities.

When applied to data management and analytics, generative AI has the potential to enable more users to use data to inform decisions as well as make anyone working with data more efficient.

LLMs have vast vocabularies and can understand intent.

Therefore, when integrated with data management and analytics platforms, LLMs let users interact with the tools using true natural language processing rather than the code previously required to manage, query and analyze data.

That lets users without technical expertise use analytics tools to work with data. In addition, it helps data experts be more efficient by reducing time-consuming tasks.

As a result of generative AI's potential when combined with data management and analytics, many vendors have made generative AI a primary focus of product development, providing customers with tools such as copilots as well as creating environments where customers can develop AI applications.

For example, MicroStrategy and Microsoft have added AI assistants, while Databricks and Domo are among those that provide users with development environments for AI.

MAAP is an environment where MongoDB customers can develop AI models and applications.

The suite includes integrations with LLMs from generative AI providers such as Anthropic and Cohere, key capabilities including vector search and retrieval-augmented generation (RAG), a secure development environment and access to experts to provide assistance as organizations start with generative AI.
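Vector search and RAG typically work together: documents are stored with embeddings, and a query vector retrieves the most relevant passages to ground an LLM's answer. As a hedged sketch of what that retrieval step could look like with Atlas Vector Search, the snippet below only builds the aggregation pipeline that MongoDB's `$vectorSearch` stage expects; the index name, field names and `embed()` stub are illustrative assumptions, and actually executing the pipeline requires an Atlas cluster with a configured vector search index.

```python
# Minimal sketch of a RAG retrieval step using MongoDB Atlas Vector Search.
# Index name, field names and the embed() stub are illustrative assumptions;
# running the pipeline requires a real Atlas cluster with a vector index.

def embed(text: str) -> list[float]:
    """Stand-in for a call to a real embedding model."""
    return [float(ord(c) % 7) for c in text[:4]]  # placeholder vector

def build_vector_search_pipeline(query: str, num_candidates: int = 100, limit: int = 5):
    """Build the aggregation pipeline for Atlas's $vectorSearch stage."""
    return [
        {
            "$vectorSearch": {
                "index": "docs_vector_index",     # assumed index name
                "path": "embedding",              # field holding stored vectors
                "queryVector": embed(query),
                "numCandidates": num_candidates,  # ANN candidates to consider
                "limit": limit,                   # passages returned for the prompt
            }
        },
        # Keep only the fields the LLM prompt needs.
        {"$project": {"_id": 0, "text": 1, "score": {"$meta": "vectorSearchScore"}}},
    ]

pipeline = build_vector_search_pipeline("refund policy")
# With pymongo, this would run as: collection.aggregate(pipeline)
```

The retrieved passages would then be inserted into the LLM prompt, which is what makes the model's answers more accurate for enterprise-specific questions.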

Petrie noted that generative AI models are becoming a must for enterprises. But for them to succeed, language models need to be combined with analytical and operational functions that are unique to an individual enterprise and help that enterprise derive business value.

MAAP is designed to enable MongoDB customers to derive that business value and is therefore an important addition to the vendor's suite.

"MongoDB's MAAP program … helps developers optimize how they integrate language models into enterprise workflows," Petrie said. "MongoDB helps many innovative companies differentiate themselves with cloud-native, data-driven software, and this new program helps their customers capitalize on the GenAI application development wave."

The program, however, has limits, according to Sanjeev Mohan, founder and principal of SanjMo.

While MAAP includes access to LLMs from certain AI vendors, it does not include access to all LLMs. That limits model choice.

"MongoDB is giving customers a curated environment, but at the cost of not letting people use any model or any integration product of their choice," Mohan said. "It's a trade-off. MAAP is a good thing for large enterprises that want developers to experiment. But if you want freedom, MAAP limits you to its ecosystem."

Among the MongoDB partners that have joined MAAP to provide consulting services are Anthropic, AWS, Google Cloud and Microsoft.

More new capabilities

Beyond launching an environment for developing AI, MongoDB added new capabilities for Atlas.

Atlas Stream Processing, unveiled in preview in June 2023, is now generally available and is aimed at helping users build applications that combine data at rest with data in motion so they can respond to changing conditions and enable real-time decisions.
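Atlas stream processors are defined with aggregation-style pipelines that read from a streaming source and continuously write results out. As a hedged illustration only, the connection, topic and collection names below are assumptions, and an actual deployment would register the pipeline with an Atlas stream processing instance rather than run it locally:

```python
# Illustrative sketch of an Atlas Stream Processing pipeline definition.
# Connection, topic and collection names are assumptions; deploying this
# would require registering it with an Atlas stream processing instance.

def build_stream_pipeline(threshold: float):
    """Filter IoT readings above a threshold and merge them into a collection."""
    return [
        # Read events from a configured streaming connection (e.g., Kafka).
        {"$source": {"connectionName": "iot_feed", "topic": "sensor_readings"}},
        # Keep only readings that exceed the alert threshold.
        {"$match": {"temperature": {"$gt": threshold}}},
        # Continuously write matching events to an Atlas collection,
        # combining data in motion with data at rest.
        {"$merge": {"into": {"connectionName": "atlas_cluster",
                             "db": "ops", "coll": "alerts"}}},
    ]

pipeline = build_stream_pipeline(90.0)
```

The same pipeline vocabulary MongoDB developers already use for queries is what makes this approach to streaming approachable.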

Streaming data includes information from sources such as IoT devices, customer behavior when browsing and inventory feeds. It is an important means of helping organizations act and react with agility.


In addition to Atlas Stream Processing, MongoDB made Atlas Search Nodes generally available on AWS and Google Cloud. It is still in preview on Microsoft Azure.

Atlas Search Nodes works in concert with Atlas Vector Search and Atlas Search to provide the infrastructure for generative AI workloads. Search Nodes work independently of MongoDB's core operational database nodes so that customers can isolate their AI workloads, leading to optimized performance that can result in cost savings.

Finally, MongoDB introduced Atlas Edge Server in public preview. The tool enables users to deploy and operate applications at the edge rather than in a central database environment so that business users can take advantage of AI-informed insights within the flow of their work.

Each of the new Atlas capabilities is helpful on its own. Their real power, however, comes from using them in unison, according to Mohan.

"I really like Atlas Stream Processing, Search Nodes and GenAI together," he said. "This combination is super powerful."

Stream Processing and Search Nodes are especially important for AI applications, he continued.

If streaming data can be ingested, vectorized and fed into models in near real time, it can be used to inform someone during a conversation with a customer. Meanwhile, if generative AI workloads run on the same nodes as other database workloads, the entire system can suffer.
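The ingest-vectorize-feed flow described above can be sketched as a small transformation step. This is a hedged illustration only: the `embed()` function is a stand-in for a real embedding model call, and in production the resulting document would land in a vector-indexed Atlas collection that models query in near real time.

```python
# Sketch of the ingest -> vectorize -> feed flow described above.
# embed() is a stand-in for a real embedding model; in production the
# output document would be written to a vector-indexed Atlas collection.

import time

def embed(text: str) -> list[float]:
    """Placeholder embedding: a real system would call an embedding model."""
    return [float(len(word)) for word in text.split()]

def vectorize_event(event: dict) -> dict:
    """Turn a raw stream event into a document ready for vector search."""
    return {
        "text": event["text"],
        "embedding": embed(event["text"]),
        "ingested_at": event.get("ts", time.time()),
    }

doc = vectorize_event({"text": "customer asked about refund status",
                       "ts": 1714000000})
```

Keeping this step fast is what allows, for example, a support agent's assistant to surface an answer while the customer is still on the call.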

"I really like that real-time streaming piece," Mohan said. "I also really like the whole idea of Search Nodes. I don't want GenAI to suddenly slow down my bread-and-butter operational workloads."

Petrie similarly highlighted the importance of Search Nodes as an enabler of the low-latency processing needed to inform real-time decisions. Together, the new Atlas features add up to a foundation for successfully running generative AI applications, he noted.

"Most data-hungry applications -- especially GenAI applications -- have low-latency requirements," Petrie said. "These Atlas enhancements are mandatory for MongoDB customers to make their GenAI applications successful."

In addition to the new Atlas features, MongoDB launched an integration between Atlas Vector Search and Amazon Bedrock.

Bedrock is a managed service from AWS that gives customers access to foundation models and LLMs from multiple AI vendors through APIs. Perhaps the main significance of the integration is that it provides joint AWS and MongoDB customers with more model choices than what is available through MAAP, according to Mohan.

Looking forward

MongoDB's latest set of new capabilities is significant in its totality, according to Petrie.

They help customers develop AI applications, feed users real-time data and include key capabilities such as RAG that make AI models more accurate. In addition, partnerships are key to providing customers with an ecosystem for AI development.

"GenAI is reinventing cloud-native software innovation," Petrie said. "These announcements show that MongoDB understands the magnitude of this industry shift and intends to play the shift to its advantage."

However, MongoDB can still do more to provide customers with all the capabilities to develop, deploy and manage AI models and applications, according to Mohan.

In particular, AI governance is an opportunity for the vendor to add new capabilities, he noted. One means could be a developer toolkit. Another could be AI agent frameworks that align development with organizational goals.

"I would like to see MongoDB embrace AI governance," Mohan said. "MongoDB has done vector search and RAG really well. Now the question is how to enable in-context learning and fine-tuning [of models]. I would like to see them launch a developer toolkit or AI agent frameworks to do more end-to-end [management]."

Eric Avidon is a senior news writer for TechTarget Editorial and a journalist with more than 25 years of experience. He covers analytics and data management.
