Mobile Monitoring Solutions


MongoDB Atlas Vector Search integration now GA with Amazon Bedrock – VentureBeat

MMS Founder
MMS RSS

Posted on mongodb google news. Visit mongodb google news



MongoDB announced that its Atlas Vector Search integration with Amazon Bedrock is now generally available. First previewed at AWS re:Invent last year, the integration lets developers connect foundation models and AI agents to the proprietary data stored in MongoDB, producing more relevant, accurate, and personalized responses through retrieval-augmented generation (RAG).

“Many businesses remain concerned about ensuring the accuracy of the outputs from AI-powered systems while also protecting their proprietary data,” Sahir Azam, MongoDB’s chief product officer, said in a statement. “We’re making it easier for joint MongoDB-[Amazon Web Services] customers to use a variety of foundation models hosted in their AWS environments to build generative AI applications that can securely use their proprietary data within MongoDB Atlas to improve accuracy and provide enhanced end-user experiences.”

Amazon Bedrock is AWS’s managed service for generative AI, giving enterprise customers a central repository for their AI app-building needs. The rapidly growing collection of available models includes those from Amazon, Anthropic, Cohere, Meta, Mistral AI, and Stability AI. While models trained by external parties are useful on their own, companies often want to combine them with their own databases, which hold far richer context about their customers than any general-purpose model has.

This is where MongoDB’s integration can matter. Developers can privately customize the foundation model of their choice with their own data. Afterwards, applications can be built around the newly customized LLMs without the need for manual intervention. “You can build these gen AI applications, but unless you can put your own real-time operational data into the models, you’re going to get generic responses,” Scott Sanchez, MongoDB’s vice president of product marketing and strategy, said during a press conference.
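The retrieval flow Sanchez describes can be sketched in a few steps: embed the user’s question with a Bedrock embedding model, find the closest private documents with Atlas Vector Search, and pass those documents to a generation model as grounding context. The sketch below is illustrative, not from the article: the cluster URI, database, collection, and index names (`appdb`, `docs`, `vector_index`) and the choice of Titan embeddings and Claude on Bedrock are all placeholder assumptions.

```python
import json


def build_vector_search_pipeline(index_name, query_vector,
                                 num_candidates=100, limit=5):
    """Build a MongoDB Atlas $vectorSearch aggregation pipeline."""
    return [
        {
            "$vectorSearch": {
                "index": index_name,          # Atlas Vector Search index name
                "path": "embedding",          # field holding the vectors
                "queryVector": query_vector,
                "numCandidates": num_candidates,
                "limit": limit,
            }
        },
        # Return only the text plus the similarity score.
        {"$project": {"_id": 0, "text": 1,
                      "score": {"$meta": "vectorSearchScore"}}},
    ]


def answer_with_rag(question):
    # Lazy imports so the pipeline builder above has no extra dependencies.
    import boto3
    from pymongo import MongoClient

    # 1. Embed the question with a Bedrock embedding model (Titan, assumed).
    bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")
    resp = bedrock.invoke_model(
        modelId="amazon.titan-embed-text-v1",
        body=json.dumps({"inputText": question}),
    )
    query_vector = json.loads(resp["body"].read())["embedding"]

    # 2. Retrieve the most similar private documents from Atlas.
    coll = MongoClient("mongodb+srv://<cluster-uri>")["appdb"]["docs"]
    context = list(coll.aggregate(
        build_vector_search_pipeline("vector_index", query_vector)))

    # 3. Ground the generation model in the retrieved context.
    prompt = ("Answer using only this context:\n"
              + "\n".join(d["text"] for d in context)
              + f"\n\nQuestion: {question}")
    gen = bedrock.invoke_model(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",
        body=json.dumps({
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": 512,
            "messages": [{"role": "user", "content": prompt}],
        }),
    )
    return json.loads(gen["body"].read())["content"][0]["text"]
```

With the GA Bedrock integration, Bedrock’s managed knowledge bases can handle the orchestration of these steps on the developer’s behalf; the sketch shows what happens under the hood.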


“This integration with MongoDB makes it really easy for folks to connect the dots,” he continues. “Customers can also privately customize their large language models…with their proprietary data by converting it into vector embeddings, stored in MongoDB, for those LLMs. For example, a retailer could develop a gen AI application that uses autonomous agents to [perform] tasks like processing real-time inventory requests or handling customer returns.”
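The conversion Sanchez mentions, turning proprietary data into vector embeddings stored in MongoDB, amounts to an ingestion step like the one sketched below. Again a hedged illustration, not code from MongoDB or AWS: the model ID, cluster URI, and collection names are placeholder assumptions, and an Atlas Vector Search index on the `embedding` field must be created separately.

```python
import json


def to_embedded_docs(texts, vectors):
    """Pair each source text with its embedding, ready for insertion."""
    if len(texts) != len(vectors):
        raise ValueError("each text needs exactly one embedding")
    return [{"text": t, "embedding": v} for t, v in zip(texts, vectors)]


def ingest(texts):
    # Lazy imports so the helper above stays dependency-free.
    import boto3
    from pymongo import MongoClient

    # Embed each document with a Bedrock embedding model (Titan, assumed).
    bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")
    vectors = []
    for t in texts:
        resp = bedrock.invoke_model(
            modelId="amazon.titan-embed-text-v1",
            body=json.dumps({"inputText": t}),
        )
        vectors.append(json.loads(resp["body"].read())["embedding"])

    # Store text and embedding side by side; $vectorSearch queries the
    # "embedding" field through the Atlas index defined on it.
    coll = MongoClient("mongodb+srv://<cluster-uri>")["appdb"]["docs"]
    coll.insert_many(to_embedded_docs(texts, vectors))
```

Keeping the embedding next to the source text in the same document is what lets a single aggregation both match on vectors and return the human-readable context for the LLM.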

This isn’t the first collaboration between MongoDB and AWS. MongoDB’s Vector Search is available on Amazon SageMaker, and Atlas is supported by Amazon CodeWhisperer. Today’s announcement comes as MongoDB reveals other efforts to help enterprise customers create AI applications, including its MongoDB AI Applications Program (MAAP).

