Month: May 2024
MongoDB has made Atlas Stream Processing, a capability it previewed last June, generally available, the company announced at its MongoDB.local event in New York City.
It added Atlas Stream Processing to its NoSQL Atlas database-as-a-service (DBaaS) to help enterprises manage real-time streaming data from multiple sources in a single interface.
The new interface can process any kind of data and has a flexible data model, the company said, freeing developers from having to use multiple specialized programming languages, libraries, application programming interfaces (APIs), and drivers, and from the complexity of juggling those tools. It can also work with both streaming and historical data using the document model.
Atlas Search Nodes is also generally available on AWS and Google Cloud, although the capability is still in preview on Microsoft Azure. This too was showcased last year: It’s a new capability inside the Atlas database that isolates search workloads from database workloads in order to maintain database and search performance.
Users will have to wait for one new capability: Atlas Edge Server. This feature, now in preview, gives developers the capability to deploy and operate distributed applications in the cloud and at the edge, the company said. It provides a local instance of MongoDB with a synchronization server that runs on local or remote infrastructure and significantly reduces the complexity and risk involved in managing applications in edge environments, allowing applications to access operational data even with intermittent connections to the cloud.
One other MongoDB feature also entered general availability: its Vector Search integration with AWS’ generative AI service, Amazon Bedrock. This means that enterprises can use the integration to customize foundation large language models with real-time operational data by converting it into vector embeddings.
Further, enterprises can also use Agents for Amazon Bedrock for retrieval-augmented generation (RAG), the company said.
A. Today’s Tech, Channel and MSP News
Are you a thought leader in managed services or managed security services, or do you have a unique approach to an important topic that could help other MSPs or MSSPs? MSSP Alert Live, the premier cybersecurity event for MSPs and MSSPs, has opened its call for papers for 2024, and we're looking for speakers who want to share insights, experience, and wisdom with peers. (You can submit yours here.)
If you’re an MSP that partners with public cloud service providers, it’s time for you to step up and get recognized. ChannelE2E opened its annual Top 250 Public Cloud MSPs survey. This is the survey that we use to create our annual Top 250 Public Cloud MSPs ranked list. Here’s the link to participate.
Kaseya Connect is wrapping up, and if you missed any news and updates from the conference — including Kaseya’s plans for AI and automation and MSPs’ move to the front lines of cybersecurity — check it out here and here, including reaction to Kaseya 365, both positive and negative.
We're also looking at the exit/acquisition of The Purple Guys, an MSP that grew out of the January 2020 merger of Enterprise Computing Services (ECS) and My IT; the combined company acquired The Purple Guys and Network Technologies in 2021 and rebranded. Kian Capital is exiting via a sale to Ntiva after The Purple Guys completed 10 acquisitions and expanded significantly.
Pax8 has a new CEO, MongoDB sets out a roadmap (road MAAP?) for organizations looking to build generative AI apps, and you should definitely watch Jessica Davis's interview with SaaS Alerts CEO Jim Lippie.
Please check out our industry Mergers and Acquisitions list, updated daily, with an archive that goes all the way back to 2020. Check out the whole thing here. Scroll down for a quick view of upcoming in-person channel and MSP events at the bottom of this post (while we spiff up our industry calendar behind the scenes; it will return!).
And please send your news tips, information, and industry chatter to senior managing editor [email protected].
1. Kian Capital Exits The Purple Guys: Kian Capital Partners is exiting its investment in The Purple Guys through a sale to Ntiva, a portfolio company of PSP Capital. Terms of the deal were not disclosed. The Purple Guys is an MSP that grew out of the January 2020 merger of Enterprise Computing Services (ECS) and My IT; the combined company acquired The Purple Guys and Network Technologies in 2021 and rebranded. Kian partnered with the executive leadership team to grow the company from a regional operation with a single location to a scaled platform serving hundreds of clients across eight core markets in six states. With Kian's backing, the company completed 10 strategic acquisitions and expanded service capabilities across the central U.S.
2. Pax8's New CEO: Pax8 has named CTO Scott Chasin as CEO, effective immediately, replacing founder John Street in the role, the company announced. Chasin had served as CTO since 2021. Street will remain as chairman of the board of directors. Chasin last year unveiled the future of Pax8's Marketplace, "designed to enable partners and vendors with data-driven insights and capabilities to accelerate their growth." The announcement comes on the heels of job cuts at the company: in April, Pax8 confirmed it would cut its workforce by almost 5%. Pax8 will stage its second annual Pax8 Beyond conference in Denver June 9-11.
3. SaaS Alerts Offers Free MSP Shield: Cloud security provider SaaS Alerts has debuted a new deal for non-partners. In an interview with ChannelE2E’s Jessica Davis at Kaseya Connect, Jim Lippie, CEO of SaaS Alerts, said the company will offer a free version of MSP Shield to managed service providers who aren’t SaaS Alerts partners to use within their own business for one year under a not-for-resale (NFR) license. Check out the full interview here.
4. MongoDB’s MAAP to Generative AI: Venture Beat reported that NoSQL database provider MongoDB has unveiled a program to help companies accelerate the building and deployment of AI-powered applications. The MongoDB AI Applications Program, or MAAP, includes everything organizations need to get started, from strategic advisory and professional services to an end-to-end technology stack. It brings together consultancies and providers of foundation models, cloud infrastructure, generative AI frameworks and model hosting with MongoDB Atlas to develop generative AI-based solutions for business problems. The bad news? MAAP won’t be available until July.
5. Time for Cloud MSPs to get Recognized: Are you an MSP that works with public cloud providers? It’s time to get recognized for your work. ChannelE2E opened its annual survey for the Top 250 Public Cloud MSPs. If your MSP practice includes working with a public cloud provider, this survey is for you. Participation is free. The survey closes in June. Submit your MSP today here for recognition! Our list will be released this summer.
B. In-Person MSP and Channel Partner Events
- Kaseya Connect – April 29-May 2, MGM Grand Resort, Las Vegas
- MSP GeekCon – May 19-21, Rosen Plaza, Orlando, Florida
- Dell Technologies World – May 20-23, The Venetian, Las Vegas
- IT Nation Secure (hosted by ConnectWise) – June 3-5, 2024, Gaylord Palms Resort & Convention Center, Orlando, Florida
- Pax8 Beyond, June 9-11, 2024, Gaylord Rockies Convention Center, Denver, Colorado
- FLOWAutomation Conference (hosted by Rewst) – June 17-19, The Renaissance Tampa International Plaza Hotel, Tampa, Florida
- CompTIA ChannelCon, July 30 – August 1, 2024, Atlanta, Georgia, Hyatt Regency Atlanta
MongoDB on Thursday launched a series of new features designed to enable customers to develop generative AI models and applications, including the general availability of Atlas Stream Processing and an integration with Amazon Bedrock.
In addition, the vendor unveiled the MongoDB AI Applications Program (MAAP) to provide users with strategic advisory services and integrated technology for building and deploying generative AI models and applications.
MongoDB revealed the new features during MongoDB.local NYC, a user event in New York City.
Taken together, the new capabilities help MongoDB stay competitive with peers such as tech giants Google and Oracle as well as specialists including MariaDB and Couchbase, according to Kevin Petrie, an analyst at BARC U.S.
All are in a race to provide customers with the most up-to-date tools to develop generative AI assets.
“This is a pretty comprehensive set of announcements,” Petrie said. “MongoDB is helping companies build GenAI applications, feed them real-time data and optimize processes such as retrieval-augmented generation to make GenAI language models more accurate.”
Based in New York City, MongoDB is a database vendor whose NoSQL platform provides users with an alternative to traditional relational databases that sometimes struggle to handle the scale of modern enterprise data workloads.
Atlas is MongoDB’s suite for developers, where the vendor has focused much of its attention over the past year as interest in developing AI models and applications has exploded.
Recent updates include the launch of Atlas Vector Search and Atlas Search Nodes in December 2023 and the November 2023 introduction of the MongoDB Partner Ecosystem Catalog where users can now access data and AI products being shared by the vendor’s many partners.
A MAAP for AI success
Generative AI has been the dominant trend in data management and analytics over the 18 months since OpenAI’s launch of ChatGPT marked a significant improvement in large language model (LLM) capabilities.
When applied to data management and analytics, generative AI has the potential to enable more users to use data to inform decisions as well as make anyone working with data more efficient.
LLMs have vast vocabularies. In addition, they can understand intent.
Therefore, when integrated with data management and analytics platforms, LLMs let users interact with the tools using true natural language processing rather than the code previously required to manage, query and analyze data.
That lets users without technical expertise use analytics tools to work with data. In addition, it helps data experts be more efficient by reducing time-consuming tasks.
As a result of generative AI's potential when combined with data management and analytics, many vendors have made generative AI a primary focus of product development, providing customers with tools such as copilots as well as creating environments where customers can develop AI applications.
For example, MicroStrategy and Microsoft have added AI assistants, while Databricks and Domo are among those that provide users with development environments for AI.
MAAP is MongoDB's environment for customers to develop AI models and applications.
The suite includes integrations with LLMs from generative AI providers such as Anthropic and Cohere, key capabilities including vector search and retrieval-augmented generation (RAG), a secure development environment and access to experts to provide assistance as organizations start with generative AI.
Petrie noted that generative AI models are becoming a must for enterprises. But for them to succeed, language models need to be combined with analytical and operational functions that are unique to an individual enterprise and help that enterprise derive business value.
MAAP is designed to enable MongoDB customers to derive that business value and is therefore an important addition to the vendor’s suite.
“MongoDB’s MAAP program … helps developers optimize how they integrate language models into enterprise workflows,” Petrie said. “MongoDB helps many innovative companies differentiate themselves with cloud-native, data-driven software, and this new program helps their customers capitalize on the GenAI application development wave.”
The program, however, has limits, according to Sanjeev Mohan, founder and principal of SanjMo.
While MAAP includes access to LLMs from certain AI vendors, it does not include access to all LLMs. That limits model choice.
“MongoDB is giving customers a curated environment, but at the cost of not letting people use any model or any integration product of their choice,” Mohan said. “It’s a trade-off. MAAP is a good thing for large enterprises that want developers to experiment. But if you want freedom, MAAP limits you to its ecosystem.”
Among the MongoDB partners that have joined MAAP to provide consulting services are Anthropic, AWS, Google Cloud and Microsoft.
More new capabilities
Beyond launching an environment for developing AI, MongoDB added new capabilities for Atlas.
Atlas Stream Processing, unveiled in preview in June 2023, is now generally available and is aimed at helping users build applications that combine data at rest with data in motion so they can respond to changing conditions and enable real-time decisions.
Streaming data includes information from sources such as IoT devices, customer browsing behavior and inventory feeds. It's an important means of helping organizations act and react with agility.
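For illustration only, the sketch below, written as Python dictionaries, shows the kind of aggregation-style pipeline Atlas Stream Processing runs: it reads events from a streaming source, aggregates them over a time window, and merges the results into an Atlas collection. The connection names, topic, and fields are hypothetical, and in practice such a pipeline is registered through MongoDB's stream-processing tooling rather than defined in application code.

```python
# Hypothetical Atlas Stream Processing pipeline, sketched as Python dictionaries.
# Connection names, topic, and fields are placeholders, not real resources.
import json

pipeline = [
    # Continuously read events from a configured streaming source (e.g. Kafka)
    {"$source": {"connectionName": "kafka_prod", "topic": "inventory_updates"}},
    # Aggregate events over one-minute tumbling windows
    {
        "$tumblingWindow": {
            "interval": {"size": 1, "unit": "minute"},
            "pipeline": [
                {"$group": {"_id": "$sku", "unitsSold": {"$sum": "$quantity"}}}
            ],
        }
    },
    # Merge the windowed results into an Atlas collection for querying
    {
        "$merge": {
            "into": {"connectionName": "atlas_cluster", "db": "retail", "coll": "sales_by_minute"}
        }
    },
]

print(json.dumps(pipeline, indent=2))
```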
In addition to Atlas Stream Processing, MongoDB made Atlas Search Nodes generally available on AWS and Google Cloud. It is still in preview on Microsoft Azure.
Atlas Search Nodes works in concert with Atlas Vector Search and Atlas Search to provide the infrastructure for generative AI workloads. Search Nodes work independently of MongoDB’s core operational database nodes so that customers can isolate their AI workloads, leading to optimized performance that can result in cost savings.
Finally, MongoDB introduced Atlas Edge Server in public preview. The tool enables users to deploy and operate applications at the edge rather than in a centralized database environment, so that business users can take advantage of AI-informed insights within the flow of their work.
Each of the new Atlas capabilities on their own is a helpful addition. Their real power, however, is using them in unison, according to Mohan.
“I really like Atlas Stream Processing, Search Nodes and GenAI together,” he said. “This combination is super powerful.”
Stream Processing and Search Nodes are especially important for AI applications, he continued.
If streaming data can be ingested, vectorized and fed into models in near real time it can be used to inform someone during a conversation with a customer. Meanwhile, if generative AI workloads run on the same nodes as other database workloads, the entire system can suffer.
“I really like that real-time streaming piece,” Mohan said. “I also really like the whole idea of Search Nodes. I don’t want GenAI to suddenly slow down my bread-and-butter operational workloads.”
Petrie similarly highlighted the importance of Search Nodes as an enabler of the low-latency processing needed to inform real-time decisions. Together, the new Atlas features add up to a foundation for successfully running generative AI applications, he noted.
“Most data-hungry applications, especially GenAI applications, have low-latency requirements,” Petrie said. “These Atlas enhancements are mandatory for MongoDB customers to make their GenAI applications successful.”
In addition to the new Atlas features, MongoDB launched an integration between Atlas Vector Search and Amazon Bedrock.
Bedrock is a managed service from AWS that gives customers access to foundation and LLMs from multiple AI vendors through APIs. Perhaps the main significance of the integration is that it provides joint AWS and MongoDB customers with more model choices than what is available through MAAP, according to Mohan.
Looking forward
MongoDB's latest set of new capabilities is significant in its totality, according to Petrie.
They help customers develop AI applications, feed users real-time data and include key capabilities such as RAG that make AI models more accurate. In addition, partnerships are key to providing customers with an ecosystem for AI development.
“GenAI is reinventing cloud-native software innovation,” Petrie said. “These announcements show that MongoDB understands the magnitude of this industry shift and intends to play the shift to its advantage.”
However, MongoDB can still do more to provide customers with all the capabilities to develop, deploy and manage AI models and applications, according to Mohan.
In particular, AI governance is an opportunity for the vendor to add new capabilities, he noted. One means could be a developer toolkit. Another could be AI agent frameworks that align development with organizational goals.
“I would like to see MongoDB embrace AI governance,” Mohan said. “MongoDB has done vector search and RAG really well. Now the question is how to enable in-context learning and fine-tuning [of models]. I would like to see them launch a developer toolkit or AI agent frameworks to do more end-to-end [management].”
Eric Avidon is a senior news writer for TechTarget Editorial and a journalist with more than 25 years of experience. He covers analytics and data management.
- MongoDB unveiled a slew of updates for its Atlas developer platform, including integrations with AWS and Google Cloud AI tools
- The company thinks data is the key to AI acceleration
- It also wants to connect enterprises with the right AI partners via its MAAP approach
MongoDB thinks it holds the key that will let enterprises rev their artificial intelligence (AI) engines: data.
At its MongoDB.local NYC conference this week, the company unveiled a slew of updates designed to supercharge the development of artificial intelligence applications.
Among other things, it announced it is working with Google Cloud to integrate MongoDB knowledge into the latter’s Gemini Code Assist tool for developers; updated its MongoDB Atlas application development platform to enable the use of real-time streaming data with Stream Processing; topped up distributed application capabilities with Edge Server; and announced its MongoDB Atlas vector search capabilities are now generally available on Knowledge Bases for Amazon Bedrock.
That last announcement will essentially make it easier for developers to build retrieval augmented generation applications on Bedrock using specific enterprise data from MongoDB Atlas.
All of these announcements are part of a plan to allow enterprises to leverage data stored in MongoDB for AI development.
On a pre-brief call with journalists, MongoDB VP of Product Marketing and Strategy Scott Sanchez said the company essentially sees itself sitting at the center of the AI technology stack.
“Another way to think about it is sort of like bookends, with the GPUs and the models on one side and the apps on the other side. But just buying and investing in bookends without any books is pretty pointless, and that’s the data in the middle,” he said. “The importance of data has risen exponentially with this shift towards AI.”
“AI is really data hungry, but the data that feeds these models and helps with the inference is constantly changing,” he continued. “So, these sophisticated AI models really benefit from having a unified picture of your data.”
That's exactly what MongoDB thinks it offers with Atlas. Launched in 2016, Atlas is used by nearly 50,000 customers, including more than half of Fortune 100 companies, Sanchez said.
Hence this week's enhancements to Atlas and the launch of the new MongoDB AI Applications Program. The latter, which the company calls MAAP for short, brings together not just MongoDB's platform but also partners offering foundation models, cloud infrastructure, security and AI consultancy services.
Greg Maxson, senior director of AI go-to-market and strategic partnerships, said MAAP was developed based on MongoDB’s experiences building generative AI applications with “hundreds” of early movers in the space.
“MAAP simplifies common first-mile challenges and streamlines the development process,” Maxson said.
He added the program “accelerates time to development,” with customers who have used the MAAP approach able to deploy generative AI applications in about six weeks.
MongoDB, of course, isn’t the only cloud database company out there. Gartner noted the company competes with services from the likes of AWS, Oracle, Microsoft, Aerospike, Couchbase, Cloudera, Cockroach Labs, MariaDB, Redis, and Snowflake, among others.
On Thursday, MongoDB (MDB) received an upgrade to its Relative Strength (RS) Rating, from 63 to 75.
This exclusive rating from Investor’s Business Daily measures share price performance with a 1 (worst) to 99 (best) score. The rating shows how a stock’s price behavior over the last 52 weeks stacks up against all the other stocks in our database.
Over 100 years of market history reveals that the best stocks tend to have an RS Rating north of 80 in the early stages of their moves. See if MongoDB can continue to show renewed price strength and hit that benchmark.
MongoDB is not currently offering a proper buying opportunity. See if the stock goes on to form a chart pattern that could kick off a new climb.
MongoDB showed 51% earnings growth in its most recent report. Revenue gains came in at 27%. The next quarterly numbers are expected on or around May 31.
The company holds the No. 4 rank among its peers in the Computer Software-Database industry group. Confluent (CFLT) is the No. 1-ranked stock within the group.
MongoDB announced that its Atlas Vector Search integration with Amazon Bedrock is now available to the public. First previewed at Amazon Re:Invent last year, the tie-up allows developers to sync their foundation models and AI agents with the proprietary data stored within MongoDB, resulting in more relevant, accurate and personalized responses through the use of Retrieval Augmented Generation (RAG).
“Many businesses remain concerned about ensuring the accuracy of the outputs from AI-powered systems while also protecting their proprietary data,” Sahir Azam, MongoDB’s chief product officer, said in a statement. “We’re making it easier for joint MongoDB-[Amazon Web Services] customers to use a variety of foundation models hosted in their AWS environments to build generative AI applications that can securely use their proprietary data within MongoDB Atlas to improve accuracy and provide enhanced end-user experiences.”
Amazon Bedrock is AWS's managed service for gen AI, providing enterprise customers a central repository for all their AI app-building needs. Among the rapidly growing collection of models available are those from Amazon, Anthropic, Cohere, Meta, Mistral, and Stability AI. While using models trained by external parties can be helpful, companies may prefer to leverage their own databases, which give the models greater context about their customers than publicly available data alone would.
This is where MongoDB’s integration can matter. Developers can privately customize the foundation model of their choice with their own data. Afterwards, applications can be built around the newly-trained LLMs without the need for manual intervention. “You can build these gen AI applications, but unless you can put your own real-time operational data into the models, you’re going to get generic responses,” Scott Sanchez, MongoDB’s vice president of product marketing and strategy, says during a press conference.
“This integration with MongoDB makes it really easy for folks to connect the dots,” he continues. “Customers can also privately customize their large language models…with their proprietary data by converting it into vector embeddings, stored in MongoDB, for those LLMs. For example, a retailer could develop a gen AI application that uses autonomous agents to [perform] tasks like processing real-time inventory requests or handling customer returns.”
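To make that workflow more concrete, here is a minimal sketch, not MongoDB's or AWS's reference code, that uses Amazon Bedrock's Titan embedding model to turn a few product descriptions into vector embeddings and store them in a MongoDB Atlas collection. The connection string, database, collection, and sample text are hypothetical placeholders.

```python
# Minimal sketch (hypothetical names): embed text with Amazon Bedrock's
# Titan Embeddings G1 - Text model and store the vectors in MongoDB Atlas.
import json

import boto3
from pymongo import MongoClient

bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")
collection = MongoClient(
    "mongodb+srv://<user>:<password>@<cluster>.mongodb.net"  # placeholder URI
)["shop"]["products"]

def embed(text: str) -> list[float]:
    # Titan Embeddings G1 - Text returns a 1536-dimensional vector
    response = bedrock_runtime.invoke_model(
        modelId="amazon.titan-embed-text-v1",
        contentType="application/json",
        accept="application/json",
        body=json.dumps({"inputText": text}),
    )
    return json.loads(response["body"].read())["embedding"]

for description in ["Trail running shoes, size 42", "Waterproof hiking jacket"]:
    # Store the raw text alongside its embedding so Vector Search can query it
    collection.insert_one({"description": description, "embedding": embed(description)})
```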
This isn’t the first collaboration between MongoDB and AWS. MongoDB’s Vector Search is available on Amazon SageMaker, and Atlas is supported by CodeWhisperer. Today’s announcement comes as MongoDB reveals other efforts to help enterprise customers create AI applications, including its AI Applications Program (MAAP).
Foundational models (FMs) are trained on large volumes of data and use billions of parameters. However, in order to answer customers’ questions related to domain-specific private data, they need to reference an authoritative knowledge base outside of the model’s training data sources. This is commonly achieved using a technique known as Retrieval Augmented Generation (RAG). By fetching data from the organization’s internal or proprietary sources, RAG extends the capabilities of FMs to specific domains, without needing to retrain the model. It is a cost-effective approach to improving model output so it remains relevant, accurate, and useful in various contexts.
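To make the pattern concrete, here is a tiny, self-contained sketch of prompt augmentation, the step at the heart of RAG. The context snippets and question are invented for illustration; in a real system the snippets would be retrieved from the organization's own data, typically via a vector search.

```python
# Hypothetical sketch of prompt augmentation in a RAG workflow.
retrieved_chunks = [
    "Refund policy: customers may return items within 30 days of purchase.",
    "Shipping: orders over $50 ship free within the US.",
]  # in practice, retrieved from the organization's private data store

question = "Can I return a jacket I bought three weeks ago?"

prompt = (
    "Answer the question using only the context below.\n\n"
    "Context:\n" + "\n".join(retrieved_chunks) + "\n\n"
    "Question: " + question
)

# `prompt` would then be sent to a foundation model (for example, via Amazon
# Bedrock), which answers from the supplied context rather than its training
# data alone, keeping responses relevant to the private domain.
print(prompt)
```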
Knowledge Bases for Amazon Bedrock is a fully managed capability that helps you implement the entire RAG workflow from ingestion to retrieval and prompt augmentation without having to build custom integrations to data sources and manage data flows.
Today, we are announcing the availability of MongoDB Atlas as a vector store in Knowledge Bases for Amazon Bedrock. With MongoDB Atlas vector store integration, you can build RAG solutions to securely connect your organization’s private data sources to FMs in Amazon Bedrock. This integration adds to the list of vector stores supported by Knowledge Bases for Amazon Bedrock, including Amazon Aurora PostgreSQL-Compatible Edition, vector engine for Amazon OpenSearch Serverless, Pinecone, and Redis Enterprise Cloud.
Build RAG applications with MongoDB Atlas and Knowledge Bases for Amazon Bedrock
Vector Search in MongoDB Atlas is powered by the vectorSearch index type. In the index definition, you must specify the field that contains the vector data as the vector type. Before using MongoDB Atlas Vector Search in your application, you need to create an index, ingest source data, create vector embeddings, and store them in a MongoDB Atlas collection. To perform queries, you convert the input text into a vector embedding and then use an aggregation pipeline stage to run vector search queries against fields indexed as the vector type in a vectorSearch-type index.
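As a rough illustration of such a query (not part of this walkthrough's own code), the PyMongo sketch below runs a $vectorSearch aggregation stage against a collection. The index name, field name, and query vector are hypothetical placeholders; in practice the query vector would be the embedding of the input text at the index's full dimensionality.

```python
# Hypothetical $vectorSearch query with PyMongo; names and values are placeholders.
from pymongo import MongoClient

collection = MongoClient(
    "mongodb+srv://<user>:<password>@<cluster>.mongodb.net"  # placeholder URI
)["docs_db"]["docs_collection"]

# In a real query this must match the index's dimensionality (e.g. 1536 values).
query_vector = [0.012, -0.034, 0.056]

pipeline = [
    {
        "$vectorSearch": {
            "index": "vector_index",      # name of the vectorSearch index
            "path": "embedding",          # field indexed with type "vector"
            "queryVector": query_vector,
            "numCandidates": 100,         # candidates considered before ranking
            "limit": 5,                   # top results returned
        }
    },
    {"$project": {"text": 1, "score": {"$meta": "vectorSearchScore"}}},
]

for doc in collection.aggregate(pipeline):
    print(doc.get("text"), doc["score"])
```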
Thanks to the MongoDB Atlas integration with Knowledge Bases for Amazon Bedrock, most of the heavy lifting is taken care of. Once the vector search index and knowledge base are configured, you can incorporate RAG into your applications. Behind the scenes, Amazon Bedrock will convert your input (prompt) into embeddings, query the knowledge base, augment the FM prompt with the search results as contextual information and return the generated response.
Let me walk you through the process of setting up MongoDB Atlas as a vector store in Knowledge Bases for Amazon Bedrock.
Configure MongoDB Atlas
Start by creating a MongoDB Atlas cluster on AWS. Choose an M10 dedicated cluster tier. Once the cluster is provisioned, create a database and collection. Next, create a database user and grant it the Read and write to any database role. Select Password as the Authentication Method. Finally, configure network access to modify the IP Access List – add IP address 0.0.0.0/0 to allow access from anywhere.
Use the following index definition to create the Vector Search index:
{
  "fields": [
    {
      "numDimensions": 1536,
      "path": "AMAZON_BEDROCK_CHUNK_VECTOR",
      "similarity": "cosine",
      "type": "vector"
    },
    {
      "path": "AMAZON_BEDROCK_METADATA",
      "type": "filter"
    },
    {
      "path": "AMAZON_BEDROCK_TEXT_CHUNK",
      "type": "filter"
    }
  ]
}
Configure the knowledge base
Create an AWS Secrets Manager secret to securely store the MongoDB Atlas database user credentials. Choose Other as the Secret type. Create an Amazon Simple Storage Service (Amazon S3) storage bucket and upload the Amazon Bedrock documentation user guide PDF. Later, you will use the knowledge base to ask questions about Amazon Bedrock.
You can also use another document of your choice because Knowledge Base supports multiple file formats (including text, HTML, and CSV).
Navigate to the Amazon Bedrock console and refer to the Amazon Bedrock User Guide to configure the knowledge base. In the Select embeddings model and configure vector store step, choose Titan Embeddings G1 – Text as the embedding model. From the list of databases, choose MongoDB Atlas.
Enter the basic information for the MongoDB Atlas cluster (Hostname, Database name, etc.) as well as the ARN of the AWS Secrets Manager secret you created earlier. In the Metadata field mapping attributes, enter the vector store specific details. They should match the vector search index definition you used earlier.
Initiate the knowledge base creation. Once complete, synchronize the data source (S3 bucket data) with the MongoDB Atlas vector search index.
Once the synchronization is complete, navigate to MongoDB Atlas to confirm that the data has been ingested into the collection you created.
Notice the following attributes in each of the MongoDB Atlas documents:
- AMAZON_BEDROCK_TEXT_CHUNK – Contains the raw text for each data chunk.
- AMAZON_BEDROCK_CHUNK_VECTOR – Contains the vector embedding for the data chunk.
- AMAZON_BEDROCK_METADATA – Contains additional data for source attribution and rich query capabilities.
Test the knowledge base
It’s time to ask questions about Amazon Bedrock by querying the knowledge base. You will need to choose a foundation model. I picked Claude v2 in this case and used “What is Amazon Bedrock” as my input (query).
If you are using a different source document, adjust the questions accordingly.
You can also change the foundation model. For example, I switched to Claude 3 Sonnet. Notice the difference in the output and select Show source details to see the chunks cited for each footnote.
Integrate knowledge base with applications
To build RAG applications on top of Knowledge Bases for Amazon Bedrock, you can use the RetrieveAndGenerate API which allows you to query the knowledge base and get a response.
Here is an example using the AWS SDK for Python (Boto3):
import boto3

bedrock_agent_runtime = boto3.client(
    service_name="bedrock-agent-runtime"
)

def retrieveAndGenerate(input, kbId):
    return bedrock_agent_runtime.retrieve_and_generate(
        input={
            'text': input
        },
        retrieveAndGenerateConfiguration={
            'type': 'KNOWLEDGE_BASE',
            'knowledgeBaseConfiguration': {
                'knowledgeBaseId': kbId,
                'modelArn': 'arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-3-sonnet-20240229-v1:0'
            }
        }
    )

response = retrieveAndGenerate("What is Amazon Bedrock?", "BFT0P4NR1U")["output"]["text"]
If you want to further customize your RAG solutions, consider using the Retrieve API, which returns the semantic search responses that you can use for the remaining part of the RAG workflow.
import boto3

bedrock_agent_runtime = boto3.client(
    service_name="bedrock-agent-runtime"
)

def retrieve(query, kbId, numberOfResults=5):
    return bedrock_agent_runtime.retrieve(
        retrievalQuery={
            'text': query
        },
        knowledgeBaseId=kbId,
        retrievalConfiguration={
            'vectorSearchConfiguration': {
                'numberOfResults': numberOfResults
            }
        }
    )

response = retrieve("What is Amazon Bedrock?", "BGU0Q4NU0U")["retrievalResults"]
Things to know
- MongoDB Atlas cluster tier – This integration requires an Atlas cluster tier of at least M10.
- AWS PrivateLink – For the purposes of this demo, MongoDB Atlas database IP Access List was configured to allow access from anywhere. For production deployments, AWS PrivateLink is the recommended way to have Amazon Bedrock establish a secure connection to your MongoDB Atlas cluster. Refer to the Amazon Bedrock User guide (under MongoDB Atlas) for details.
- Vector embedding size – The dimension size of the vector index and the embedding model should be the same. For example, if you plan to use Cohere Embed (which has a dimension size of 1024) as the embedding model for the knowledge base, make sure to configure the vector search index accordingly.
- Metadata filters – You can add metadata for your source files to retrieve a well-defined subset of the semantically relevant chunks based on applied metadata filters. Refer to the documentation to learn more about how to use metadata filters.
Now available
MongoDB Atlas vector store in Knowledge Bases for Amazon Bedrock is available in the US East (N. Virginia) and US West (Oregon) Regions. Be sure to check the full Region list for future updates.
Learn more
Try out the MongoDB Atlas integration with Knowledge Bases for Amazon Bedrock! Send feedback to AWS re:Post for Amazon Bedrock or through your usual AWS contacts and engage with the generative AI builder community at community.aws.
— Abhishek
The king of NoSQL MongoDB has launched the MongoDB AI Applications Program (MAAP) for enterprises to build and deploy advanced generative AI applications at scale.
Cohere, a leading enterprise AI platform, is joining MongoDB as one of the first partners in the new MongoDB AI Applications Program (MAAP), leveraging its advanced generative models, specifically the Command R series, to enhance business operations globally.
Cohere's enterprise AI suite improves LLM applications by integrating end-to-end retrieval-augmented generation (RAG), a key component for businesses personalising AI with their data. The Command R series, central to this suite, features advanced RAG techniques to prevent errors and supports tools for automating complex business processes. These models are adaptable across ten languages, catering to global operations and ensuring scalability, efficiency, and accuracy.
Additionally, the startup's Embed models enhance these capabilities, supporting enterprise search in over 100 languages and bolstering the robustness of RAG applications.
“Organisations of all sizes across industries are eager to get started with applications enriched with generative AI capabilities but many are unsure how to get started effectively,” said Alan Chhabra, EVP of Worldwide Partners at MongoDB. “The MongoDB AI Applications Program helps address this challenge, and we’re excited to have Cohere as a launch partner for the program,” he added.
Key Features of MAAP
MongoDB’s flagship product, MongoDB Atlas Vector Search, includes several generative AI application development features.
The new program is designed to assist organisations in overcoming the common challenges associated with adopting generative AI technologies. This includes navigating outdated technology and complex, costly bolt-on solutions that hinder scalability and security.
It is supported by a network of other partners, including Anthropic, Amazon Web Services, Google Cloud, and Microsoft, which ensure a wide range of expertise and resources for participating businesses.
Additionally, MAAP facilitates the development of secure and reliable AI applications by integrating trusted foundation models with robust governance controls to ensure data accuracy and safety. The program also includes personalized engagement sessions, strategic roadmapping, and hands-on support to help enterprises build, deploy, and scale their AI solutions effectively.
MongoDB Launches New Program for Enterprises to Build Modern Applications … – CXOToday.com
MongoDB, Inc. (NASDAQ: MDB) today announced the MongoDB AI Applications Program (MAAP) that is designed to help organizations rapidly build and deploy modern applications enriched with generative AI technology at enterprise scale. MAAP provides customers strategic advisory, professional services, and an integrated end-to-end technology stack from MongoDB and its partners. MAAP launch partners include industry-leading consultancies and foundation model (FM), cloud infrastructure, and generative AI framework and model hosting providers that together with MongoDB help customers identify business problems that can be solved with advanced AI-powered applications. MAAP is designed to be a one-stop solution for enterprises that want to quickly and effectively embed generative AI into applications with the required technology stack and expertise in place.
“We’ve seen tremendous enthusiasm for generative AI among our customers, from agile startups to established global enterprises,” said Alan Chhabra, EVP of Worldwide Partners at MongoDB. “These organizations leverage MongoDB’s cutting-edge technology and comprehensive services to transform innovative concepts into real-world applications. However, some are still navigating how best to integrate generative AI to solve the right business problems. That’s where the MongoDB AI Applications Program comes in. This program combines our robust developer data platform, MongoDB Atlas, with our own expertise, professional services, and strategic partnerships with leaders in generative AI technologies to provide comprehensive roadmaps for organizations of all sizes to confidently adopt and implement generative AI. With the MongoDB AI Applications Program, we and our partners help customers use generative AI to enhance productivity, revolutionize customer interactions, and drive industry advancements.”
Generative AI-driven innovation has set a clear imperative for every organization—to compete, they need to modernize their applications to meet and exceed customer expectations. Organizations of every size across industries want to immediately take advantage of this technological shift, but are unsure if they have the right data strategy and technology in place that will lead to success in building, deploying, and scaling new classes of applications securely and reliably. Many organizations are stuck with ineffective ways of working with data because of legacy technology that cannot scale, while others attempt to use single-purpose, bolt-on solutions that introduce unnecessary complexity and cost. As a result, these organizations wind up trading off long-term success—because of outdated technology or add-on solutions—for short-term results with proofs of concept that cannot scale to production, offer enterprise-grade security and reliability, or provide a meaningful return on their investment.
MAAP addresses these challenges by providing customers with the strategic framework, expert professional services, and technology roadmap to identify and work backward from business problems, rapidly build and iterate on solutions, and to optimize innovative generative AI applications—empowering customers to put them into production. Combined with MongoDB’s technology that enables companies to deploy generative AI applications with a unified developer data platform, MAAP delivers an end-to-end solution through its partnerships with industry-leading consultancies and FM, cloud infrastructure, and generative AI framework and model hosting providers—including Anthropic, Anyscale, Amazon Web Services (AWS), Cohere, Credal.ai, Fireworks.ai, Google Cloud, gravity9, LangChain, LlamaIndex, Microsoft Azure, Nomic, PeerIslands, Pureinsights, and Together AI. MAAP offers customers the technology, full-service engagement, and expert support required to:
Develop end-to-end strategies and roadmaps to build, deploy, and scale generative AI applications with hands-on support: Engagement with MAAP begins with highly personalized deep dives. MongoDB Professional Services evaluates an organization’s current technology stack and works with customers to identify business problems to work backward from. MongoDB Professional Services and consultancy partners then develop strategic roadmaps to rapidly prototype the required architecture, validate that initial results meet customer expectations, and then optimize fully built applications that customers can evolve for use in production. Customers can continue receiving support from MongoDB Professional Services to develop new generative AI features depending on their needs.
Build high-performing generative AI applications that are secure, reliable, and trustworthy: Enterprises must trust that new technology being deployed throughout their organization and in customer-facing applications will behave as expected and will not inadvertently expose sensitive data. MAAP offers a curated selection of leading FMs that have been designed for safety, trustworthiness, and usefulness by MAAP partners. By combining these FMs with techniques like retrieval-augmented generation (RAG) using proprietary data with robust governance controls, customers can reduce model hallucinations by controlling exactly what data is provided to FMs to give them the required context that improves accuracy. Customers can also use optimized fine-tuning and inference services through MAAP partners for domain-specific use cases and fast AI model response times using models from Anthropic, Cohere, Meta, Mistral, OpenAI, and more. With MAAP, customers get the generative AI reference architectures, integrated technology, and prescriptive guidance needed for their use cases through hands-on professional services to help them build secure, high-performing applications that function as intended.
Engage in generative AI jump-start sessions with industry experts: Organizations not yet ready to adopt generative AI at scale can customize their MAAP engagement to provide their teams prototyping sessions in a secure, private, sandbox environment. For example, a customized engagement with MAAP can include an organization’s strategy, operations, IT, or software development teams—or a combination of several teams—where an expert-led session brings together different perspectives and aims to identify an internal business challenge that generative AI can be applied to. MongoDB Professional Services can then lead a hackathon to collaboratively build a solution and test its effectiveness for internal use cases. MAAP provides the education, resources, and technology needed to quickly build a tangible solution to see generative AI in action solving a specific business problem.
MongoDB customers welcome MAAP to help them build applications enriched with generative AI
ACI Worldwide powers mission-critical real-time payment software solutions for thousands of organizations around the world. Many of the world’s largest financial institutions rely on ACI to process and manage digital payments, power omni-commerce payments, and to manage fraud and risk. “MongoDB and PeerIslands have been working closely with ACI Worldwide to modernize and transform our critical business applications, and the results in this initial phase have been impressive,” said Abe Kuruvilla, CTO at ACI Worldwide. “By relying on MongoDB Atlas, and by leveraging MongoDB’s AI solutions, we’ve been able to speed up our modernization process and to reduce the burden of database management faced by our developers, allowing them to focus on innovation. ACI Worldwide looks forward to continuing to work closely with MongoDB, PeerIslands, and MAAP partners through the new MongoDB AI Applications Program.”
Arc XP is the SaaS content platform built to power sophisticated storytelling and digital experiences for customers with big stories to tell. “We needed to come up with a strategy that enabled us to keep pace with our growth while maintaining pace of feature development. That’s why we chose to go to MongoDB Atlas,” said Joe Croney, CTO at Arc XP. “Enabling developers to build more value for customers is ultimately what we’re here to do, and that’s why MongoDB has been so effective as a partner. The story of Arc XP is one of hypergrowth, and MongoDB Atlas has been fantastic in enabling our business transformation. Our recent advancements, particularly in AI-driven features, have thrived thanks to our deep engagement with MongoDB’s Professional Services, Technology, and Account teams. These partnerships have been crucial, allowing Arc XP engineers to innovate using MongoDB’s cutting-edge technologies, including MongoDB Atlas Vector Search.”
Scalestack’s AI-powered, all-in-one go-to-market data orchestration and activation platform helps customers spend less time manually researching and organizing data, and more time executing. “Scalestack’s mission is to help organizations unlock sales productivity, and our relationship with MongoDB and its partners has been integral to that,” said Elio Narciso, Co-founder and CEO at Scalestack. “Scalestack’s Spotlight AI copilot—which helps sales representatives prioritize and choose their targets, as well as draft personalized emails and scripts based on unstructured data—uses MongoDB Atlas Vector Search’s RAG abilities to perform quick searches over large datasets using vector similarity. MongoDB’s partnerships and integrations have helped us fine-tune AI models to specific use cases, enhancing Spotlight’s performance. We’re excited to continue innovating on behalf of customers by working with MongoDB and its partners through the new MongoDB AI Applications Program.”
MongoDB partners joining MAAP to provide strategic consulting, technology, and expertise to customers
Anthropic, a world-leading AI company, pioneers trusted, safe, and reliable AI solutions for enterprises. Its advanced Claude 3 model family delivers valuable AI assistance in data analysis, content creation, customer support, project management, and strategic decision-making across every team. “Deploying accurate and reliable AI solutions is a top priority for enterprises today. Through our participation in MongoDB’s AI Applications Program, more businesses will harness the power of Anthropic’s industry-leading Claude 3 model family to develop tailored solutions that drive real business impact,” said Kate Jensen, Head of Revenue at Anthropic. “By integrating Anthropic’s powerful foundation models with MongoDB’s cutting-edge technology and expertise, more enterprises in regulated industries such as financial services and healthcare will confidently deploy customized AI solutions that meet their standards on safety and security.”
Since 2006, Amazon Web Services has been the world’s most comprehensive and broadly adopted cloud. “Generative AI is a transformative technology that has the promise to change how organizations of all sizes across industries conduct business, optimize their operations, and interact with their customers,” said Chris Grusz, Managing Director of Technology Partnerships at AWS. “AWS and MongoDB have been working together to integrate services for several years, helping customers leverage the strength of both companies to help developers innovate more rapidly. The MongoDB AI Applications Program will help build on the impact we’ve already been helping customers achieve.”
Google Cloud is the new way to the cloud, providing AI, infrastructure, developer, data, security, and collaboration tools built for today and tomorrow. “MongoDB and Google Cloud have helped joint customers enhance their businesses with AI for many years, and we’re excited to extend our collaboration with support for the MongoDB AI Applications Program,” said Stephen Orban, VP of Migrations, ISVs, and Marketplace at Google Cloud. “Together, we will provide enterprises with the technical resources, frameworks, and governance tools required to more effectively build and deploy generative AI applications on Google Cloud’s AI-optimized infrastructure.”
Microsoft enables digital transformation for the era of an intelligent cloud and an intelligent edge to empower every person and every organization on the planet to achieve more. “Many customers want to get up and running quickly with generative AI but are sometimes overwhelmed by the many choices of technologies and the expertise required to build modern applications,” said Alvaro Celis, Vice President of Global ISV Commercial Solutions at Microsoft. “The new MongoDB AI Applications Program will help organizations reduce the complexity and time it takes to build applications enriched with generative AI by offering them integrated technology solutions and hands-on support. We are excited to be a launch partner for the program to help our joint customers with MongoDB take advantage of this game-changing technology.”
Organizations can learn more about MAAP by visiting the program website.
This announcement and more will be featured in the MongoDB.local NYC keynote delivered by MongoDB President and CEO Dev Ittycheria and Chief Product Officer Sahir Azam, which can be viewed via live-stream here beginning at 10:00am ET on May 2.
About MongoDB Atlas
MongoDB Atlas is the leading multi-cloud developer data platform that accelerates and simplifies building modern applications with a highly flexible, performant, and globally distributed operational database at its core. By providing an integrated set of data and application services in a unified environment, MongoDB Atlas enables development teams to quickly build with the security, performance, and scale modern applications require. Millions of developers and tens of thousands of customers across industries—including Cisco, GE Healthcare, Intuit, Toyota Financial Services, and Verizon—rely on MongoDB Atlas every day to innovate more quickly, efficiently, and cost-effectively for virtually every use case across the enterprise. To get started with MongoDB Atlas, visit mongodb.com/atlas.
About MongoDB
Headquartered in New York, MongoDB’s mission is to empower innovators to create, transform, and disrupt industries by unleashing the power of software and data. Built by developers, for developers, MongoDB’s developer data platform is a database with an integrated set of related services that allow development teams to address the growing requirements for today’s wide variety of modern applications, all in a unified and consistent user experience. MongoDB has tens of thousands of customers in over 100 countries. The MongoDB database platform has been downloaded hundreds of millions of times since 2007, and there have been millions of builders trained through MongoDB University courses. To learn more, visit mongodb.com.