Why AI needs a new approach to unstructured, fast-changing data: MongoDB Exec

MMS Founder
MMS RSS

Posted on mongodb google news. Visit mongodb google news


As Indian enterprises push forward with digital transformation, data modernization has become a central focus. Companies are moving to cloud-native systems, using real-time data, and beginning to build AI-driven applications. 

MongoDB is working across the ecosystem, from startups and software vendors to large enterprises, to support this shift.  

In this conversation, Sachin Chawla, VP for India & ASEAN at MongoDB, talks about how Indian enterprises are approaching data transformation, the challenges they face, and how MongoDB is adapting its strategy to meet these changing needs. Edited Excerpts:  

What’s fundamentally changing in how Indian enterprises approach data and digital transformation, and how is that shaping your strategy in the region? 

India is going through a significant phase in technology. While we’ll get to the challenges shortly, it’s worth highlighting the strong momentum we’re seeing in modern application development across the ecosystem. 

This includes the startup space, where innovation is active, from early-stage companies like RFPIO to large-scale players like Zepto and Zomato. There’s also a robust ISV ecosystem here, which operates quite differently compared to markets like the US or EMEA. For instance, many Indian banks rely heavily on ISVs, often referred to as “bank tech”, for their software needs. Companies like CRMnext, Intellect AI, and Lentra are key examples, and many of them use MongoDB. 

Beyond startups and ISVs, significant digital transformation is happening in large enterprises as well. Take Tata, for example, the Tata Neu app runs on MongoDB. So overall, there’s a lot of progress and activity in the ecosystem. 

Now, on to the challenges. Broadly, they’re similar to those faced globally and can be grouped into three areas. First, improving developer productivity: every organization is focused on helping developers move faster and work more efficiently. Second, building modern AI applications: there’s growing pressure, both from the ecosystem and from leadership, to deliver AI-driven solutions.

Third, modernizing legacy applications. Many existing systems were built over years and aren’t designed to meet today’s demands. Users expect immediate, responsive digital experiences, and older systems can’t keep up. These are the key priorities: boosting developer productivity, adopting AI, and modernizing legacy systems. 

What are the biggest misconceptions Indian enterprises still have about modern databases, and how do these hold back their digital transformation? 

First, some organizations treat modernization as simply moving their existing systems from on-premises to the cloud. But lifting and shifting the stack without changing the application, the underlying infrastructure, or the database doesn’t actually modernize anything. It’s still the same application, just running in a different environment, and it won’t deliver new value just by being in the cloud. 

Second, there’s the idea of using the best purpose-built database for each use case. In practice, this often leads to an overload of different databases within one organization. While each one might be optimal for its specific function, managing a large variety becomes a challenge. You end up spending more time and resources maintaining the system than actually innovating with it. 

Third, when it comes to AI, many organizations lack a clear strategy. They often start building without clearly defined use cases or objectives, just reacting to pressure to “do something with AI.” Without a focused plan, it’s hard to deliver meaningful outcomes. 

Which industries in India are making the boldest or most unexpected moves in digital transformation or data modernization, and why? 

Every sector is adopting AI in its own way. Tech startups and digital-native companies tend to make faster, more visible moves, but even large enterprises are accelerating their efforts. 

For example, we recently partnered with Intellect AI, an independent software vendor serving the global banking sector. They aimed to lead the way in building a multi-agent platform for banking clients to automate and augment operations in areas like compliance and risk, critical functions for many institutions. 

We helped them develop this platform using MongoDB and MongoDB’s AI vector search. The result is called Purple Fabric, and it’s publicly available. This platform is now driving automation and augmentation in compliance and risk management. 

One of their major clients is a sovereign fund managing $1.5 trillion in assets across around 9,000 companies. Using Purple Fabric, they automated ASC compliance processes with up to 90% accuracy. 

This example shows that while enterprises may seem slower, companies like Intellect AI are enabling them to move quickly by building powerful tools tailored for complex environments. 

What recurring data architecture issues do you see in enterprise AI projects, and how does your company help address them? 

When you look at AI applications, it’s important to understand that the data used to build them is mostly unstructured. This data is often messy, comes from various sources, and appears in different formats such as video, audio, and text. Much of it is interconnected, and the overall volume is massive. Additionally, the data changes rapidly. 

These are the three core characteristics of AI data: it’s unstructured, high in volume, and constantly evolving. As a result, if you look at an AI application a year after it’s built without any updates, it’s likely already outdated. Continuous updates are essential. 

MongoDB stores data in a document format, unlike traditional systems that use rows and columns. Trying to store large volumes of unstructured and fast-changing data in a tabular format becomes unmanageable. You’d end up with thousands of tables, all linked in complex ways. Any change in one table could affect hundreds or thousands of others, making maintenance difficult. 
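To make the contrast concrete, here is a toy sketch in plain Python (the field names are hypothetical, and this is not MongoDB's API) of the same customer record modelled as one self-describing document versus three joined tables:

```python
# Toy illustration: one document holds nested, fast-changing attributes
# that a tabular schema would spread across many linked tables.
customer_doc = {
    "_id": "cust-001",
    "name": "Asha Rao",
    "channels": ["app", "web"],                   # variable-length list
    "orders": [                                   # nested sub-documents
        {"order_id": "o-17", "amount_inr": 1250, "items": ["earbuds"]},
        {"order_id": "o-23", "amount_inr": 499, "items": ["case", "cable"]},
    ],
    "support_transcript": "free-form text from a support chat",
}

# A brand-new attribute is a one-line change to the document; a relational
# model would need a schema change or a new table plus join keys.
customer_doc["loyalty_tier"] = "gold"

# The equivalent normalized layout splits the record across tables:
customers_table = [("cust-001", "Asha Rao")]
orders_table = [("o-17", "cust-001", 1250), ("o-23", "cust-001", 499)]
order_items_table = [("o-17", "earbuds"), ("o-23", "case"), ("o-23", "cable")]
```

The document version absorbs new and irregular fields in place; the tabular version must propagate every such change across its schema and join keys.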

This is why many modern applications are built on MongoDB rather than on legacy systems. For example, Intellect AI uses MongoDB, as does DarwinBox, which uses AI to power talent search queries like finding the percentage of top-rated employees. Previously, this kind of semantic search would take much longer. 

Another example is Ubuy, an e-commerce platform with around 300 million products. They switched from a SQL database to MongoDB with vector search. Search times went from several seconds to milliseconds, enabling efficient semantic search. 

RFPIO is another case. It uses vector search to process and understand RFP documents, identifying which sections relate to topics like security or disaster recovery. This simplifies the process of responding to RFPs. 

As enterprises shift to unstructured data, vector search, and real-time AI, how is MongoDB adapting, and what key industry gaps still remain? 

The first step is collecting and using data in real time. For that, you need the right database. A document model is a natural fit for the scale and structure of this data. 

Once you have the data, the next step is using it effectively. That starts with full-text search, similar to how you search on Google. Most applications today rely on this kind of search functionality. 

But if you’re building AI applications, you also need to vectorize the data. Without vectorization, you can’t perform semantic searches or build meaningful AI features. 

At this point, companies usually face a choice. They often have data spread across multiple databases. To enable full-text search, they might add a solution like Elasticsearch. For semantic search, they bring in a vector database such as Pinecone. If they want to train or fine-tune models using internal data, they also need an embedding model. So now they’re managing a database, a full-text search engine, a vector search system, and an embedding model, each a separate component. 

The integration work required to get all these systems to operate together can consume a large amount of development and management time, pulling focus away from innovation. 

In contrast, our platform simplifies this. It uses a single document database to store all types of data. It includes Atlas Search for full-text search, built-in vector search, and now, with our acquisition of Voyage, integrated embedding capabilities. You don’t need separate systems for each function. 

With everything in one place, there’s no need for complex integration. You can run full-text and semantic (hybrid) search from the same platform. This reduces cost, simplifies management, and frees up time for innovation. Customers tell us this is the biggest advantage—they don’t need to stitch multiple tools together, which can be very hard to manage. 
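As a rough sketch of what this unification looks like in practice, here is an aggregation pipeline built around Atlas's $vectorSearch stage (the stage and its fields are real MongoDB Atlas features; the index name, field names, and vector values below are hypothetical):

```python
# Sketch of an Atlas Vector Search aggregation pipeline. In a real
# application the query vector would come from an embedding model.
query_vector = [0.12, -0.07, 0.33]

pipeline = [
    {
        "$vectorSearch": {
            "index": "product_vector_index",  # hypothetical index name
            "path": "plot_embedding",         # field holding stored vectors
            "queryVector": query_vector,
            "numCandidates": 100,             # ANN candidates to consider
            "limit": 5,                       # results to return
        }
    },
    # Return each match with its similarity score.
    {"$project": {"title": 1, "score": {"$meta": "vectorSearchScore"}}},
]
```

With a live cluster and a vector index defined, the same pipeline dictionary could be passed to a collection's aggregate() method, with no separate vector database or search engine to integrate.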

What’s next for MongoDB in India, are you focusing on AI, edge deployments, cloud-native capabilities, or something else? 

Our focus will remain on three main areas. First, we’ll continue working with developers to help them improve their productivity. Second, we’ll collaborate across the ecosystem and with enterprises to build modern applications. Third, and most significantly, we’ll support large enterprises as they modernize their applications, whether by creating new ones or upgrading legacy systems. This includes helping them reduce technical debt, move away from outdated applications and databases like Oracle and SQL, and transition to more modern architectures that align with their goals. These are our three key priorities. 

Where do you see AI heading in the data modernization space over the next three to five years? 

In my view, we’re at a stage similar to the 1960s when computers and operating systems were just emerging. I see large language models (LLMs) as the new operating systems. We’re in the early phase, and what comes next is the development of applications built on top of them. As this evolves, more advanced and diverse applications will emerge. 

Building applications is becoming much easier. For example, there’s a concept called vibe coding, where even young children, eight or nine years old, can create apps. If a computer can guide you step by step, almost anyone can build one. That’s where we’re heading: a world where millions of applications can be developed quickly and easily. 

We see ourselves as a natural platform for these applications because we make it simple to store data. So, over the next few years, we expect a surge in development activity. A lot is going to change, and I think we’ll all be surprised by just how much. 


Article originally posted on mongodb google news. Visit mongodb google news

Subscribe for MMS Newsletter

By signing up, you will receive updates about our latest information.

  • This field is for validation purposes and should be left unchanged.


Google DeepMind Unveils AlphaGenome: A Unified AI Model for High-Resolution Genome Interpretation

MMS Founder
MMS Robert Krzaczynski

Article originally posted on InfoQ. Visit InfoQ

Google DeepMind has announced the release of AlphaGenome, a new AI model designed to predict how genetic variants affect gene regulation across the entire genome. It represents a significant advancement in computational genomics by integrating long-range sequence context with base-pair resolution in a single, general-purpose architecture.

AlphaGenome processes up to 1 million base-pairs of DNA at once and outputs high-resolution predictions across thousands of molecular modalities, including gene expression, chromatin accessibility, transcription start sites, RNA splicing, and protein binding. It allows researchers to evaluate the effects of both common and rare variants, not just in protein-coding regions, but in the far more complex non-coding regulatory regions that constitute 98% of the human genome.

Technically, AlphaGenome combines convolutional neural networks (CNNs) to detect local sequence motifs and transformers to model long-range interactions, all trained on rich multi-omic datasets from ENCODE, GTEx, 4D Nucleome, and FANTOM5. The architecture achieves state-of-the-art performance across a broad range of genomic benchmarks, outperforming task-specific models in 24 out of 26 evaluations of variant effect prediction.

A notable innovation is AlphaGenome’s ability to directly model RNA splice junctions, a feature crucial for understanding many genetic diseases caused by splicing errors. The model can also contrast mutated and reference sequences to quantify the regulatory impact of variants across tissues and cell types — a key capability for studying disease-associated loci and interpreting genome-wide association studies (GWAS).

Training efficiency was also improved: a full AlphaGenome model was trained in just four hours on TPUs, using half the compute budget of DeepMind’s earlier Enformer model, thanks to optimized architecture and data pipelines.

The model is now available via the AlphaGenome API for non-commercial research use, enabling scientists to generate functional hypotheses at scale without needing to combine disparate tools or models. DeepMind has indicated plans for further extension to new species, tasks, and fine-tuned clinical applications.

This release also aligns with a broader conversation around the interpretability and emotional context of AI in medicine. As Graevka Suvorov, an AI alignment researcher, commented:

The true frontier for MedGemma isn’t just diagnostic accuracy, but the informational and psychological state it creates in the patient. A diagnosis without context is a data point that can create fear. A diagnosis delivered with clarity is the first step to healing. An AI with a true ‘informational bedside manner’—one that understands it’s not just treating an image, but a person’s entire reality—is the next real leap in AGI.

AlphaGenome pushes the field closer to that vision, enabling deeper, more accurate interpretations of the genome and offering a unified model for understanding biology at the sequence level.



MongoDB’s Strategic Pivot: Securing Its Future in High-Security Cloud Databases – AInvest

MMS Founder
MMS RSS

Posted on mongodb google news. Visit mongodb google news

MongoDB, Inc. (NASDAQ: MDB) has long been a leader in NoSQL databases, but its recent strategic moves—securing inclusion in the Russell Midcap Value Index and pursuing FedRAMP High authorization—signal a bold pivot toward positioning itself as a high-security cloud database provider for government and regulated industries. This repositioning could unlock significant growth opportunities while attracting new investors.

The Russell Midcap Value Inclusion: A Strategic Rebranding

MongoDB’s addition to the Russell Midcap Value Index in early 2024 marks a pivotal shift in its valuation narrative. While MDB has historically been classified as a growth stock (its price-to-sales ratio remains elevated at ~10x), its inclusion in a value-oriented index reflects a maturing business model and improving profitability.

This reclassification is no accident. MongoDB has prioritized margin expansion and recurring revenue streams, with its flagship Atlas cloud service now contributing 70% of total revenue (up from 66% in 2024). The Russell Midcap Value Index inclusion could attract investors seeking stable, cash-flow positive companies in the tech sector—a demographic MDB has historically struggled to engage.

FedRAMP High: Unlocking the $100B Government Cloud Market

MongoDB’s pursuit of FedRAMP High and Impact Level 5 (IL5) certifications by 2025 is its most critical strategic move. These certifications will enable MongoDB Atlas for Government to handle highly sensitive data, including national security and health records, which are currently off-limits to its cloud platform.

The stakes are enormous: U.S. federal cloud spending is projected to hit $100 billion by 2027, with security-conscious agencies favoring providers that meet the strictest compliance standards. While MongoDB currently holds FedRAMP Moderate authorization, the FedRAMP High upgrade—subject to 421 stringent security controls—will open access to lucrative contracts with defense, intelligence, and healthcare agencies.

A Case Study in Success: The Utah Migration

MongoDB’s partnership with the State of Utah offers a blueprint for its government strategy. By migrating its benefits eligibility system to Atlas, Utah reduced disaster recovery time from 58 hours to 5 minutes, while cutting costs and improving speed. This win highlights Atlas’s ability to modernize legacy systems securely, a key selling point for agencies wary of cloud adoption.

Financials Support the Shift to Security

MongoDB’s financials back its strategic pivot:
  • Q1 2025 revenue grew 22% YoY to $450.6 million, driven by 32% growth in Atlas revenue.
  • Customer count rose to 49,200, with 73% of $1M+ customers increasing spend.
  • Margin expansion: gross margins improved to 68% in Q1 2025, up from 65% in 2024.

These metrics suggest MongoDB is executing its shift toward high-margin, subscription-based cloud services while scaling its sales force to target regulated sectors.

Risks and Considerations

  • Competition: AWS, Microsoft, and Snowflake are aggressively targeting the government cloud market.
  • Certification Delays: FedRAMP High and IL5 approvals are pending, and delays could push revenue growth below expectations.
  • Valuation: MDB’s stock trades at a premium relative to peers (e.g., Snowflake’s P/S of ~3x).

Investment Thesis: A Buy with Long-Term Upside

MongoDB’s strategic moves—Russell Midcap Value inclusion and FedRAMP High pursuit—position it to capitalize on a $100B+ addressable market in secure cloud databases. While short-term risks exist, the long-term opportunity for MDB to dominate regulated sectors justifies its valuation.

Buy recommendation: With a $430 price target from Citigroup (108% upside from current levels) and strong hedge fund support, MDB is a speculative but high-reward play for investors willing to bet on its security-driven growth.

Final Take

MongoDB’s pivot to high-security cloud databases is more than a rebrand—it’s a calculated move to tap into one of the fastest-growing segments of the tech industry. If it secures FedRAMP High by 2025, MDB could emerge as a must-have partner for governments and enterprises, justifying its premium valuation. For investors, this is a story worth watching closely.

Article originally posted on mongodb google news. Visit mongodb google news



MongoDB Announces Commitment to Achieve FedRAMP High and Impact Level 5 Authorizations

MMS Founder
MMS RSS

Posted on mongodb google news. Visit mongodb google news

MongoDB, Inc. announced its commitment to pursuing Federal Risk and Authorization Management Program (FedRAMP) High and Impact Level 5 (IL5) authorizations for MongoDB Atlas for Government workloads, which will expand its eligibility to manage unclassified, yet highly sensitive, U.S. public sector data. 

With FedRAMP High authorization, even the most critical government agencies looking to adopt cloud and AI technologies—and to modernize aging, inefficient legacy databases—can rely on MongoDB Atlas for Government for secure, fully managed workloads.

MongoDB Atlas for Government already provides a flexible way for the U.S. public sector to deploy, run, and scale modern applications in the cloud within a dedicated environment built for FedRAMP Moderate workloads. 

Achieving FedRAMP High and IL5 will allow MongoDB Atlas for Government’s secure, reliable, and high-performance modern database solutions to be used to manage high-impact data, such as in emergency services, law enforcement systems, financial systems, health systems, and any other system where loss of confidentiality, integrity, or availability could have a severe or catastrophic adverse effect on organizational operations, organizational assets, or individuals.

“The federal agencies that manage highly sensitive data involving the protection of life and financial ruin should be using the latest, fastest, and best database technology available,” said Benjamin Cefalo, Senior Vice President of Product Management at MongoDB. 

“With FedRAMP High and IL5 authorizations for MongoDB Atlas for Government workloads, they will be able to take advantage of MongoDB’s industry-leading and proprietary Queryable Encryption, multi-cloud flexibility and resilience, high availability with automated backup, data recovery options, and on-demand scaling, and native vector search to facilitate building AI applications.”

MongoDB Atlas for Government already helps hundreds of public sector agencies nationwide develop secure, modern, and scalable solutions. An integral feature of MongoDB Atlas for Government is MongoDB Queryable Encryption. 

This industry-first, in-use encryption technology lets organizations keep sensitive data encrypted even while it is queried and in use on Atlas for Government. 

With Queryable Encryption, sensitive data remains protected throughout its lifecycle, whether in transit, at rest, in use, or in logs and backups. It is only ever decrypted on the client side.

Article originally posted on mongodb google news. Visit mongodb google news



MongoDB Announces Commitment To Pursuing FedRAMP High And Impact Level 5 …

MMS Founder
MMS RSS

Posted on mongodb google news. Visit mongodb google news


With upgraded authorization, soon federal agencies with security requirements at every level will be able to use MongoDB to deploy, run, and scale modern applications in the cloud

NEW YORK, June 30, 2025 /PRNewswire/ – MongoDB, Inc. (NASDAQ:MDB) today announced its commitment to pursuing Federal Risk and Authorization Management Program (FedRAMP) High and Impact Level 5 (IL5) authorizations for MongoDB Atlas for Government workloads, which will expand its eligibility to manage unclassified, yet highly sensitive, U.S. public sector data. With FedRAMP High authorization, even the most critical government agencies looking to adopt cloud and AI technologies—and to modernize aging, inefficient legacy databases—can rely on MongoDB Atlas for Government for secure, fully managed workloads.


Article originally posted on mongodb google news. Visit mongodb google news



‘PostgreSQL Eats the World, But CockroachDB Digests It’ – Analytics India Magazine

MMS Founder
MMS RSS

Posted on nosqlgooglealerts. Visit nosqlgooglealerts

The database market is undergoing significant changes, driven by increasing demands for scale, resilience, and the burgeoning era of AI agents. 

Speaking exclusively to AIM, CockroachDB CEO Spencer Kimball stated that the shift towards distributed SQL databases built on a solid PostgreSQL foundation is becoming increasingly crucial for businesses of all sizes, not just tech giants.

The core difference offered by CockroachDB lies in its horizontal scaling capabilities. While it strives to maintain a PostgreSQL-like interface, distributed operations require a different approach. 

“Cockroach didn’t reject Postgres. It re-architected it from the ground up to meet the scale, distribution, and the consistency AI demands,” Kimball said.

He further added that scaling 100x on a monolithic architecture is utterly impossible. This, he explained, is where distributed SQL databases like CockroachDB come in, built for “serious scale, like hundreds of terabytes into petabytes” of operational data. “Postgres may be eating the world, but AI needs a database that can digest.”

Kimball said that he is particularly referring to operational databases and not the analytical ones. “It’s about the metadata that tracks the product or service, all the activity, and the high level of concurrent operations that demand strong consistency,” he added.

He explained that both humans and agents would have access to the data. These agents operate at high speed and are continuously active, performing the same tasks multiple times daily or even hourly. They work on behalf of both consumers and businesses, resulting in a steadily increasing volume of traffic.

What’s Next from CockroachDB

Kimball sees AI playing a role in observability and support. “AI can move much faster. If you give it the right scenarios and train it, then what could have taken several hours to fix might only take several minutes,” he said.

Vector indexing is another area of focus for CockroachDB. “Customers want nearest-neighbour search in high-dimensional spaces at scale. They want it fast and consistent, even as data changes,” Kimball said.

But he clarified that CockroachDB is not trying to become a general-purpose vector database. Cockroach isn’t trying to compete with OpenSearch, Elastic, or MongoDB on vector search. “If you’re already using CockroachDB for mission-critical relational workloads, you want vector support there. Not everyone needs that, but for our users, it’s essential.”

He further added that they are not trying to win the market for the vector index. “We’re not a vector database. However, it’s a very important modality.”

Moreover, Kimball talked about reducing costs. “Nobody wants to pay 10x more because their workload scales 10x. CockroachDB can improve utilisation with multi-tenancy.” He explained that if a customer has 100 use cases on a large cluster, the peaks and troughs average out, allowing them to move from 10% to 50-60% utilisation.
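Kimball's utilisation arithmetic can be sketched in a few lines of Python (the numbers are synthetic, assuming each workload idles most of the day and spikes in one random hour):

```python
import random

random.seed(7)

HOURS = 24
N_WORKLOADS = 100
PEAK = 100.0   # per-workload peak demand (arbitrary units)
BASE = 5.0     # off-peak demand

# Each workload spikes in one random hour and idles otherwise.
def workload_demand(peak_hour):
    return [PEAK if h == peak_hour else BASE for h in range(HOURS)]

workloads = [workload_demand(random.randrange(HOURS)) for _ in range(N_WORKLOADS)]

# Dedicated clusters: each workload must be provisioned for its own peak.
dedicated_capacity = N_WORKLOADS * PEAK
dedicated_avg_use = sum(sum(w) for w in workloads) / (HOURS * dedicated_capacity)

# Shared multi-tenant cluster: provision for the combined hourly peak,
# so staggered peaks and troughs average out.
combined = [sum(w[h] for w in workloads) for h in range(HOURS)]
shared_capacity = max(combined)
shared_avg_use = sum(combined) / (HOURS * shared_capacity)

print(f"dedicated utilisation: {dedicated_avg_use:.0%}")
print(f"shared utilisation:    {shared_avg_use:.0%}")
```

With these toy inputs the dedicated setup sits near 10% average utilisation while the shared cluster lands several times higher, the same direction of improvement Kimball describes.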

The company is also working on using cloud cost efficiencies. Kimball said CockroachDB’s architecture allows the use of spot instances, disaggregated storage, and storage tiering. “We believe we can reduce costs by 10 to 16x in the next few years.”

Moat of Cockroach

Kimball said that CockroachDB’s strength is in geographic scale. “We have customers in the EU, the US, and India. If you want to make your service span all of those places, Cockroach has some really interesting capabilities that are different.”

He provided one example from the US sports betting sector. “Customers use Cockroach nodes in multiple states to comply with data locality laws. Data is processed where bets are placed.”

Moreover, he added that CockroachDB is cloud-agnostic and supports hybrid deployments. “Big banks and tech companies use private data centres and all three major clouds. We let customers run the database wherever their business needs it.”

One key challenge, he pointed out, is integrating AI into database operations. “It’s not easy to run distributed systems. When something goes wrong, you want the root cause before a human even looks at it. AI can help.”

On competing with cloud vendors, he noted, “They’re both competitors and partners. Big clouds don’t want to serve self-hosted enterprise customers, and those customers don’t want to be tied to one cloud. CockroachDB fits well there.”

He added that clouds often refer such customers to CockroachDB. “They say, ‘We can’t run this in your data centre, but CockroachDB can.’ That’s why the partnership works.”

As the era of AI agents increases data scale and complexity, CockroachDB is positioning itself to meet those demands through distributed design, cross-cloud flexibility, and AI-enhanced tooling.

Why Postgres 

Kimball explained how CockroachDB tries to stay close to the Postgres experience but adapts key behaviours to function at scale in distributed environments. “So well, it tries to look as much like Postgres as possible.”

One clear example was ID generation. Traditional Postgres allows for monotonically increasing sequences, such as auto-incrementing IDs for user records. In monolithic systems, this works smoothly, but things break down at a massive scale.

“In a monolithic system… that counter, it’s all just in one place… But once you say, I want to do 10 million of these concurrently… you don’t want them all going to one node that holds a counter.”

CockroachDB distributes the sequence generation process differently, making it scale-friendly but less linear. “It will look the same as a sequence. But… we have a more distributed mechanism to assign IDs… they’re not just counting 1,2,3,4,5.”
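The trade-off can be illustrated with a small sketch (this is only an illustration of interleaved allocation, not CockroachDB's actual ID-generation algorithm): a single global counter is a coordination hot spot, while node-local generators allocate independently at the cost of strictly sequential output.

```python
import itertools

# Naive monolithic approach: one global counter that every insert must hit.
global_counter = itertools.count(1)

def monolithic_id():
    return next(global_counter)  # a single hot spot under high concurrency

# Distributed-style sketch: each node stamps IDs into its own interleaved
# range, so nodes allocate IDs without coordinating on one counter.
def make_node_id_generator(node_number, total_nodes):
    local = itertools.count(0)
    def next_id():
        # Node k emits k, k + N, k + 2N, ... for N total nodes.
        return node_number + total_nodes * next(local)
    return next_id

node_a = make_node_id_generator(0, 3)
node_b = make_node_id_generator(1, 3)
node_c = make_node_id_generator(2, 3)

ids = [node_a(), node_a(), node_b(), node_c(), node_b()]
# Unique across nodes, but no longer "just counting 1, 2, 3, 4, 5".
assert len(ids) == len(set(ids))
assert ids != sorted(ids)
```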

He acknowledged differences between Postgres and MySQL users as well. “Postgres does structured data, too. There’s room for both.” 

Kimball said that the bigger challenge lies in how the databases are operated, not how they are used by applications. He said that system administrators and DBAs familiar with one will have a steeper learning curve when switching to the other, due to differences in tools, management styles, and best practices. 

“If you’re very good as a system administrator or like a DBA using Postgres, then it’s a lot more new stuff to learn.” 

Kimball said that it often comes down to what teams are already used to operating. “If you’re good at MySQL, moving to distributed MySQL, then TiDB makes sense.” He was referring to TiDB CTO Ed Huang, who said that he believes MySQL will power AI agents.

Journey of the Cockroach

Cockroach Labs was founded in 2015 by ex-Google employees Kimball, Peter Mattis, and Ben Darnell. It draws inspiration from Google’s Bigtable and Spanner databases.

Kimball said that in the early 2000s, systems like Google’s Bigtable avoided SQL not out of dislike, but to keep things simple while focusing on scalability. “It was just easier not to have to do all that stuff and also build something that is elastically scalable and more survivable.”

However, over time, the industry began adding SQL features again. MongoDB added transactions. Google layered SQL on top of Spanner with F1. 

“They created a whole new distributed architecture, but they left all of the hard stuff and started adding the hard stuff back on top of it,” said Kimball. 

He added that NoSQL systems, such as Cassandra, offer flexibility and scalability but fall short in terms of consistency and schema management. “If you have 50 people working on a complex, mission-critical product… it just becomes impossible.”

By 2015, the CockroachDB team had a clear understanding of their target users, which included big banks, major tech firms, and other high-stakes organisations. 

Instead of building a new SQL dialect, they chose PostgreSQL. “Postgres felt like the cleanest and the most appropriate, and had the most upward velocity momentum.”





The first step in modernization: Ditching technical debt | VentureBeat

MMS Founder
MMS RSS

Posted on mongodb google news. Visit mongodb google news

Presented by MongoDB


Technical debt has long been the scourge of IT departments, but today it’s accumulating faster than ever. High-powered computing, technology innovations like AI and speed to market all require modern, scalable solutions. Unfortunately, many businesses are pressing forward with outdated systems and legacy applications, operating under the misconception that addressing technical debt just slows them down, because it demands time and budget the organization thinks it can’t afford to spend. But in today’s landscape, what organizations really can’t afford are the huge hidden costs of legacy applications, which all directly impact performance, security, and innovation.

“Modernization isn’t just about catching up — it’s about building a future-ready foundation for innovation,” says Paul Done, field CTO, modernization at MongoDB. “The true cost of the status quo isn’t just inefficiency — it’s missed opportunities when the market demands agility, making developers use their valuable time to keep legacy architecture running, versus positioning the company for AI with modern infrastructure and applications.”

The mounting costs of technical debt

IT leaders are well aware of the concrete costs that legacy systems entail. For example, the IT team at a top bank reached out to MongoDB when they discovered that out of their team’s $16 million IT budget, $15 million was being spent just on maintaining legacy architecture. That left the bank with only $1 million for innovation.

There are also hidden costs that directly impact performance, security, and innovation. Not all infrastructures are built for the modern, transformative applications that are vital in today’s competitive market. Furthermore, developer productivity is hampered by technology built on outdated code, which makes it difficult for developers to maintain and implement new features, and also lacks the scalability and resilience required to support modern user demands and development practices.

Plus, these systems make organizations significantly more vulnerable to threats, because outdated, brittle architecture can be difficult to update or secure. Some companies lack the institutional knowledge or visibility into the underlying legacy code, which increases vulnerability further. And many of these systems are simply non-compliant or no longer supported, compounding the new risks that AI and other modern applications can add to a technology stack. Unless businesses address these security gaps, innovation is hamstrung.

“To overcome these challenges and come up to speed in a fast-paced world, organizations need to adopt flexible, high-performance data platforms,” Done says. “By doing so, they’ll reduce infrastructure complexity and maintenance overhead. Modern databases also help organizations improve security with encryption, compliance tools, and automated updates, and architecture designed for high-performance applications helps them scale. All this accelerates AI adoption by enabling real-time, high-quality data access.”

Assessing the extent and impact of architectural limitations

At a high level, determining when it’s time to modernize is about quantifying cost, risk, and complexity. In dollar terms, it may seem as simple as comparing the expense of maintaining legacy systems versus investing in new architecture. But the true calculation includes hidden costs, like the developer hours lost to patching outdated systems, and the opportunity cost of not being able to adapt quickly to business needs.

True modernization is not a lift-and-shift — it’s a full-stack transformation. That means breaking apart monolithic applications into scalable microservices, rewriting outdated application code into modern languages, and replacing rigid relational data models with flexible, cloud-native platforms that support real-time data access, global scalability, and developer agility.

Many organizations have partnered with MongoDB to achieve this kind of transformation. For example, Indeed tapped MongoDB to streamline its infrastructure without giving up any of its performance, storage capacity, or support benefits. In just six months, the company reduced total costs by 27%, far exceeding its initial goals for the modernization initiative.

Security must also be factored in, assessing how much risk legacy systems add to the organization’s overall security posture. And from an operations and innovation perspective, it’s critical to account for future-forward objectives and overall goals. That’s why Bendigo Bank worked with MongoDB to modernize its core banking technology, leveraging generative AI to modernize the bank’s legacy Agent Delivery System (a retail teller operation) in less than three months. The bank was eager to enable its developers to focus on more meaningful innovation so the bank could remain agile in a fast-moving market.

Overall, Bendigo Bank migrated onto MongoDB Atlas at one-tenth of the cost of a traditional legacy-to-cloud migration. Plus, MongoDB solutions helped reduce the development time required to migrate a core banking application off of a legacy relational database to MongoDB Atlas by up to 70%. With new AI tooling, they automated repetitive developer tasks to accelerate developers’ pace of innovation. For example, AI-powered automations reduced time spent running application test cases from over 80 hours to just five minutes.

But modernization projects are usually a balancing act, and replacing everything at once can be a gargantuan task. Choosing how to tackle the problem comes down to priorities, determining where pain points exist and where the biggest impacts to the business will be. The cost of doing nothing will outrank the cost of doing something.

For instance, Toyota Connected recently experienced reliability issues with the legacy database underlying the telemetry-based technology that powers connectivity features like Safety Connect in more than 9 million Toyota and Lexus vehicles in North America. The company decided to migrate to Amazon Web Services (AWS) and MongoDB Atlas, an integrated suite of data services centered around a cloud database designed to accelerate and simplify building with data. According to Toyota Connected’s internal measurements, Safety Connect has attained 99.99% availability, a figure the company now targets every month.

“We’re usually going in and tackling some of a company’s biggest, ugliest applications,” Done says. “How you design a solution in this AI era is about finding that right partner who can help evolve not only your applications, but your supporting database to consolidate workloads, reduce complexity, and adapt in a rapid, agile way.”

How modern database solutions enable AI-driven workloads

AI is often a game-changing catalyst: once technical debt is eliminated, a company can embrace all the potential it offers. To react instantly and make real-time decisions in areas like dynamic pricing, fraud detection, and adaptive user experiences, AI solutions depend on fluid, instantly accessible data. Modern databases make this possible by consolidating structured and unstructured data, helping organizations scale without constraints, handle massive data volumes and low-latency operations, and meet the demands of any AI workload while protecting sensitive information both at rest and in motion.

While modernization is often seen as complex and time-consuming, MongoDB has helped speed up and simplify the modernization process in a repeatable manner for many organizations, providing full-stack modernization at both the data and application layer tailored to a company’s specific architecture. The company’s seamless data model and distributed architecture are built to manage data at scale as new technologies emerge, making it the perfect foundation for AI-powered applications. These solutions make developers at least 50% more productive, with some customers seeing productivity gains as high as 70%, Done says.

For Lombard Odier, a gen AI-assisted modernization initiative with MongoDB enabled the bank to migrate code three times faster than previous migrations; move applications from legacy relational databases to MongoDB twenty times faster; and automate repetitive tasks with AI tooling to accelerate the pace of innovation, reducing project times from days to hours. 

The bank’s largest application, PMS, which has thousands of users, manages shares, bonds, exchange-traded funds, and other financial instruments. MongoDB’s ability to scale was key to this system migration, as the system is used to monitor investments, make investment decisions, and generate portfolio statements.

“MongoDB’s AI-powered software-driven approach fully modernizes data and applications at scale in a simplified way,” he explains. “We deliver high-impact results in a short timeframe. We’ve got more than 17 years of experience creating best practices and modern, data-driven applications, so we’re uniquely positioned to understand the ideal end-state of applications for modernization and how to achieve it.”


Sponsored articles are content produced by a company that is either paying for the post or has a business relationship with VentureBeat, and they’re always clearly marked. For more information, contact sales@venturebeat.com.

Article originally posted on mongodb google news. Visit mongodb google news



MongoDB Announces Commitment to Achieve FedRAMP High and Impact – GuruFocus

MMS Founder
MMS RSS

Posted on mongodb google news. Visit mongodb google news

  • MongoDB (MDB, Financial) is pursuing FedRAMP High and Impact Level 5 (IL5) authorizations.
  • Currently trusted by 13 U.S. Federal Cabinet-level agencies and all Department of Defense branches.
  • State of Utah implemented MongoDB Atlas for a 25% increase in benefits processing speed.

MongoDB, Inc. (MDB), a leading name in database solutions, has announced its commitment to achieve Federal Risk and Authorization Management Program (FedRAMP) High and Impact Level 5 (IL5) authorizations for its product, MongoDB Atlas for Government. This strategic move will enable MongoDB to manage highly sensitive, unclassified data for the U.S. public sector, expanding its market reach within federal agencies that demand stringent security measures.

The MongoDB Atlas for Government platform, currently operating at the FedRAMP Moderate level, is already a trusted solution for 13 U.S. Federal Cabinet-level agencies and every branch of the Department of Defense. The platform employs MongoDB’s proprietary Queryable Encryption technology to protect data throughout its lifecycle, which is increasingly critical for federal clients.

MongoDB’s Queryable Encryption is a first-of-its-kind in-use encryption technology, designed to maintain the privacy and security of sensitive data during its entire lifecycle, whether in transit, at rest, or in use. Furthermore, decryption happens only client-side, adding a further layer of data security.

A notable success story is the State of Utah, which utilized MongoDB Atlas for Government to manage state benefit eligibility data. The state reported a 25% boost in the speed of benefits calculations, drastically reduced management times, and an improved disaster recovery time, from 58 hours to just 5 minutes.

This move to secure FedRAMP High and IL5 authorizations aligns with MongoDB’s ambition to support critical government sectors such as emergency services, law enforcement, financial, and health systems. Achieving these authorizations will allow MongoDB to serve federal agencies managing the most sensitive unclassified data, positioning it as a formidable competitor for high-value, long-term government contracts.


Article originally posted on mongodb google news. Visit mongodb google news



MongoDB Announces Commitment to Achieve FedRAMP High and Impact Level 5 Authorizations

MMS Founder
MMS RSS

Posted on mongodb google news. Visit mongodb google news

With upgraded authorization, soon federal agencies with security requirements at every level will be able to use MongoDB to deploy, run, and scale modern applications in the cloud

NEW YORK, June 30, 2025 /PRNewswire/ — MongoDB, Inc. (NASDAQ: MDB) today announced its commitment to pursuing Federal Risk and Authorization Management Program (FedRAMP) High and Impact Level 5 (IL5) authorizations for MongoDB Atlas for Government workloads, which will expand its eligibility to manage unclassified, yet highly sensitive, U.S. public sector data. With FedRAMP High authorization, even the most critical government agencies looking to adopt cloud and AI technologies—and to modernize aging, inefficient legacy databases—can rely on MongoDB Atlas for Government for secure, fully managed workloads.

MongoDB Atlas for Government already provides a flexible way for the U.S. public sector to deploy, run, and scale modern applications in the cloud within a dedicated environment built for FedRAMP Moderate workloads. Achieving FedRAMP High and IL5 will allow MongoDB Atlas for Government’s secure, reliable, and high-performance modern database solutions to be used to manage high-impact data, such as in emergency services, law enforcement systems, financial systems, health systems, and any other system where loss of confidentiality, integrity, or availability could have a severe or catastrophic adverse effect on organizational operations, organizational assets, or individuals.

“The federal agencies that manage highly sensitive data involving the protection of life and financial ruin should be using the latest, fastest, and best database technology available,” said Benjamin Cefalo, Senior Vice President of Product Management at MongoDB. “MongoDB is trusted by 13 U.S. Federal Cabinet-level agencies, every branch of the Department of Defense, and a wide range of Intelligence Community partners. Agencies such as the National Oceanic and Atmospheric Administration (NOAA), the Food and Drug Administration (FDA), and the U.S. Department of Health and Human Services (HHS) are building applications powered by Atlas for Government to solve their most challenging data requirements. With FedRAMP High and IL5 authorizations for MongoDB Atlas for Government workloads, they will be able to take advantage of MongoDB’s industry-leading and proprietary Queryable Encryption, multi-cloud flexibility and resilience, high availability with automated backup, data recovery options, and on-demand scaling, and native vector search to facilitate building AI applications.”
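The native vector search mentioned above is expressed in Atlas as a `$vectorSearch` stage in an aggregation pipeline. A minimal sketch of how such a stage might be assembled; the index name, field path, and the tiny query vector are placeholders standing in for a real deployment’s configuration and embedding:

```python
def vector_search_stage(query_vector, index="vector_index", path="embedding",
                        num_candidates=100, limit=5):
    """Assemble an Atlas $vectorSearch aggregation stage.

    The returned dict would be the first stage of a pipeline passed to
    collection.aggregate(...) against an Atlas cluster with a vector
    search index defined on `path`.
    """
    return {
        "$vectorSearch": {
            "index": index,              # name of the vector search index
            "path": path,                # field holding the embeddings
            "queryVector": query_vector, # embedding of the user's query
            "numCandidates": num_candidates,
            "limit": limit,              # number of results to return
        }
    }


stage = vector_search_stage([0.12, -0.03, 0.88])
print(stage["$vectorSearch"]["limit"])
```

In a real application the query vector would come from an embedding model, and the stage would typically be followed by a `$project` stage selecting the fields to return.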

MongoDB Atlas for Government already helps hundreds of public sector agencies nationwide develop secure, modern, and scalable solutions. An integral feature of MongoDB Atlas for Government is MongoDB Queryable Encryption. This industry-first, in-use encryption technology lets organizations keep sensitive data encrypted while it is queried and in use on Atlas for Government. With Queryable Encryption, sensitive data remains protected throughout its lifecycle: in transit, at rest, in use, and in logs and backups. It is only ever decrypted on the client side.
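MongoDB’s actual Queryable Encryption uses a far more sophisticated structured-encryption scheme, but the core idea (the server matching encrypted values without ever seeing plaintext, with decryption happening only client-side) can be illustrated with a toy deterministic-token sketch. Everything below, including the key and field names, is illustrative and is not how MongoDB implements the feature:

```python
import hmac
import hashlib

SERVER_STORE = {}  # what the "server" holds: opaque tokens, never plaintext


def token(client_key: bytes, value: str) -> bytes:
    # Deterministic MAC: equal plaintexts map to equal tokens, so the
    # server can test equality without learning the underlying value.
    return hmac.new(client_key, value.encode(), hashlib.sha256).digest()


def insert(client_key: bytes, doc_id: str, ssn: str) -> None:
    # The client tokenizes the sensitive field before it leaves the client.
    SERVER_STORE[doc_id] = token(client_key, ssn)


def find_by_ssn(client_key: bytes, ssn: str) -> list:
    # The client sends only the token; the server compares tokens.
    t = token(client_key, ssn)
    return [doc_id for doc_id, stored in SERVER_STORE.items() if stored == t]


key = b"client-side-secret"       # never shared with the server
insert(key, "a1", "123-45-6789")
insert(key, "a2", "999-99-9999")
print(find_by_ssn(key, "123-45-6789"))  # → ['a1']
```

The sketch shows why equality queries can run over encrypted data: the server only ever compares opaque tokens. Real schemes like MongoDB’s also defend against the frequency-analysis weaknesses that naive deterministic tokens have.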

For example, the State of Utah, which has one of the nation’s fastest-growing populations, chose MongoDB Atlas for Government to store its state benefit eligibility data. To meet a statewide mandate set by the governor of Utah, the State of Utah’s Department of Technology Services needed to migrate its eligibility software out of its physical data center to a FedRAMP-compliant cloud solution. The state government administration recognized it needed a backend database that could reliably handle large documents and deliver results quickly, which led it to identify MongoDB Atlas for Government as an ideal solution. The migration resulted in a 25% increase in speed of benefits calculations and document returns, reduced management time, and a 5-minute point-in-time recovery, compared to a recovery time of up to 58 hours when running on premises.

“It’s much less cumbersome to maintain our databases now that we’re using the fully managed MongoDB Atlas for Government. We tried some other solutions, but they could not match MongoDB,” said Manoj Gangwar, Principal Data Architect at the Department of Technology Services for the State of Utah.

For more information about MongoDB Atlas for Government, visit https://www.mongodb.com/products/platform/atlas-for-government.

About MongoDB

Headquartered in New York, MongoDB’s mission is to empower innovators to create, transform, and disrupt industries with software. MongoDB’s unified database platform was built to power the next generation of applications, and MongoDB is the most widely available, globally distributed database on the market. With integrated capabilities for operational data, search, real-time analytics, and AI-powered data retrieval, MongoDB helps organizations everywhere move faster, innovate more efficiently, and simplify complex architectures. Millions of developers and more than 50,000 customers across almost every industry—including 70% of the Fortune 100—rely on MongoDB for their most important applications. To learn more, visit mongodb.com.

Forward-looking Statements
This press release includes certain “forward-looking statements” within the meaning of Section 27A of the Securities Act of 1933, as amended, or the Securities Act, and Section 21E of the Securities Exchange Act of 1934, as amended, including statements concerning MongoDB’s intent to achieve FedRAMP High and Impact Level 5 Authorizations. These forward-looking statements include, but are not limited to, plans, objectives, expectations and intentions and other statements contained in this press release that are not historical facts and statements identified by words such as “anticipate,” “believe,” “continue,” “could,” “estimate,” “expect,” “intend,” “may,” “plan,” “project,” “will,” “would” or the negative or plural of these words or similar expressions or variations. These forward-looking statements reflect our current views about our plans, intentions, expectations, strategies and prospects, which are based on the information currently available to us and on assumptions we have made. Although we believe that our plans, intentions, expectations, strategies and prospects as reflected in or suggested by those forward-looking statements are reasonable, we can give no assurance that the plans, intentions, expectations or strategies will be attained or achieved. 
Furthermore, actual results may differ materially from those described in the forward-looking statements and are subject to a variety of assumptions, uncertainties, risks and factors that are beyond our control including, without limitation: our customers renewing their subscriptions with us and expanding their usage of software and related services; global political changes; the effects of the ongoing military conflicts between Russia and Ukraine and Israel and Hamas on our business and future operating results; economic downturns and/or the effects of rising interest rates, inflation and volatility in the global economy and financial markets on our business and future operating results; our potential failure to meet publicly announced guidance or other expectations about our business and future operating results; liabilities, reputational harm or other adverse consequences resulting from use of AI in our product offerings and internal operations if they don’t produce the desired benefits; our limited operating history; our history of losses; our potential failure to repurchase shares of our common stock at favorable prices, if at all; failure of our platform to satisfy customer demands; the effects of increased competition; our investments in new products and our ability to introduce new features, services or enhancements; social, ethical and security issues relating to the use of new and evolving technologies, such as artificial intelligence, in our offerings or partnerships; our ability to effectively expand our sales and marketing organization; our ability to continue to build and maintain credibility with the developer community; our ability to add new customers or increase sales to our existing customers; our ability to maintain, protect, enforce and enhance our intellectual property; the effects of social, ethical and regulatory issues relating to the use of new and evolving technologies, such as artificial intelligence, in our offerings or partnerships; 
the growth and expansion of the market for database products and our ability to penetrate that market; our ability to maintain the security of our software and adequately address privacy concerns; our ability to manage our growth effectively and successfully recruit and retain additional highly-qualified personnel; and the price volatility of our common stock. These and other risks and uncertainties are more fully described in our filings with the Securities and Exchange Commission (“SEC”), including under the caption “Risk Factors” in our Quarterly Report on Form 10-Q for the quarter ended April 30, 2025, filed with the SEC on June 4, 2025 and other filings and reports that we may file from time to time with the SEC. Except as required by law, we undertake no duty or obligation to update any forward-looking statements contained in this release as a result of new information, future events, changes in expectations or otherwise.

Media Relations
MongoDB 
press@mongodb.com

Cision View original content to download multimedia:https://www.prnewswire.com/news-releases/mongodb-announces-commitment-to-achieve-fedramp-high-and-impact-level-5-authorizations-302493833.html

SOURCE MongoDB, Inc.

Article originally posted on mongodb google news. Visit mongodb google news
