Month: May 2023

MMS • Shiva Nathan
Article originally posted on InfoQ.

Transcript
Shane Hastie: Good day, folks. This is Shane Hastie for the InfoQ Engineering Culture Podcast. Today, I’m sitting down with Shiva Nathan. Shiva is the founder of Onymos and has a background working in high security environments. Shiva, welcome. Thanks for taking the time to talk to us today.
Shiva Nathan: My pleasure being here, Shane. It’s a great time to be talking about security and privacy and such.
Shane Hastie: Thank you so much. So, probably a useful starting point, who’s Shiva?
Introductions [01:26]
Shiva Nathan: I’m the founder and CEO of a company based out of Silicon Valley, here in the San Francisco Bay Area in California. We are a company that’s trying to invent the next technology abstraction for enterprise applications. You can look us up at onymos.com, O-N-Y-M-O-S.com. The name came about as the antonym of “anonymous”: we want people to know about us. That’s how the name Onymos came about. That’s a fun fact for you there.
Shane Hastie: Chatting earlier, we were talking about the importance of privacy, particularly data privacy and personal privacy. Given your background working in that high security space, what have you seen, and what should we be thinking about as technologists working in that space?
Data privacy and retention needs to consider the implications over multiple decades [02:11]
Shiva Nathan: As technologists, we need to start approaching everything we do with a very, very long-term time horizon in mind, and I’m not talking about 15 or 20 years. I’m literally talking about what data or information you are leaving behind today that might affect your great-great-great-grandchild. If you start to think that way, it’s a very different view of security and privacy, and I’ll give you a specific example. Imagine, five generations from now, your great-great-grandchild cannot get insured because of something you put out there about what you did or did not do for your health, and your great-great-grandchild gets denied his or her insurance because of that information. You are long gone, probably rolling in your grave, and your great-great-grandchild is dealing with it. That’s on the personal side.
But on the work side, the stuff that we are building right now is probably going to linger around in some fashion, still being used 30, 40, 50 years from now. The people using it won’t even think about who built it, how it was built, or under what conditions it was built. And if they run into a problem caused by you, you are probably long retired from your day job, and still it lingers on. So, if you approach every single problem with this long-term vision in mind as a technologist, it will help you make the right decisions and do the right things when it comes to security and privacy.
Shane Hastie: Organizations don’t think long term.
Shiva Nathan: Very true, they don’t, and that’s unfortunate in some sense. But if you look at organizations paying for the problems they’ve created, and actually do a root cause analysis of those problems, you’ll find the origin of them, the decision or the non-decision, from years back. All of us are familiar with the Boeing problem: planes crashed, killing people, because application software took control of the plane and plummeted it into land and ocean. Go back and do a root cause analysis of that problem, and you’ll find that a particular decision or non-decision was made years back. It slowly percolates up, infects the organization, and finally ends up literally killing people years after those decisions were made.
Shane Hastie: What are some practical things I can do as a technologist?
Be very deliberate about what data is collected and how long it needs to be retained [04:38]
Shiva Nathan: Start by thinking about data. What data do I really need, and what data do I not need? We are at an inflection point in technology where storage is cheap, and analytics is the biggest thing that everyone wants to do. And since storage is cheap and analytics is good, the instinct is to capture each and every piece of information and data out there and store it, without even knowing whether you’re going to use it or not.
Let’s take this one podcast, for example. People producing this podcast might think, “Okay, let’s save this podcast. Let’s save Shiva’s voice and Shane’s voice somewhere, and let’s store other information about Shiva and Shane, whether we need it or not. Where is Shane speaking from? Where is Shiva speaking from? Is that even relevant to the conversation we are having?” Yet all that information gets stored with the podcast. That kind of data collection, what I call unnecessary data collection, is what is going to do technologists in. You are collecting data without even having a current use for it, hoping, “Hey, there might be a time when I need it, and I should be able to go back to it.” And that captured data is what will do you in, when people hack into it and stitch together where Shane lives, how old he is, and how many children he has, and use all of that for an attack on Shane and Shane’s bank accounts. And all of it was caused by some innocuous data collection to begin with.
And then, the second thing is retention. Let’s say you do have to collect this data. Start to think from day one in terms of “How long do I really need this data?” I’m collecting data about a particular person or a particular enterprise: for how long? Think about the expiry of the data when you collect it. Not many enterprises are disciplined about doing that. If you put an expiry tag onto the data when you collect it, and you have a process in place to delete the data on that expiration date, it’ll do you a world of good.
And the third thing I talk about is that if you don’t do that, you start to get nostalgic about data, and the nostalgia of data is a bigger problem.
So, the first part is don’t collect unnecessary data. The second is put an expiration on the data. And the third is do not get nostalgic about data. As technologists, if we follow these three principles, we’ll be in a much better place when it comes to data security. Then there are a lot of other aspects of security and privacy.
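The expiry-tag idea above can be sketched in code. This is a minimal, hypothetical example (the record layout and the one-year retention period are invented for illustration): records get an expiration timestamp the moment they are collected, and a scheduled job deletes anything past its expiry.

```python
import datetime

# Hypothetical retention period -- real values depend on your regulations.
RETENTION = datetime.timedelta(days=365)

def collect(store, key, value, now):
    """Store a value and tag it with an expiration date at collection time."""
    store[key] = {"value": value, "expires_at": now + RETENTION}

def purge_expired(store, now):
    """Delete every record whose expiry has passed -- no nostalgia."""
    expired = [k for k, rec in store.items() if rec["expires_at"] <= now]
    for k in expired:
        del store[k]
    return expired

store = {}
t0 = datetime.datetime(2023, 1, 1)
collect(store, "user-42-email", "a@example.com", t0)
# A scheduled purge one year and a day later removes the record.
purge_expired(store, t0 + datetime.timedelta(days=366))
```

The point of the sketch is that the deletion decision is made once, at collection time, rather than debated years later.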
Shane Hastie: So, thinking of your experience working in high security organizations, where they were the target of, as you mentioned, state and non-state actors, how do you build platforms to be secure?
Building secure systems starts with attitude and mindset
Shiva Nathan: Start with the people. Again, it might be a different way of thinking, but start with the people first, the people that are actually building these platforms. What I’m going to tell you in this podcast is not, “Oh, buy this tool, buy that tool. Implement this process, implement that process.” It all boils down to the people building this stuff. People, their abilities, and, many times, their lack of imagination are the Achilles’ heel of secure platforms.
In my previous job and in my current job, when we build platforms that enterprise applications are going to be built on, the first thing we ask is: how much of their time and mental focus are my engineers putting into privacy and security? How diligent are they about following the processes we lay out? How proficient are they in the tools we give them? It all starts with the people. Once you have the people area covered, once every engineer on your team has their mindset on doing things in the most private and most secure way, then you tell them about the process, about using the tools, and about the other vulnerabilities they need to test for. Everything comes easy from there. It flows from there.
Shane Hastie: And what do we build on top of that?
Building for security starts with data at the centre [08:33]
Shiva Nathan: Once you have the people in place, start with data at your centre. People and their mindset are your rock-bottom foundation. Then come to the data, and I’ve already touched on this. Think very simply about what is absolutely necessary and what is not relevant, and collect only what is absolutely necessary. Do not collect data because you think you might need it one or two years from now. Then, right at the time of collection, put an expiration tag on the data.
And the third thing is: do not get nostalgic about data. When the expiration clock ticks in, go delete the data, subject to whatever federal or data-retention regulations apply to you. Once you have the data in place, the next thing is your engineering practices, in terms of development, testing and such. Right from the beginning, make them super robust. All your engineers should be proficient in secure coding practices. And your CI/CD pipeline, continuous integration/continuous delivery, needs to be set up to catch security vulnerabilities.
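As one possible illustration of a pipeline that catches vulnerabilities, a dependency audit can be wired in as a required check. This sketch assumes a GitHub Actions pipeline and a Node.js project; the specific tool and pipeline are illustrative, not something named in the conversation:

```yaml
# Hypothetical CI configuration: fail every pull request whose
# dependencies carry known high-severity vulnerabilities, so the
# check is continuous rather than a one-time audit.
name: security-checks
on: [pull_request]

jobs:
  dependency-audit:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-node@v3
      - run: npm ci
      # Break the build on any known high-severity advisory
      - run: npm audit --audit-level=high
```

The same shape works with any ecosystem’s audit tooling; the design point is that the pipeline, not an individual engineer, enforces the check on every change.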
Then look at the tools you already have in place, and tools you can add, that continuously check for security vulnerabilities. Then go into your release and deployment process and apply the same scrutiny. More importantly, most companies do this once and treat it as one and done. It’s never one and done; it’s continuous evolution. If you’re still running your iPhone on iOS 15, you are in bad shape, because a lot of security updates have come out since.
Keep platforms and tools up to date [10:11]
Imagine an enterprise running an application on the latest and greatest releases of everything its stack depends on. It’s a lot of hard work to keep an application you’re developing running on the latest and greatest. Most of the time, the latest and greatest versions don’t even work with each other, and settling for the lowest common denominator that makes things work can lower your security posture multiple fold.
Think about what you can do. Let’s say there are two components you depend on: one is on version two, one is on version three, and you want to upgrade one of them by a version, but the two don’t get along. Ask: what security fixes can I at least bring in to make them work together, so that I’m not exposed security-wise?
To summarize, start with the people, think about the data, think about the entire process, and keep thinking about it continuously, and always stay on the latest and greatest of technologies because that’s what is going to keep you safe.
Shane Hastie: So, let’s go right to the base then and start with the people. How do you create an environment where safety and security thinking is pervasive, and the culture is generative and engaging?
Enabling a security conscious culture [11:16]
Shiva Nathan: All of us as human beings have grown up with biases and such, especially the newer generation that has grown up with social media. It is second nature for them to expose most of their lives on social forums, in ways that many people born 40 or 50 years ago would not even think about. Whether or not anyone is interested, people post attributes of their lives: what they ate for breakfast, where they went on vacation, and such. I’m not saying go tell your engineers, “You now work for the CIA, and you’re not supposed to do any of that.” But make them start to think: “This information that you are sharing with the world, do you understand the ramifications?” Once you’ve gone through that thought process and understood the ramifications, then share whatever you want.
Once you build that culture, where you’ve educated your team about what data is, what data privacy is, and what data security is, then you put them to work on your application development. Give them the time, and praise them for the effort they put in to be security focused and privacy focused. As leaders, we need to walk the talk: to be able to say, “I have acknowledged and appreciated an engineer on my team who took the extra effort to make something a little bit more robust and a little bit more secure.” That’s how you build that culture within a company.
Shane Hastie: We were talking earlier about what makes a good leader, and you had some counterintuitive advice in terms of what comes first.
Employees come first, customers second and investors third [12:50]
Shiva Nathan: Employees come first, any day. I have this fundamental belief, which I got from my previous employers: employees come first, customers come second, investors come third. With extreme pride, I tell my investors that they come third. When you put employees first and take really good care of them, your employees will do the right thing for the customers. And if your customers are taken really good care of, they automatically take care of your bottom line and your investors.
Try putting these in any other order. Selfishly, I’m an employee as well, so it bodes well for me too. If you put employees first, customers second and investors third, the whole system works. A lot of companies go out and say, “Customers first,” and make substandard decisions for their employees; what happens is that unhappy, disregarded employees don’t go the extra mile to make the customers happy, and when your customers are not made extremely happy, your investors suffer. Then there are people looking toward the next fundraising round who say, “I’m going to put my investors first.” You can build a rocket ship that launches and grows very fast for the investors for three or five years, and then crashes and burns, because you made suboptimal decisions for your customers by putting investors first, and suboptimal decisions for your employees, so they didn’t take care of the customers.
Going by this logic, employees first, customers second, investors third is always the right approach, in my mind, to build a long-lasting company that succeeds and takes care of all three stakeholders.
Shane Hastie: If we look around the tech industry at the moment (we’re recording this in March 2023), we see a lot of behavior that has definitely not been putting employees first: the massive layoffs. I can understand that layoffs are at times necessary in certain economic climes, but what has worried me in this round is the inhumane way it has been handled, or certainly the way it has been reported on social media. People arrive at work and can’t log in, or are working remotely and their ID no longer works, and that’s how they know they were let go. What happened that organizations got it so wrong?
The impact of the way layoffs are handled [15:08]
Shiva Nathan: Yes, it is unfortunate what we are going through in the economic climate, although I would not say that making a layoff makes a company not employee-first. Layoffs are a necessary part of an economic cycle; how they are handled is what’s important. Let’s say a division within a large company tries to introduce a product, the product does not get traction through no fault of any single employee, and the company decides to lay off the entire department. That may be the right thing for the company. But how you treat the employees when you lay them off speaks volumes about the culture, and tells the existing employees how they’re going to be treated going forward.
Take the particular example you mentioned, where a person wakes up in the morning, finds out they can no longer log into the system, and that’s how they learn they’ve been laid off. In this day and age, that to me is horrendous. It would have taken just one email to the person. Even if the company is laying off 10,000 people and doesn’t have an HR team that can sit across the table from 10,000 people and tell each of them gracefully that they’ve been laid off, a simple email would have been understandable. A simple email saying, clearly and nicely, “Sorry, we have to make this hard decision to let you go,” would have served those companies really well, and that it didn’t happen is unfortunate.
And then there are companies where HR got laid off along with everyone else, so there was no one left to do it. But then it’s on the leaders to be upfront. Let’s say I’m running a company and had to let 10,000 people go, or even shut the entire company down; I hope I never come to that situation in my life. As long as the leader comes out and sends a mass email explaining why the company is shutting down, what the reasons are, and even why it’s a mass email, “I cannot afford to send 10,000 emails one by one, so for the sake of time I am doing this through a mass email”, that will go a long way. Being upfront, being a person of high integrity, and showing leadership will go a long way, far better than an employee waking up in the morning, finding out they cannot access their corporate systems, and realizing that’s how they know they’re laid off.
Shane Hastie: These are not difficult concepts, but we see them breached all too often. What does this tell us, I wonder about our industry or elements of our industry? That’s getting into philosophy, and we might not go too far there.
Human connections got reduced through the pandemic [17:33]
Shiva Nathan: No. I think what the pandemic did to the world is reduce the level of human connection. For the last two years, people were hunkered down in their homes. There are managers I know who have never met their employees in person, ever. If I’ve never met you in person, if I’ve only ever seen you as a two-dimensional object on a Zoom or Microsoft Teams call, that personal connection doesn’t get formed. So it’s easy to wake up and find out that your employee access isn’t working anymore, and for that not to affect me personally. The interpersonal connection is gone because of the pandemic.
I’m getting into the philosophical realm now. I think we as human beings need to do what we did, Shane, before this podcast started: talk about each other’s personal lives a little bit. Not go too far into it, but at least establish that human-to-human connection. You offering, “Hey, if you’re in my neighborhood, come by and say hi,” is a good interpersonal connection, and I’ll remember that for months and years to come. Whenever I’m in your neighborhood, I’ll think, “Yes, I know a person, Shane, that I can come by and say hi to, and meet for lunch.” That only happens when we take the extra effort as human beings in every interaction, rather than, “Why am I here? Oh, I’m here to do this podcast. Shane’s here to record the podcast,” and letting the conversation stop there.
I really appreciate the time you took to get to know me as a person, and for offering to let me get to know you as a person. We have to take that extra effort. If you had scheduled this as a 30-minute thing, 20 minutes and I’m off to my next thing, we wouldn’t have been able to make that connection. I think every human being has to do that in this world.
Shane Hastie: Build in the time to be human.
Shiva Nathan: Yep. You said that a lot more succinctly than I did.
Shane Hastie: Shiva, thank you so very much for taking the time to talk to us. If people want to continue the conversation, where do they find you?
Shiva Nathan: I’m on Twitter @_ShivaNathan, and they can also follow my company Onymos, O-N-Y-M-O-S on Twitter as well. And I’m also on LinkedIn, so people can connect with me on LinkedIn. You can look me up as Shiva Nathan on LinkedIn.
Shane Hastie: Wonderful. Thank you so much.
Shiva Nathan: Thank you so much, Shane. Thanks for having me. My pleasure.

MMS • Sergio De Simone
Article originally posted on InfoQ.

AirBnb has historically managed its own fleet of Macs to run its iOS continuous integration (CI) pipeline. With AWS now providing support for Macs, AirBnb engineers were able to migrate their iOS CI infrastructure to AWS to increase flexibility, consistency, and efficiency.
Contrary to the rest of the platforms AirBnb supports, the CI pipeline for iOS did not originally run on AWS. Using physical Macs made it possible to leverage Apple’s tools to build the iOS app and run its tests, but it increased maintenance costs and reduced testing efficiency.
In particular, an engineer had to ensure the entire fleet of Macs was enrolled in AirBnb’s mobile device management tool and ran the latest versions of macOS and Xcode. Even with a dedicated engineer, some Macs could enter a bad state and be excluded from the pipeline. Updating to the latest Xcode also posed a number of issues, potentially further reducing the fleet of machines available for testing and increasing maintenance costs.
The availability of macOS AMIs, explains AirBnb engineer Michael Bachand, allowed AirBnb to move its CI pipeline to AWS, with a number of distinct benefits. These include the possibility of using a single AMI to automatically launch any additional instances required, as well as consolidating the CI infrastructure across all supported platforms.
AirBnb uses HashiCorp Packer to build the AMIs. The tool launches and configures an EC2 instance based on a template defined in the HashiCorp Configuration Language (HCL). This step can be fully automated, and the Packer template can be version-managed using Git.
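A Packer template for such a macOS AMI might look roughly like the following sketch. All names, the base AMI ID, the region, and the provisioning script are invented for illustration; this is not AirBnb’s actual configuration.

```hcl
# Hypothetical Packer template for building a macOS CI AMI.
# Assumes an EC2 Dedicated Host is available for the mac1.metal instance.
locals {
  timestamp = regex_replace(timestamp(), "[- TZ:]", "")
}

source "amazon-ebs" "macos_ci" {
  region        = "us-east-1"
  instance_type = "mac1.metal"
  source_ami    = "ami-0123456789abcdef0" # placeholder base macOS AMI
  ssh_username  = "ec2-user"
  ami_name      = "ios-ci-${local.timestamp}"
  tenancy       = "host" # Mac instances run on dedicated hosts
}

build {
  sources = ["source.amazon-ebs.macos_ci"]

  # Install Xcode and other CI tooling onto the image
  provisioner "shell" {
    script = "./install-xcode.sh" # hypothetical provisioning script
  }
}
```

Because the template lives in Git, changes to the CI image go through the same review process as any other code change.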
Another essential component in AirBnb’s CI solution is Terraform, which is used to deploy the iOS CI infrastructure to AWS in a way similar to the other supported platforms.
All of our AWS infrastructure for iOS CI is specified in Terraform code that we check into source control. Each time we merge a pull request related to iOS CI, Terraform Enterprise will automatically apply our changes to our AWS account.
Scaling is handled through Auto Scaling groups using a modified version of buildkite-agent-scaler. Instances are created from a launch template, which specifies all of their characteristics, including the AMI and a “launch” script.
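In Terraform, a launch template plus Auto Scaling group of this shape could be sketched as follows. The resource names, sizes, and variables are assumptions for illustration, not AirBnb’s actual code; note how a maximum instance lifetime of 24 hours expresses the daily instance rotation described below.

```hcl
# Illustrative Terraform: launch template and Auto Scaling group
# for Mac CI agents. Names and sizes are placeholders.
resource "aws_launch_template" "ios_ci" {
  name_prefix   = "ios-ci-"
  image_id      = var.macos_ami_id # AMI produced by the Packer build
  instance_type = "mac1.metal"

  placement {
    tenancy = "host" # Mac instances require dedicated hosts
  }

  # "Launch" script that configures the instance as a CI agent
  user_data = filebase64("${path.module}/launch.sh")
}

resource "aws_autoscaling_group" "ios_ci" {
  name                = "ios-ci-agents"
  vpc_zone_identifier = var.subnet_ids
  min_size            = 0  # set min/max to 0 to disable the environment
  max_size            = 50
  desired_capacity    = 10

  launch_template {
    id      = aws_launch_template.ios_ci.id
    version = "$Latest"
  }

  # Terminate and replace instances daily to limit drift
  max_instance_lifetime = 86400 # seconds (24 hours)
}
```

Applying changes through Terraform Enterprise, as the quote below describes, keeps the deployed infrastructure in lockstep with what is in source control.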
This CI setup provides a number of advantages, says Bachand. On the one hand, it makes it easy to create a new CI environment for each required CPU architecture and Xcode version. When a CI environment is not in use, it can be disabled by changing the Auto Scaling group configuration.
Another advantage of this setup is the possibility of rotating instances by terminating and replacing them daily. This reduces the chance that instances drift, by clearing their internal SSD and NVRAM variables and updating the firmware. In case an instance experiences drift anyway, it is easy to exclude it from the autoscaler and add a new, clean instance.
Thanks to this approach, writes Bachand, AirBnb has been able to get the benefits of virtualization without the performance penalty of virtual machines, and to unify how it handles CI infrastructure for every supported platform. Do not miss the original article if you are interested in the full details of the migration.

MMS • Almir Vuk
Article originally posted on InfoQ.

Earlier this week, the Xamarin team announced that Xamarin.Forms and Xamarin.Essentials have been updated to target Android 13, aligning with Google’s requirement that new Android apps target at least Android 13 starting from August 2023. The update, currently in preview, aims to ensure compatibility with the latest Android version and provide a smooth transition to the future of cross-platform development, .NET MAUI.
The update brings Xamarin.Forms and Xamarin.Essentials in line with the MonoAndroid13 target by default, allowing developers to take advantage of the latest features and improvements offered by Android 13. To facilitate the upgrade process, a preview release has been made available, enabling developers to test their apps and prepare for the transition.
While Xamarin support is set to end on May 1, 2024, developers are encouraged to embrace .NET MAUI, which already supports Android 13. .NET MAUI offers an enhanced ecosystem for cross-platform development, providing a seamless transition from Xamarin to the future of .NET-based development. Detailed upgrade guides and the newly released “.NET Upgrade Assistant” extension for Visual Studio 2022 can assist developers in migrating their apps to .NET MAUI.
The Xamarin.Forms update also includes compatibility enhancements for AndroidX binding packages and other dependencies. While most packages have been updated, Xamarin.AndroidX.Lifecycle remains at version 2.6.1 due to a known issue under investigation. Developers may encounter warnings in their build output related to the unknown enum constant Scope.LIBRARY_GROUP_PREFIX; in the announcement post, its author, Gerald Versluis, states that the issue is currently being investigated.
As reported, a few steps are needed to begin targeting Android 13 with Xamarin.Forms and Xamarin.Essentials. Developers need to ensure they have Xamarin.Android 13 installed, update their NuGet packages to the latest preview versions (Xamarin.Forms 5.0.0.2599-pre1 and Xamarin.Essentials 1.8.0-preview1), and update the targetSdkVersion in their Android project’s AndroidManifest.xml to 33 (Android 13). Lastly, rebuilding the project will allow developers to test the functionality and compatibility of their apps.
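The manifest change mentioned above is a one-line edit. In a fragment like the following (the package name and minimum SDK are placeholders, not from the announcement), targetSdkVersion is raised to 33:

```xml
<!-- Illustrative AndroidManifest.xml fragment: raise targetSdkVersion
     to 33 (Android 13). Package name and minSdkVersion are placeholders. -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
          package="com.example.app">
    <uses-sdk android:minSdkVersion="21"
              android:targetSdkVersion="33" />
</manifest>
```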
As stated earlier, support for Xamarin products concludes on May 1, 2024, so developers should consider transitioning to .NET MAUI for future cross-platform projects. In the original announcement post, the team recommends starting new projects directly with .NET MAUI, while existing Xamarin.Forms projects can be upgraded using the .NET Upgrade Assistant.
The .NET MAUI documentation provides comprehensive resources to assist developers in adopting this next-generation framework.
In addition, in the comment section of the original post, a user named Anderson Damasio expressed excitement about the support Xamarin.Forms is still getting. On May 15, 2023, the user wrote the following comment:
It’s great to know that there’s an investment in Xamarin Forms. We know that there are a large number of developers with the platform.
Congratulations
With Xamarin.Forms and Xamarin.Essentials now targeting Android 13 (in preview), developers can embrace the latest Android features while preparing for the transition to .NET MAUI. The update ensures compatibility and gives developers time to migrate their projects. As the preview versions undergo testing, a stable release is expected ahead of Google’s August 2023 deadline. Community members are also invited to visit the official GitHub project repository to learn more and provide feedback.
SingleStore Launches MongoDB API to Power AI and Real-Time Analytics on JSON – Yahoo Finance

MMS • RSS
Posted on nosqlgooglealerts.
The new SingleStore Kai™ for MongoDB provides 100-1,000x faster analytics and vector functionality for AI in MongoDB applications
SAN FRANCISCO, May 18, 2023–(BUSINESS WIRE)–SingleStore, the cloud-native database built for speed and scale to power real-time applications, today announced the launch of SingleStore Kai™ for MongoDB, a new API that turbocharges real-time analytics on JSON (JavaScript Object Notation) and vector-based similarity searches for MongoDB-based AI applications, without the need for any query changes or data transformations.
SingleStoreDB is a real-time distributed SQL database combining analytical and transactional workloads in one unified platform. In a new era of the ever increasing adoption of AI, making analytics real time and actionable is even more imperative. A vast majority of data accumulated in the world today is in JSON format, and MongoDB has grown to be one of the most widely adopted NoSQL databases to store and process JSON — powering a variety of use cases across martech, IoT, gaming, logistics, social media, e-commerce and content management applications.
However, document databases are not optimized for analytics, and users often experience delays or lagging query performance when attempting to perform analytics on JSON data. SingleStoreDB, by contrast, is architected to power real-time analytics on transactional data, enabling users to drive ultra-fast analytics on both structured and semi-structured (JSON) datasets. The new API is MongoDB wire protocol compatible, and it enables developers to power interactive applications with analytics on SingleStoreDB using the same MongoDB commands.
SingleStoreDB can effectively augment MongoDB as the analytical engine to power blazing fast analytics on collections of JSON data. More importantly, this powerful feature is available at no extra cost and is now open for public preview as part of the SingleStoreDB Cloud offering.
Some of the new capabilities of the API include:
– 100x–1,000x faster analytics. With SingleStore Kai, you can perform complex analytics on JSON data for MongoDB applications faster and more efficiently. Based on the latest performance benchmarks, SingleStoreDB was able to drive 100x faster analytical performance for most queries — with some even 1,000x faster compared to MongoDB. The SingleStore MongoDB API proxy translates the MongoDB queries into SQL statements that are executed by SingleStoreDB to drive lightning-fast analytics for your applications.
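The announcement does not detail how the proxy performs this translation, but the idea can be illustrated with a toy sketch. The following is NOT SingleStore Kai’s actual proxy logic, just a hypothetical, simplified mapping of a MongoDB-style filter document onto a SQL statement:

```python
# Toy illustration of translating a simple MongoDB-style filter into SQL.
# Real wire-protocol translation handles the full query language; this
# sketch covers only equality and basic comparison operators.
OPS = {"$gt": ">", "$lt": "<", "$gte": ">=", "$lte": "<=", "$eq": "="}

def filter_to_sql(collection, query):
    clauses = []
    for field, cond in query.items():
        if isinstance(cond, dict):          # operator form, e.g. {"$gt": 5}
            for op, value in cond.items():
                clauses.append(f"{field} {OPS[op]} {value!r}")
        else:                               # implicit equality
            clauses.append(f"{field} = {cond!r}")
    where = " AND ".join(clauses) or "TRUE"
    return f"SELECT * FROM {collection} WHERE {where}"

sql = filter_to_sql("orders", {"status": "shipped", "qty": {"$gt": 5}})
# sql is "SELECT * FROM orders WHERE status = 'shipped' AND qty > 5"
```

The design point is that the client keeps speaking MongoDB while the execution happens on a SQL engine.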
– AI/vector functionality for JSON. The new era of generative AI requires real-time analytics on all data, including JSON collections. SingleStoreDB supports vectors and fast vector similarity search using dot_product and euclidean_distance functions. And with the launch of SingleStore Kai, developers can now utilize the vector and AI capabilities on JSON collections within MongoDB — powering use cases like semantic search, image recognition, similarity matching and more.
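To show what these two functions compute, here is the underlying math in plain Python. SingleStoreDB executes dot_product and euclidean_distance natively over vector columns; this sketch, with invented sample vectors, only illustrates how similarity search ranks candidates:

```python
import math

def dot_product(a, b):
    """Higher dot product means the vectors point in a more similar direction."""
    return sum(x * y for x, y in zip(a, b))

def euclidean_distance(a, b):
    """Lower distance means the vectors are closer together."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Hypothetical embeddings: rank stored documents against a query vector.
query = [1.0, 0.0, 2.0]
docs = {"doc1": [1.0, 0.0, 2.0], "doc2": [0.0, 1.0, 0.0]}
best = max(docs, key=lambda d: dot_product(query, docs[d]))
# best is "doc1", the document whose vector matches the query exactly
```

In a real application the vectors would be model-generated embeddings of text or images, and the ranking would drive use cases like the semantic search and similarity matching mentioned above.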
– Simplicity and ease of use. No code changes, data transformations, schema migrations or changes to existing queries. Developers can continue to use existing MongoDB queries and don’t have to normalize or flatten data, or do extensive schema migrations to power fast analytics for their applications.
– Same MongoDB tools and drivers. By supporting the MongoDB wire protocol, SingleStore Kai allows MongoDB clients to communicate with a SingleStoreDB cluster. This means that developers who are familiar with MongoDB can easily power fast analytics on SingleStoreDB without having to learn a new set of tools or APIs — and can continue to use the same MongoDB tools, drivers, skill sets and ecosystem their customers are most familiar with.
– Easy data replication. As part of the MongoDB API offering, SingleStore is also introducing a fast and efficient replication solution (in private preview) that can easily replicate MongoDB collections into SingleStoreDB. This service is natively integrated into SingleStoreDB and leverages one of the most widely used features – SingleStore Pipelines — to drive speedy replication and real-time CDC (Change Data Capture), enabling customers to get started quickly and easily.
– Best of both worlds (NoSQL + SQL). SingleStoreDB is already MySQL wire protocol compatible. With the addition of SingleStore Kai for MongoDB, developers can essentially get the best of both worlds – the schema flexibility and simplicity of a JSON document store together with the speed, efficiency and complex analytical capabilities that only a relational SQL database can provide to power applications.
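To make the query-translation step above concrete, here is an illustrative Python sketch of how a flat MongoDB filter document could be mapped to a SQL WHERE clause. This is not SingleStore's actual proxy logic; the operator table and string formatting are assumptions for demonstration only.

```python
# Illustrative sketch (not SingleStore's proxy code): translate a flat
# MongoDB-style filter document into a SQL SELECT statement.
OPS = {"$gt": ">", "$gte": ">=", "$lt": "<", "$lte": "<=", "$eq": "="}

def filter_to_sql(collection: str, query: dict) -> str:
    """Translate a flat MongoDB find() filter into a SQL SELECT statement."""
    clauses = []
    for field, cond in query.items():
        if isinstance(cond, dict):  # e.g. {"qty": {"$gt": 10}}
            for op, value in cond.items():
                clauses.append(f"{field} {OPS[op]} {value!r}")
        else:  # e.g. {"status": "shipped"} is an equality match
            clauses.append(f"{field} = {cond!r}")
    where = " AND ".join(clauses) or "TRUE"
    return f"SELECT * FROM {collection} WHERE {where}"

sql = filter_to_sql("orders", {"status": "shipped", "qty": {"$gt": 10}})
print(sql)  # SELECT * FROM orders WHERE status = 'shipped' AND qty > 10
```

A real translator also has to handle nested documents, arrays, projections, and the aggregation pipeline, which is where most of the complexity lives.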
“The demand for real time analytics is undeniable and critical to today’s economy,” said Raj Verma, CEO, SingleStore. “With SingleStore Kai, we’re enabling any developer using MongoDB’s NoSQL platform to use SingleStore’s SQL analytics data platform, at orders of magnitude improved performance, without changing a line of code.”
“The reality is that MongoDB is not performant enough for powering fast analytics on JSON and joining complex JSON arrays can be both time consuming and costly,” said Yatharth Gupta, SVP Product, SingleStore. “With SingleStore Kai, developers now have a one stop shop to supercharge analytics on their MongoDB applications without having to recode or learn anything new – that’s a huge win-win.”
“This is a fantastic solution and a timely one,” said Kumaran Vijayakumar, CEO and Co-Founder, DataDock Solutions. “At DataDock, we help the world’s largest hedge funds structure products and execute trades. SingleStore helps us make exactly the right data and analytics available to our customers at the right time, in real time, and allows us to handle large volumes of data with ease.”
SingleStoreDB powers real-time data innovation for hundreds of customers including more than 100 Fortune 500, Forbes Global 2000, and Inc. 5000 brands across financial services and fintech, telecom and networking, streaming media, adtech, martech, supply chain logistics, and other verticals. Companies like 6sense, Cisco, Comcast, Dell, Disney, Heap, Hulu, LiveRamp, NBC, Siemens, SiriusXM/Pandora, Sony, Thorn, Uber, Western Digital, and others use SingleStoreDB to fuel real-time customer experience analytics and interactive dashboards.
“SingleStore has consistently demonstrated its ability to innovate and evolve,” said Carl Olofson, Analyst, IDC. “We’ve seen a growing demand in the analytic database market for leveraging a range of data, including JSON documents, together with relational tables in a single system. SingleStore Kai is an outstanding example of such leveraging, as it makes analytics on JSON fast and easy within SingleStore’s traditional SQL system.”
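The dot_product and euclidean_distance functions referenced in the announcement have simple mathematical definitions. The sketch below shows their semantics in plain Python, with invented document embeddings, to illustrate how similarity ranking works; it is not SingleStoreDB code.

```python
import math

def dot_product(a, b):
    # Higher dot product = vectors point in more similar directions.
    return sum(x * y for x, y in zip(a, b))

def euclidean_distance(a, b):
    # Lower distance = vectors are closer in the embedding space.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Rank documents by similarity to a query embedding (embeddings invented).
docs = {
    "doc_a": [0.9, 0.1, 0.0],
    "doc_b": [0.1, 0.8, 0.1],
}
query = [1.0, 0.0, 0.0]
best = max(docs, key=lambda d: dot_product(docs[d], query))
print(best)  # doc_a
```

In practice the embeddings come from a model and the comparison runs inside the database over JSON collections, but the ranking logic is exactly this.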
Additional Resources:
- Try SingleStore Kai for MongoDB
- Upvote SingleStore Kai for MongoDB on ProductHunt
- Read the SingleStore blog introducing SingleStore Kai for MongoDB
About SingleStore:
The world’s leading brands rely on data – to make the right business decisions, to deliver exceptional customer experiences and to stay ahead of the competition. This reliance on data brings with it a need for simplicity, speed and scale. SingleStore delivers the world’s fastest distributed SQL database for real-time applications, SingleStoreDB. By combining transactional and analytical workloads, SingleStore eliminates performance bottlenecks and unnecessary data movement to support constantly growing, demanding workloads. Digital giants like Hulu, Uber and Comcast, and many more of the world’s leading SaaS providers, choose SingleStore to unleash the power of their data – supercharging exceptional, real-time data experiences for their customers. Follow us @SingleStoreDB and @SingleStoreDevs on Twitter or visit www.singlestore.com.
View source version on businesswire.com: https://www.businesswire.com/news/home/20230518005338/en/
Contacts
Jessica Rampen
Director of Communications & PR at SingleStore
jrampen@singlestore.com

MMS • RSS
Posted on nosqlgooglealerts. Visit nosqlgooglealerts

U.S. (New York) – The newly published report on the “NoSQL Market” for the years 2023-2031 comprehensively covers all aspects of the market, presenting the latest information on current trends. It serves as a valuable source of insightful information for business strategists. The report conducts an in-depth assessment of various industry features, including market overview, current development evaluations, historical and future valuation, recent trends, Porter’s five forces analysis, technological advancements, SWOT assessments, and the presence of clients in different regions.
NoSQL (Not Only SQL) is a database mechanism developed for the storage, analysis, and access of large volumes of unstructured data. NoSQL allows schema-less data storage, which is not possible with relational database storage. The benefits of using NoSQL databases include high scalability, simpler designs, and higher availability with more precise control. The ability to comfortably manage big data is another significant reason for the adoption of NoSQL databases. NoSQL technology is an emerging segment of the database market and is expected to grow rapidly over the next few years.
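The "schema-less" point above can be illustrated with a toy example: documents stored in one collection need not share fields, and query code tolerates the differences. The data below is invented.

```python
# Documents in one NoSQL-style collection need not share a schema: each
# record carries its own fields, unlike rows in a fixed relational table.
collection = [
    {"_id": 1, "name": "Alice", "email": "alice@example.com"},
    {"_id": 2, "name": "Bob", "tags": ["admin"], "last_login": "2023-05-01"},
]

# Query-side code tolerates missing fields instead of relying on fixed columns.
admins = [d["name"] for d in collection if "admin" in d.get("tags", [])]
print(admins)  # ['Bob']
```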
Download Free Sample of This Strategic Report:https://reportocean.com/industry-verticals/sample-request?report_id=AMR855
This research provides valuable information to enhance understanding, broaden scope, determine future roadmaps, and explore the types and applications discussed in the report. Furthermore, the report includes a detailed study of market segmentation, company profiles, as well as regional and country breakdowns. By reading this research, readers will gain a clear and precise understanding of the market’s overall progress, enabling them to make informed decisions that will prove beneficial.
The rise of social applications such as games, blogs, and portals like Facebook, LinkedIn, and matrimonial sites has led to a surge in semi-structured and unstructured data. NoSQL is currently the most feasible technology to store and manage this data. The impact of this factor is expected to grow as structured and unstructured data from applications such as social media, retail transactions, and web applications continues to increase. Moreover, as NoSQL is well suited to agile app development, the growth of the app economy is set to increase NoSQL adoption in the coming years, which in turn is expected to drive high market growth.
KEY BENEFITS FOR STAKEHOLDERS
– The study provides an in-depth analysis of the current & future trends of the market to elucidate the imminent investment pockets.
– Information about key drivers, restraints, and opportunities and their impact analysis on the global NoSQL market size is provided.
– Porter’s five forces analysis illustrates the potency of the buyers and suppliers operating in the NoSQL industry.
– The quantitative analysis of the market from 2018 to 2026 is provided to determine the global NoSQL market potential.
KEY MARKET PLAYERS
– Aerospike, Inc.
– Amazon Web Services, Inc.
– DataStax, Inc.
– Microsoft Corporation
– Couchbase, Inc.
– Google LLC
– MarkLogic Corporation
– MongoDB, Inc.
– Neo Technology, Inc.
– Objectivity, Inc.
KEY MARKET SEGMENTS
By Type
– Key-Value Store
– Document Database
– Column-based Store
– Graph Database
By Application
– Data Storage
o Distributed Data Depository
o Cache Memory
o Metadata Store
– Mobile Apps
– Data Analytics
– Web Apps
– Others (E-commerce and Social Networks)
By Industry Vertical
– Retail
– Gaming
– IT
– Others
By Region
– North America
o U.S.
o Canada
– Europe
o Germany
o France
o UK
o Rest of Europe
– Asia-Pacific
o Japan
o China
o India
o Rest of Asia-Pacific
– LAMEA
o Latin America
o Middle East
o Africa
The Market Report covers various topics including:
- Top competitors in the global and regional market.
- Company profiles of major market participants.
- Technological prowess, future plans, and manufacturing, production, and sales of top manufacturers.
- In-depth explanations of market growth factors and discussions of various end users.
- Major global market application segments.
- SWOT analysis, Porter’s five forces analysis, and Patent analysis.
- Opinions and viewpoints of professionals and industry experts, along with export/import regulations.
The report focuses on the following key points:
Market Players and Competitor Analysis: It covers industry players’ profiles, product specifications, production capacity/sales, revenue, price, gross margin, and sales. It provides a comprehensive analysis of the market’s competitive landscape, including information on vendors and factors that may challenge the growth of major market players.
Global and Regional Analysis: The report presents the current status and outlook of the global and regional markets. It offers detailed breakdowns for each region and country, including sales, sales volume, and revenue forecasts. The analysis is further segmented by types and applications.
Market Trends: The report highlights recent market developments that assist organizations in formulating more profitable strategies. It analyzes current trends, allowing customers to understand upcoming products and businesses to plan for improved solutions.
Opportunities and Drivers: The report identifies opportunities and drivers that businesses can leverage by making necessary preparations. It helps stakeholders and report buyers plan their investments and maximize returns.
Porter’s Five Forces Analysis: The report assesses the state of competition in the industry based on five fundamental forces: threat of new entrants, bargaining power of suppliers, bargaining power of buyers, threat of substitute products or services, and existing industry rivalry.
Request full Report :- https://reportocean.com/industry-verticals/sample-request?report_id=AMR855
About Report Ocean:
We are a leading market research reports provider in the industry. Report Ocean is a world-leading research company, known for its informative research reports. We are committed to providing our clients with both quantitative and qualitative research results. As part of our global network and comprehensive industry coverage, we offer in-depth knowledge that enables informed, strategic business decisions. We utilize the most recent technology and analysis tools along with our own unique research models and years of expertise, which help us create the details and facts our clients need.
Get in Touch with Us:
Report Ocean:
Email: sales@reportocean.com
Address: 500 N Michigan Ave, Suite 600, Chicago, Illinois 60611 – UNITED STATES
Tel: +1 888 212 3539 (US – TOLL FREE)
Website: https://reportocean.com

MMS • RSS
Posted on nosqlgooglealerts. Visit nosqlgooglealerts
Solutions Review’s Expert Insights Series is a collection of contributed articles written by industry experts in enterprise software categories. In this feature, Yugabyte Founder and CTO Karthik Ranganathan offers commentary on the evolution of operational databases.
As enterprises move applications and services to the cloud, they are migrating from big iron scale-up proprietary infrastructure to cloud native architectures. At first glance, traditional database management systems may appear cloud-ready for this critical change, but in reality they still face significant challenges around performance, availability, and simplicity when run at scale. Luckily, the evolution of operational databases has finally reached a point where it is ready to match cloud and modern application layers with a cloud native database. To understand how we got to where we are today, let’s walk through this database evolution and define what “cloud native” really means.
Moving from Monolithic to Distributed SQL: Availability and Sharding
The foundation of modern transactional databases starts with relational, or Structured Query Language (SQL), databases. These databases have set the standard for Online Transaction Processing (OLTP) for the past four decades. They have proved reliable, but are monolithic in nature, running on a single server. Increasing capacity requires vertical hardware scaling, which can be difficult and costly because it requires more specialized, expensive servers. Their method of copying data, via async or sync replication, can also result in data loss (with async) or downtime (with sync). SQL databases weren’t built to scale out to cloud architectures, so these problems are unsurprising.
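The data-loss versus downtime tradeoff described above can be shown with a toy simulation, where an async replica lags the primary by design. This is a deliberately simplified model, not how any real database replicates.

```python
# Toy model of the replication tradeoff: with async replication, a primary
# crash can lose acknowledged writes that never reached the replica; with
# sync replication nothing is lost, but every write waits on the replica.
def run(writes, mode):
    primary, replica = [], []
    for i, w in enumerate(writes):
        primary.append(w)
        if mode == "sync":
            replica.append(w)   # write acknowledged only after replica has it
        elif i % 2 == 0:        # async: replica lags behind the primary
            replica.append(w)
    # Primary crashes; replica is promoted. Anything not replicated is lost.
    lost = [w for w in primary if w not in replica]
    return lost

print(run(["a", "b", "c", "d"], "sync"))   # []
print(run(["a", "b", "c", "d"], "async"))  # ['b', 'd']
```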
A significant step towards a viable solution came with sharded and distributed databases – first NoSQL, then NewSQL. However, NoSQL also has its limits. NoSQL databases emphasize scalability and high availability, but compromise on consistency, ACID transactions, and familiar relational data modeling. They are typically used for workloads with very simple data access patterns without requiring transactional guarantees. They are not ideal for business-critical OLTP or HTAP applications that require referential integrity, transactional capabilities and flexible data access patterns using features such as joins.
The tradeoffs between availability (NoSQL) and RDBMS capabilities (NewSQL) force organizations to make a choice. Many early NoSQL adopters (mostly large tech companies) are having second thoughts about pouring more money into that strategy. NewSQL databases are cloud hosted, but not fully cloud native. They represent a critical step in the evolution of databases, but often lead to operational complexity, difficult application development, and inconsistent customer experiences.
Operational Database Evolution
Solving NoSQL Challenges with Distributed SQL
The next stage in the evolution of databases is distributed SQL, which combines the strongest aspects of traditional relational database management systems (RDBMS) with key cloud native capabilities, including scale and availability, made popular by NoSQL databases.
Distributed SQL offers continuous availability, replicating data across nodes and keeping it available regardless of node, zone, region, or data center failures. It allows organizations to scale horizontally on demand without impacting performance and is strongly consistent across geographic zones. It offers developers familiar RDBMS features, allowing them to build data-driven applications, and can strengthen security by providing encryption at rest and in transit.
Although no database reference architecture is perfect for every application, a cloud native distributed SQL database solves many of the challenges that plague NoSQL and other approaches, providing a consistent, versatile, cloud native data layer.
Defining “Cloud Native” for Databases
Cloud native is one of the more recent phases of the operational database evolution. Cloud native is rapidly becoming common practice at the application layer, as enterprises move transactions, microservices, and other applications to dynamic, containerized environments. The same can’t be said of the data layer that supports those applications.
NoSQL and NewSQL databases can be hosted in the cloud and horizontally deployed on cloud infrastructure, but they fall short, in one way or another, of keeping the core database functionality uniform and reliable.
A true cloud native database has five defining characteristics:
- Extreme elasticity, with the ability to scale clusters up and down both quickly and reliably, allowing the database to use the virtually limitless resources of the cloud.
- Geo-redundancy and always-on availability to easily create multi-AZ and/or multi-region clusters, and expand or shrink availability zones or regions at any time while remaining resilient to both unplanned failures and planned upgrades. This is important because even the best public cloud services have suffered failures at the zone and region level.
- Simplified management, from hardware flexibility to seamless upgrades, allowing movement from one type of compute or storage to another for cost or performance reasons, and adoption of the latest technology advances in the cloud.
- Multi-cloud mobility to avoid cloud lock-in by moving to another cloud provider or co-existing with multiple providers.
- Data placement policies that define and enforce geo-specific data residency controls without impacting applications.
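As one way to picture the data-placement characteristic above, here is a hypothetical routing policy that pins rows to a region by country. The region names and policy shape are invented for illustration and do not reflect any particular database's API.

```python
# Hypothetical geo-residency placement policy: each row is routed to a
# cluster region based on the row's country, so EU data stays on EU nodes.
PLACEMENT_POLICY = {
    "DE": "eu-central", "FR": "eu-central",
    "US": "us-east", "CA": "us-east",
}

def place(row: dict) -> str:
    region = PLACEMENT_POLICY.get(row["country"])
    if region is None:
        # Refuse to store data with no residency rule rather than guess.
        raise ValueError(f"no residency rule for {row['country']!r}")
    return region

print(place({"user": "u1", "country": "DE"}))  # eu-central
print(place({"user": "u2", "country": "US"}))  # us-east
```

The point of enforcing this at the database layer is that applications never have to encode residency rules themselves.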
Today, business happens in the cloud. Enterprises will continue to expand operations beyond their own data centers and embrace the public cloud.
While a cloud approach helps organizations operate efficiently and effectively, enterprises need a multi-cloud model that gives them the same cloud native agility they get from microservices and containers. Modern, cloud native operational databases, like distributed SQL databases, fully support those operations, helping enterprises deliver high-value services much faster than legacy databases.


MMS • Ben Linders
Article originally posted on InfoQ. Visit InfoQ

Artificial intelligence (AI) can help companies identify new opportunities and products, and stay ahead of the competition. Senior software managers should understand the basics of how this new technology works, why agility is important in developing AI products, and how to hire or train people for new roles.
Zorina Alliata spoke about leading AI change at OOP 2023 Digital.
In recent studies, 57% of companies said they will use AI and ML in the next three years, Alliata explained:
Chances are, your company already uses some form of AI or ML. If not, there is a high chance that they will do so in the very near future in order to stay competitive.
Alliata mentioned that AI and ML are increasingly being used in a variety of industries, from movie recommendations to self-driving cars, and are expected to have a major impact on businesses in the coming years.
Software leaders should be able to understand how the delivery of ML models is different from regular software development. To manage the ML development process correctly, it is important to have agility by using a methodology that allows for quick pivots, iterations, and continuous improvement, Alliata said.
According to Alliata, software leaders should be prepared to hire or train for new roles such as data scientist, data engineer, ML engineer. She mentioned that such roles might not yet exist in current software engineering teams, and they require very specific skills.
InfoQ interviewed Zorina Alliata about adopting AI and ML in companies.
InfoQ: Why should companies care about artificial intelligence and machine learning?
Zorina Alliata: AI and ML can help companies to make better decisions, increase efficiency, and reduce costs. With AI and ML they can automate repetitive processes and improve the customer experience significantly.
A few years ago when I had a fender bender with my car, I had to communicate with my insurance company through phone calls, and take time off work to take my car to specific repair shops. Just last year when my teenage son bumped his car in the parking lot, he used his mobile app to communicate with the insurance company right away, upload images of the car damage, get a rental car, and arrange for his car to be dropped off for repairs by a technician. He could see the status of the repairs online, he received automatic reports and his car was delivered at home when fixed. Behind his pleasant experience, there was a lot of AI and ML – image recognition, chatbots, sentiment analysis.
Another thing companies can benefit from is mining insights from data. For example, looking at all your sales data, the algorithms might find patterns that were not previously known. A common use for this is in segmenting and clustering populations in order to better define a focused message. If you can cluster all people with a high propensity to buy a certain type of insurance policy, then your marketing campaigns can be much more effective.
InfoQ: What should senior software managers know about artificial intelligence and machine learning?
Alliata: Let me give you an example. We sometimes do what we call unsupervised learning – that is, we analyse huge quantities of data just to see what patterns we can find. There is no clear variable to optimize, there is no defined end result.
Many years ago, I read about this airline that used unsupervised learning on their data and the machine came back with the following insight: it found that people who were born on a Tuesday were more likely to order vegetarian meals on a flight. This was not a question anyone had posed, or an insight anyone was ready for.
As a software development manager, how do you plan for whatever weird or amazing insight the algorithms will deliver? We just might not even know what we are looking for until later in the project. This is very different from regular software development where we have a very clear outcome stated from the beginning, for example: display all flyers and their meals on a webpage.
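The airline anecdote is essentially co-occurrence mining. The sketch below counts attribute-pair frequencies over made-up passenger records and surfaces the most common pair, which is the crude core of what association-rule mining automates at scale.

```python
from itertools import combinations
from collections import Counter

# Invented passenger records: which attribute pairs co-occur most often?
passengers = [
    {"born": "Tue", "meal": "veg"},
    {"born": "Tue", "meal": "veg"},
    {"born": "Wed", "meal": "meat"},
    {"born": "Tue", "meal": "veg"},
    {"born": "Thu", "meal": "meat"},
]

pairs = Counter()
for p in passengers:
    # Count every pair of (attribute, value) items appearing together.
    for a, b in combinations(sorted(p.items()), 2):
        pairs[(a, b)] += 1

top, count = pairs.most_common(1)[0]
print(top, count)  # (('born', 'Tue'), ('meal', 'veg')) 3
```

Real association-rule miners add support and confidence thresholds so that only statistically interesting patterns, rather than every coincidence, are surfaced.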
InfoQ: What can companies do to prepare themselves for AI adoption?
Alliata: Education comes first. As a leader, you should understand what the benefits of using AI and ML are for your company, and understand a bit about how the technology works. Also, it is your task to communicate and openly discuss how AI will change the work and how it will affect the people in their current jobs.
Having a solid strategy and a solid set of business use cases that will provide real value is a great way to get started, and to use as your message and vision.
Promoting lean budgeting and agile teams will help quickly show value before large investments in AI resources and technology are made.
Establishing a culture of continuous improvement and continuous learning is also necessary. The technology is changing constantly and the development teams need time to keep up with the newest research and innovation.
First Real-Time Virtual Database Platform to Deliver Support for Both SQL and noSQL Data …

MMS • RSS
Posted on nosqlgooglealerts. Visit nosqlgooglealerts

Boston, MA, May 18, 2023 –(PR.com)– First Real-time Virtual Database Platform to Deliver Support for both SQL and noSQL data source access as a single view
Looks and feels like a relational database
Support for complex data, including JSON and COBOL
Data policy support
Enhanced security based on Hippocratic security
Provides both SQL and noSQL APIs
Core product features are expanded with add-ons
dbSpaces Ltd. announced today the general availability of dbSpaces CORE v1.0, dbSpaces MongoSQL v1.0 and dbSpaces COBOL v1.0 servers, after more than 60 person-years of effort.
dbSpaces allows organizations to orchestrate and unify their data into a single view regardless of location and format in real-time allowing them to perform ETL, Reporting/Business Intelligence, Business Processes, or Artificial Intelligence requests against data assets stored in SQL or noSQL databases.
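Conceptually, a virtual "single view" federates heterogeneous sources behind one schema. The sketch below is an invented Python illustration of that idea and is in no way dbSpaces' actual implementation or API.

```python
# Invented illustration of a federated "single view": rows from a relational
# source and documents from a noSQL source are normalized into one virtual
# table that callers query uniformly.
sql_rows = [(1, "Alice", "US"), (2, "Bob", "DE")]           # relational source
nosql_docs = [{"id": 3, "name": "Carol", "country": "FR"}]  # document source

def unified_view():
    for cid, name, country in sql_rows:
        yield {"id": cid, "name": name, "country": country}
    for doc in nosql_docs:
        yield {"id": doc["id"], "name": doc["name"], "country": doc["country"]}

# A caller filters the unified view without caring where each row came from.
eu = [r["name"] for r in unified_view() if r["country"] in {"DE", "FR"}]
print(eu)  # ['Bob', 'Carol']
```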
dbSpaces does not provide separate standard or enterprise servers; instead, it provides a single CORE server that can be expanded with add-ons, allowing companies to purchase only what they need.
Current add-ons available include: Advanced Services, Events & Alerts, Data Policy, Data Sensitivity, Data Orchestration and Hippocratic Security.
dbSpaces does provide dedicated servers for MongoDB and COBOL, providing SQL access to developers and organizations who only want to access these data sources.
dbSpaces enables COBOL modernization, allowing applications to store their COBOL data in any supported data source. COBOL applications can also access data stored in other data sources and supported message queues.
dbSpaces Hippocratic security provides the ability to restrict access to data with a finer grain of control using straightforward SQL-based commands. In dbSpaces, in addition to creating Roles, you can create both Purpose and Recipient privileges for accessing data.
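Purpose- and recipient-based privileges, in the spirit of Hippocratic databases, can be pictured as a grant table keyed on (role, purpose, recipient). The rules and names below are invented for illustration; dbSpaces expresses these grants through its own SQL-based commands.

```python
# Invented sketch of purpose- and recipient-based access control: access is
# granted only when role, purpose, and recipient all match a declared grant.
GRANTS = {
    # (role, purpose, recipient) -> columns the caller may read
    ("analyst", "billing", "internal"): {"customer_id", "amount"},
    ("support", "service", "customer"): {"customer_id", "email"},
}

def allowed_columns(role, purpose, recipient):
    # No matching grant means no access at all, not partial access.
    return GRANTS.get((role, purpose, recipient), set())

print(sorted(allowed_columns("analyst", "billing", "internal")))
# ['amount', 'customer_id']
print(sorted(allowed_columns("analyst", "marketing", "internal")))  # []
```

The key difference from plain role-based access control is that the same role sees different data depending on why it is asking and who will receive the result.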
Advanced Services provides support for multiple database instances, replication, table partitions, and enhanced server and user access controls. Access to backend data sources can be restricted by day and time.
dbSpaces is a game changer for organizations wanting to securely access all of their valuable data assets. Not based on complex frameworks, but on a virtual database that all organizations already understand.
For further information contact the marketing department at marketing_dept@dbspaces.com
First Real-Time Virtual Database Platform to Deliver Support for Both SQL and noSQL Data …

MMS • RSS
Posted on nosqlgooglealerts. Visit nosqlgooglealerts

Boston, MA May 18, 2023 –(PR.com)– First Real-time Virtual Database Platform to Deliver Support for both SQL and noSQL data source access as a single view
Looks and feels like a relational database
Support for complex data, including JSON and COBOL
Data policy support
Enhanced security based on Hippocratic security
Provides both SQL and noSQL APIs
Core product features are expanded with add-ons
dbSpaces Ltd. announced today the general availability of the dbSpaces CORE v1.0, dbSpaces MongoSQL v1.0, and dbSpaces COBOL v1.0 servers, after more than 60 person-years of development effort.
dbSpaces lets organizations orchestrate and unify their data into a single real-time view, regardless of location and format, so they can run ETL, reporting/business intelligence, business process, or artificial intelligence workloads against data assets stored in SQL or noSQL databases.
Rather than offering separate standard and enterprise editions, dbSpaces provides a single CORE server that can be expanded with add-ons, so companies purchase only what they need.
Current add-ons include: Advanced Services, Events & Alerts, Data Policy, Data Sensitivity, Data Orchestration, and Hippocratic Security.
dbSpaces also provides dedicated servers for MongoDB and COBOL, giving SQL access to developers and organizations that only need to reach these data sources.
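dbSpaces' own translation layer is proprietary and not described in this release, but the general idea behind giving SQL access to a document store like MongoDB can be sketched: map a simple SELECT statement onto a MongoDB-style find() specification (collection, projection, filter). Everything below, including the toy grammar it supports, is an illustrative assumption, not dbSpaces code.

```python
import re

# Conceptual sketch only: translate a trivial SQL SELECT into a
# MongoDB-style find() specification. NOT dbSpaces code -- just an
# illustration of how a SQL facade over a document store can work.

_OPS = {">": "$gt", "<": "$lt", ">=": "$gte", "<=": "$lte", "=": "$eq", "!=": "$ne"}

def sql_to_mongo(sql: str) -> dict:
    """Handle only: SELECT cols FROM table [WHERE col OP literal]."""
    m = re.match(
        r"SELECT\s+(?P<cols>.+?)\s+FROM\s+(?P<table>\w+)"
        r"(?:\s+WHERE\s+(?P<col>\w+)\s*(?P<op><=|>=|!=|=|<|>)\s*(?P<val>\S+))?\s*$",
        sql.strip(), re.IGNORECASE)
    if not m:
        raise ValueError(f"unsupported SQL: {sql!r}")
    cols = [c.strip() for c in m["cols"].split(",")]
    projection = {} if cols == ["*"] else {c: 1 for c in cols}
    filt = {}
    if m["col"]:
        raw = m["val"].strip("'\"")
        val = int(raw) if raw.lstrip("-").isdigit() else raw
        filt = {m["col"]: {_OPS[m["op"]]: val}}
    return {"collection": m["table"], "projection": projection, "filter": filt}
```

A real virtual-database layer must additionally handle joins, type coercion, and predicate pushdown; the sketch only shows the shape of the mapping.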
dbSpaces supports COBOL modernization by letting applications store their COBOL data in any supported data source. COBOL applications can also access data stored in other data sources and in supported message queues.
dbSpaces Hippocratic Security restricts access to data at a finer grain of control using straightforward SQL-based commands. In addition to creating Roles, dbSpaces lets you create both Purpose and Recipient privileges for accessing data.
The Advanced Services add-on provides support for multiple database instances, replication, table partitions, and enhanced server and user access controls. Access to backend data sources can also be restricted by day and time.
dbSpaces is a game changer for organizations that want to securely access all of their valuable data assets: it is built not on complex frameworks, but on a virtual database model that organizations already understand.
For further information, contact the marketing department at marketing_dept@dbspaces.com
Contact Information:
dbSpaces Ltd.
Scott Jones
+44 781641640
Contact via Email
www.dbspaces.com
Read the full story here: https://www.pr.com/press-release/887175
Press Release Distributed by PR.com
Copyright © 2023 PR.com and its licensors