Month: June 2023
AI, ML, Data Engineering News Round Up: Vertex, AlphaDev, Function Calling, Gorilla, and Falcon
MMS • Daniel Dominguez
Article originally posted on InfoQ. Visit InfoQ
This update, covering the week of June 12th, 2023, highlights recent advancements and announcements in data science, machine learning, and artificial intelligence. This week’s spotlight falls on Google, OpenAI, UC Berkeley, and AWS.
Generative AI Support on Vertex AI Is Now Generally Available
Google has introduced the availability of Generative AI support on Vertex AI, enabling customers to utilize the latest platform features in creating and operating personalized generative AI applications. This update empowers developers to utilize various tools and resources, including the PaLM 2 powered text model, the Embeddings API for text, and foundational models in Model Garden. Additionally, the Generative AI Studio offers user-friendly tools for fine-tuning and deploying models. With the backing of enterprise-level data governance, security, and safety measures, Vertex AI simplifies the process for customers to access foundational models, customize them with their own data, and swiftly develop generative AI applications.
DeepMind Introduces AlphaDev
Google DeepMind has unveiled AlphaDev, an artificial intelligence system that leverages reinforcement learning to uncover improved computer science algorithms, surpassing the ones developed by scientists and engineers over many years. AlphaDev has discovered a more efficient sorting algorithm, which is used for organizing data. These algorithms play a fundamental role in various aspects of our daily lives, from ranking online search results and social media posts to data processing on computers and smartphones. The utilization of AI to generate superior algorithms is poised to revolutionize computer programming and have a significant impact on all facets of our ever-growing digital society.
OpenAI Announces Function Calling
OpenAI has introduced updates to the API, including a capability called function calling, which allows developers to describe functions to GPT-4 and GPT-3.5 and have the models create code to execute those functions. Function calling facilitates the development of chatbots capable of leveraging external tools, transforming natural language into database queries, and extracting structured data from text. These models have undergone fine-tuning to not only identify instances where a function should be invoked but also provide JSON responses that align with the function signature.
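The flow can be sketched with the 2023-era OpenAI Python SDK conventions: the model is given JSON Schema descriptions of available functions and, when it decides one should be called, returns the function name plus JSON-encoded arguments for the application to execute. The `get_current_weather` function and its schema below are illustrative, not part of OpenAI's API, and the network call itself is omitted so the dispatch logic stands alone.

```python
import json

# Illustrative function schema in the shape OpenAI's function calling expects;
# the get_current_weather name and parameters are hypothetical.
functions = [{
    "name": "get_current_weather",
    "description": "Get the current weather for a city",
    "parameters": {
        "type": "object",
        "properties": {
            "city": {"type": "string"},
            "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
        },
        "required": ["city"],
    },
}]

def get_current_weather(city, unit="celsius"):
    # Stand-in implementation; a real app would call a weather API here.
    return {"city": city, "temperature": 22, "unit": unit}

def dispatch(message):
    """Route an assistant message that may have requested a function call.

    When the model decides a function should be invoked, the message carries a
    `function_call` dict with a name and JSON-encoded arguments; otherwise the
    message is a plain text answer.
    """
    call = message.get("function_call")
    if call is None:
        return message.get("content")  # plain answer, no tool needed
    args = json.loads(call["arguments"])  # arguments arrive as a JSON string
    if call["name"] == "get_current_weather":
        return get_current_weather(**args)
    raise ValueError(f"unknown function: {call['name']}")

# Simulated assistant message, shaped like the API's response:
reply = {"function_call": {"name": "get_current_weather",
                           "arguments": '{"city": "Berlin"}'}}
print(dispatch(reply))
```

In a real application the schema in `functions` is passed to the chat completion call, and the value returned by `dispatch` is sent back to the model as a function-role message so it can compose a final answer.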
UC Berkeley Develops Gorilla, an LLM Connected with APIs
Researchers from UC Berkeley released Gorilla, a fine-tuned LLaMA-based model that surpasses GPT-4 in terms of performance when generating API calls. Its integration with a document retriever allows Gorilla to effectively adapt to changes in documents during testing, enabling flexible API updates and version changes. Moreover, Gorilla significantly addresses the issue of hallucination often encountered when directly prompting LLMs. To assess the model’s capabilities, APIBench, a comprehensive dataset comprising HuggingFace, TorchHub, and TensorHub APIs, has been introduced. The successful combination of the retrieval system with Gorilla showcases the potential for LLMs to utilize tools more accurately, stay updated with frequently changing documentation, and enhance the reliability and practicality of their outputs.
AWS Introduces Falcon, a Foundational LLM Built and Trained with SageMaker
AWS and the UAE Technology Innovation Institute have launched Falcon LLM, a foundational large language model with 40 billion parameters. Falcon matches the performance of other high-performing LLMs, and is the top-ranked open-source model on the public Hugging Face Open LLM leaderboard. It’s available as open source in two different sizes, Falcon-40B and Falcon-7B, and was built from scratch using data preprocessing and model training jobs built on Amazon SageMaker. Open-sourcing Falcon-40B enables users to construct and customize AI tools that cater to unique user needs, facilitating seamless integration and ensuring the long-term preservation of data assets. The model weights are available to download, inspect, and deploy anywhere.
MMS • RSS
Posted on mongodb google news. Visit mongodb google news
HTF MI introduces new research on Real-Time Analytics covering the micro level of analysis by competitors and key business segments. The Real-Time Analytics report explores a comprehensive study of various segments like opportunities, size, development, innovation, sales, and overall growth of major players. The research is carried out on primary and secondary statistical sources and consists of both qualitative and quantitative detailing. Some of the major key players profiled in the study are Microsoft, SAP, Oracle, IBM, Informatica, Amdocs, Infosys, Google, Impetus Technologies & MongoDB.
Acquire Sample Report + All Related Table and Graphs@: https://www.htfmarketreport.com/sample-report/3751925-real-time-analytics-market-1
If you are involved in the industry, or intend to be, this study will give you a comprehensive perspective. It is crucial to keep your knowledge up to date, segmented by Applications [BFSI, Manufacturing, Media and Entertainment, Government, Retail and Wholesale, Military, Warehouses & Scientific Analysis], Product Types [Processing in Memory, In-Database Analytics, Data Warehouse Appliances, In-Memory Analytics & Massively Parallel Programming], and other significant parts of the business.
For more data or any query mail at sales@htfmarketreport.com
Which market aspects are illuminated in the report?
Executive Summary: It covers a summary of the most vital studies, the Real-Time Analytics market increasing rate, modest circumstances, market trends, drivers and problems as well as macroscopic pointers.
Study Analysis: Covers major companies, vital market segments, the scope of the products offered in the Real-Time Analytics market, the years measured, and the study points.
Company Profile: Each firm profiled in this segment is screened based on its products, value, SWOT analysis, capacity, and other significant features.
Manufacture by Region: This Real-Time Analytics report offers data on imports and exports, sales, production, and key companies in all studied regional markets.
Highlights of Real-Time Analytics Market Segments and Sub-Segments:
Real-Time Analytics Market by Key Players: Microsoft, SAP, Oracle, IBM, Informatica, Amdocs, Infosys, Google, Impetus Technologies & MongoDB
Real-Time Analytics Market by Types: Processing in Memory, In-Database Analytics, Data Warehouse Appliances, In-Memory Analytics & Massively Parallel Programming
Real-Time Analytics Market by End-User/Application: BFSI, Manufacturing, Media and Entertainment, Government, Retail and Wholesale, Military, Warehouses & Scientific Analysis
Real-Time Analytics Market by Geographical Analysis: North America, US, Canada, Mexico, Europe, Germany, France, U.K., Italy, Russia, Nordic Countries, Benelux, Rest of Europe, Asia, China, Japan, South Korea, Southeast Asia, India, Rest of Asia, South America, Brazil, Argentina, Rest of South America, Middle East & Africa, Turkey, Israel, Saudi Arabia, UAE & Rest of Middle East & Africa
Get Year End Discount (10-40% off) on the Real-Time Analytics Market Report: https://www.htfmarketreport.com/request-discount/3751925-real-time-analytics-market-1
The study is a source of reliable data on: market segments and sub-segments; market trends and dynamics; supply and demand; market size; current trends, opportunities, and challenges; the competitive landscape; technological innovations; the value chain; and investor analysis.
Interpretative Tools in the Market: The report integrates the entirely examined and evaluated information of the prominent players and their position in the market by methods for various descriptive tools. The methodical tools including SWOT analysis, Porter’s five forces analysis, and investment return examination were used while breaking down the development of the key players performing in the market.
Key Growths in the Market: This section of the report incorporates the essential enhancements of the market, including assertions, collaborations, R&D, new product launches, joint ventures, and associations of leading participants working in the market.
Key Points in the Market: The key features of this Real-Time Analytics market report include production, production rate, revenue, price, cost, market share, capacity, capacity utilization rate, import/export, supply/demand, and gross margin. Key market dynamics plus market segments and sub-segments are covered.
Basic Questions Answered
*Who are the key market players in the Real-Time Analytics Market?
*Which major regions, across dissimilar trades, are expected to witness astonishing growth in the Real-Time Analytics Market?
*What are the regional growth trends and the leading revenue-generating regions for the Real-Time Analytics Market?
*What are the major segments by type for Real-Time Analytics?
*What are the major applications of Real-Time Analytics?
*Which Real-Time Analytics technologies will top the market in the next decade?
Examine Detailed Index of full Research Study at@: https://www.htfmarketreport.com/reports/3751925-real-time-analytics-market-1
Table of Content
Chapter One: Industry Overview
Chapter Two: Major Segmentation (Classification, Application, etc.) Analysis
Chapter Three: Production Market Analysis
Chapter Four: Sales Market Analysis
Chapter Five: Consumption Market Analysis
Chapter Six: Production, Sales, and Consumption Market Comparison Analysis
Chapter Seven: Major Manufacturer’s Production and Sales Market Comparison Analysis
Chapter Eight: Competition Analysis by Players
Chapter Nine: Marketing Channel Analysis
Chapter Ten: New Project Investment Feasibility Analysis
Chapter Eleven: Manufacturing Cost Analysis
Chapter Twelve: Industrial Chain, Sourcing Strategy, and Downstream Buyers
Buy the Full Research report of Real-Time Analytics Market@: https://www.htfmarketreport.com/buy-now?format=1&report=3751925
Thanks for reading this article; you can also get individual chapter-wise sections or region-wise report versions like North America, LATAM, Europe, or Southeast Asia.
About Author:
HTF Market Report is a wholly owned brand of HTF Market Intelligence Consulting Private Limited. HTF Market Report is a global research and market intelligence consulting organization uniquely positioned to not only identify growth opportunities but also empower and inspire you to create visionary growth strategies for the future, enabled by our extraordinary depth and breadth of thought leadership, research, tools, events, and experience that help you turn goals into reality.
Contact Us:
Craig Francis (PR & Marketing Manager)
HTF Market Intelligence Consulting Private Limited
Phone: +1 434 322 0091
sales@htfmarketreport.com
Article originally posted on mongodb google news. Visit mongodb google news
ASP.NET Core in .NET 8 Preview 5: Improved Debugging, Blazor Updates, SignalR Reconnects, and More
MMS • Almir Vuk
Article originally posted on InfoQ. Visit InfoQ
The latest release of .NET 8 Preview 5 brings significant additions to ASP.NET Core. Notable enhancements include an improved debugging experience for ASP.NET Core, changes regarding the servers and middleware, the introduction of new features and improvements in Blazor, enhanced API authoring capabilities, seamless reconnect functionality in SignalR, and improvements and changes in authentication and authorization.
Regarding productivity, notable advancements have been made to enhance the debugging experience in ASP.NET Core. Specifically, developers will benefit from the introduction of debug customization attributes that facilitate the retrieval of crucial information related to types such as HttpContext, HttpRequest, HttpResponse, and ClaimsPrincipal within the Visual Studio debugger.
In the latest .NET 8 Preview 5, developers can experience early support for “seamless reconnects” in SignalR. This new feature aims to minimize downtime for clients facing temporary network disruptions, such as network switches or tunnel passages. By temporarily buffering data on both the server and client sides and acknowledging messages, it ensures a smoother user experience. Currently, this support is limited to .NET clients using WebSockets, and configuration options are not yet available. Developers can opt in to this feature and tweak it via options.UseAcks on HubConnectionBuilder. Upcoming previews are expected to introduce server-side configuration, customizable buffering settings, timeout limits, and expanded support for other transports and clients.
Blazor has also received a significant number of updates in .NET 8 Preview 5. The new Blazor Web App template is available through the command line and within Visual Studio, webcil is now the default packaging format when publishing a Blazor WebAssembly app, and Blazor WebAssembly no longer requires unsafe-eval to be enabled when specifying a Content Security Policy (CSP).
Also, the Blazor Router component now integrates with endpoint routing to handle both server-side and client-side routing. This integration allows for consistent routing to components regardless of whether server-side or client-side rendering is employed. The new Blazor Web App template includes sample pages, such as Index.razor and ShowData.razor, which utilize endpoint routing and streaming rendering for displaying weather forecast data, with enhanced navigation support expected in future .NET 8 previews.
Blazor Server introduces the ability to enable interactivity for individual components. With the new [RenderModeServer] attribute, developers can activate interactivity for specific components by utilizing the AddServerComponents extension method. This enhancement offers greater flexibility and control when building interactive applications with the Blazor Server rendering mode.
The comment section of the original release blog post has generated significant activity, with users engaging in numerous questions and discussions with the development team. Developers are encouraged to explore the comment section for further information and insights.
Generic attributes were introduced in C# 11, and regarding updates for API authoring, support has now been added for them, providing cleaner alternatives to attributes that previously relied on a System.Type parameter. Generic variants are now available for the following attributes: ProducesResponseType, Produces, MiddlewareFilter, ModelBinder, ModelMetadataType, ServiceFilter, and TypeFilter.
Authentication and authorization have also seen some changes: the ASP.NET Core React and Angular project templates have removed the dependency on Duende IdentityServer. Instead, these templates now utilize the default ASP.NET Core Identity UI and cookie authentication to handle authentication for individual user accounts. In addition, a new Roslyn analyzer is introduced in this preview to facilitate the adoption of a terser syntax using the AddAuthorizationBuilder API, where applicable.
Other notable changes fall in the servers and middleware area: the new IHttpSysRequestTimingFeature interface provides detailed timestamp data during request processing when utilizing the HTTP.sys server. Additionally, the ITlsHandshakeFeature interface now exposes the Server Name Indication (SNI) hostname information. The addition of the IExceptionHandler interface enables services to be resolved and invoked by the exception handler middleware, providing developers with a callback mechanism to handle known exceptions in a centralized location.
Furthermore, regarding Native AOT, the latest preview incorporates enhancements to minimal APIs generated at compile-time. These improvements include support for parameters adorned with the AsParameters attribute and the automatic inference of metadata for request and response types.
Lastly, developers are welcome to leave feedback and follow the progress of the ASP.NET Core in .NET 8 by visiting the official GitHub project repository.
Canonical Sunbeam Aims to Simplify Migrating from Small-Scale Legacy IT Solutions to OpenStack
MMS • Sergio De Simone
Article originally posted on InfoQ. Visit InfoQ
Canonical has announced a new open-source project to enable organizations to transition their small-scale proprietary IT solutions to OpenStack. Named Sunbeam, the project is free of charge and does not require an expensive professional services engagement, says Canonical.
Sunbeam’s goal is to enable the deployment of OpenStack in hybrid contexts where both Kubernetes and native nodes coexist. Thanks to its ability to run OpenStack inside containers, Sunbeam can be seen as a sort of OpenStack on Kubernetes on steroids. Sunbeam leverages Canonical Juju to orchestrate and manage nodes in multiclouds or hybrid clouds using charm operators.
Charms are the basic encapsulation unit for business logic, providing a wrapper around an application containing all the instructions for deploying, configuring, and scaling it. Sunbeam adopts native Kubernetes concepts such as StatefulSets and operators, thus making it possible to deploy and operate OpenStack in a way similar to other Kubernetes deployments.
Another basic tenet of Sunbeam is relation handlers, which mediate between charms and interfaces to create an intermediate abstraction that allows the charm to interact in a consistent way with a diverse range of interfaces. Relation handlers provide, for example, a ready property which tells the charm whether all data has been received at the interface. On the opposite end, container handlers mediate between charms and Pebble containers to enable configuring the container, restarting it, inspecting its running status, and so on.
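The relation-handler idea can be sketched in a few lines of plain Python (Juju charms are themselves written in Python); the class, method, and property names here are illustrative stand-ins, not Sunbeam's actual API:

```python
class RelationHandler:
    """Mediates between a charm and one interface behind a uniform surface.

    Illustrative sketch only; Sunbeam's real handlers live in its charm code.
    """

    def __init__(self, interface_name, required_keys):
        self.interface_name = interface_name
        self.required_keys = required_keys
        self.data = {}

    def update(self, relation_data):
        # Called when the interface delivers new data over the relation.
        self.data.update(relation_data)

    @property
    def ready(self):
        # The charm checks this instead of inspecting the raw interface:
        # True once every required key has arrived.
        return all(key in self.data for key in self.required_keys)


db = RelationHandler("database", required_keys=["host", "password"])
print(db.ready)            # no data received yet
db.update({"host": "10.0.0.5"})
db.update({"password": "s3cret"})
print(db.ready)            # all required data has arrived
```

The point of the abstraction is that the charm only ever asks `ready`, regardless of how many keys a particular interface needs or in what order they arrive.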
While Sunbeam is able to support OpenStack operation and management at any scale, from single nodes and small deployments on the edge to large clouds comprising thousands of hypervisors, Canonical is specifically targeting it at the initial phase of adopting OpenStack, such as when transitioning a small-scale legacy IT solution, as Canonical product manager Tytus Kurek confirms:
Sunbeam emerged to remove numerous barriers around the initial adoption of OpenStack and is just the first step towards an autonomous private cloud.
As a direct consequence of this vision, Sunbeam strives to provide a simple interface that is friendly to customers who are not OpenStack-savvy and who, Canonical claims, can bootstrap an OpenStack cloud in minutes.
At the moment, Sunbeam ships with the latest OpenStack version, 2023.1, but it only includes core OpenStack services, although Canonical says it will evolve quickly.
MMS • RSS
Posted on nosqlgooglealerts. Visit nosqlgooglealerts
Amazon DynamoDB is a popular NoSQL database service offered by Amazon Web Services (AWS). It provides low-latency and high-performance access to data without requiring the user to manage the underlying hardware and software infrastructure. This database service is highly scalable, and it can handle multiple terabytes of data with ease. In this article, we will explore the features, benefits, use cases, and performance advantages of Amazon DynamoDB.
Features of Amazon DynamoDB
Amazon DynamoDB has several key features that set it apart from other NoSQL databases:
- Scalability: DynamoDB automatically scales horizontally to handle more traffic and data without manual intervention. It can handle tables with hundreds of terabytes of data and billions of items.
- Availability: DynamoDB offers automatic failover and multi-region replication, making it a highly available database service. It can survive the failure of a data center or a server with minimal disruption to the system.
- Performance: DynamoDB is designed to provide low latency access to data, with predictable and consistent performance at any scale. It can handle millions of read and write requests per second without any degradation in performance.
- Security: DynamoDB provides several security features such as encryption at rest, encryption in transit, fine-grained access control using IAM and VPC, and auditing through CloudTrail.
- Flexibility: DynamoDB supports a wide range of data models, including document data, key-value pairs, and graph data. It also allows users to create secondary indexes and to define their own data access patterns.
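As a rough illustration of the key-value model described above, the boto3 sketch below writes and reads items in a hypothetical Users table keyed on user_id. It assumes boto3 is installed, AWS credentials are configured, and the table already exists, so the client calls are kept inside functions while the item-building helper stands alone.

```python
def make_item(user_id, profile):
    """Build a DynamoDB item: a partition key plus arbitrary document attributes.

    DynamoDB is schemaless beyond the key, so the profile dict can vary per item.
    """
    item = {"user_id": user_id}
    item.update(profile)
    return item

def put_user(table_name, user_id, profile, region="us-east-1"):
    # boto3 and AWS credentials are assumed; the table name is hypothetical.
    import boto3
    table = boto3.resource("dynamodb", region_name=region).Table(table_name)
    table.put_item(Item=make_item(user_id, profile))

def get_user(table_name, user_id, region="us-east-1"):
    import boto3
    table = boto3.resource("dynamodb", region_name=region).Table(table_name)
    # Direct key lookups like this are the low-latency path DynamoDB optimizes.
    return table.get_item(Key={"user_id": user_id}).get("Item")
```

A call such as `put_user("Users", "u1", {"name": "Ada", "plan": "pro"})` followed by `get_user("Users", "u1")` round-trips the item; no schema migration is needed when later items carry different attributes.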
Benefits of Amazon DynamoDB
Amazon DynamoDB offers several benefits to users who are looking for a highly scalable and available NoSQL database service:
- Seamless scalability: DynamoDB scales automatically to handle more traffic and data without the need for manual intervention. It can handle massive data sets and unpredictable traffic patterns without any downtime or degradation.
- High availability: DynamoDB offers automatic failover and multi-region replication, making it a highly available database service. It can survive the failure of a data center or a server with minimal disruption to the system.
- Low latency: DynamoDB is designed to provide fast and consistent access to data, with predictable performance at any scale. It can handle millions of read and write requests per second without any degradation in performance or throughput.
- Cost-effective: DynamoDB has a pay-per-use pricing model, which means that users only pay for the resources they consume. This makes it a cost-effective NoSQL database service compared to other options in the market.
- Developer-friendly: DynamoDB provides a simple and intuitive API that allows developers to access data using a variety of programming languages and tools. It also provides real-time metrics and tools for debugging and monitoring.
Use Cases for Amazon DynamoDB
Amazon DynamoDB is a versatile and powerful NoSQL database service that can be used for a wide range of use cases, including:
Web and mobile applications
DynamoDB is ideal for web and mobile applications that require low latency and high throughput. It can handle millions of concurrent users and high volumes of data with ease. Applications that require real-time updating of user-generated content, such as social networks, gaming platforms, and chat applications, are a good fit for DynamoDB.
Internet of Things (IoT) applications
DynamoDB can be used for IoT applications that generate massive amounts of data streams. It can handle real-time data ingestion and processing, as well as historical data storage and analytics. Applications such as smart homes, connected cars, and industrial equipment monitoring can benefit from the scalability and availability of DynamoDB.
eCommerce applications
DynamoDB is well-suited for eCommerce applications that require high availability and low latency. It can handle high volumes of transactions and real-time inventory management. Online marketplaces, retail websites, and booking platforms are examples of eCommerce applications that can benefit from the simplicity and scalability of DynamoDB.
Gaming applications
DynamoDB is an excellent choice for gaming applications that require real-time user interactions, high performance, and low latency. It can handle massive player profiles, real-time game states, and transactional updates. Multiplayer games, mobile games, and online gaming platforms are examples of gaming applications that can benefit from the performance and scalability of DynamoDB.
Adtech applications
DynamoDB can be used for adtech applications that require real-time bidding, campaign management, and user targeting. It can handle billions of ad impressions and clicks per day, as well as extensive user profiling and segmentation. Advertising networks, demand-side and supply-side platforms, and digital agencies are examples of adtech applications that can benefit from the scalability and performance of DynamoDB.
Performance Advantages of Amazon DynamoDB
Amazon DynamoDB provides several performance advantages over other NoSQL database services:
Consistent and predictable performance
DynamoDB is designed to provide consistent and predictable performance, regardless of the scale or type of workload. It can handle up to 20 million reads and 10 million writes per second per table, with an average latency of less than 10 milliseconds. This makes it an ideal choice for applications that require low-latency and high-throughput access to data.
Automatic scaling
DynamoDB scales automatically to handle changes in traffic and data volume. It can scale up or down in response to changes in demand, without any manual intervention. This makes it a highly scalable and efficient database service that can handle unpredictable workloads without any downtime or performance degradation.
Highly available and durable
DynamoDB offers automatic failover and multi-region replication, making it a highly available and durable database service. It can survive the failure of a data center or a server with minimal disruption to the system. It also provides backup and restore capabilities, enabling users to recover data in case of accidental deletion or corruption.
Flexible data modeling
DynamoDB supports multiple data models, including document data, key-value pairs, and graph data. It also allows users to create secondary indexes and to define their own data access patterns. This makes it a flexible and versatile database service that can handle a wide range of data structures and use cases.
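A hedged boto3 sketch of the secondary-index idea: querying a hypothetical customer-index GSI retrieves all items for one customer without touching the table's primary key, paginating until DynamoDB stops returning a LastEvaluatedKey. The table, index, and attribute names are assumptions for illustration.

```python
def collect_pages(pages):
    """Flatten the Items lists from a sequence of paginated query responses."""
    items = []
    for page in pages:
        items.extend(page.get("Items", []))
    return items

def orders_by_customer(table_name, customer_id, region="us-east-1"):
    # boto3 and AWS credentials are assumed; index/attribute names are hypothetical.
    import boto3
    from boto3.dynamodb.conditions import Key
    table = boto3.resource("dynamodb", region_name=region).Table(table_name)
    pages = []
    kwargs = {
        "IndexName": "customer-index",  # assumed GSI on customer_id
        "KeyConditionExpression": Key("customer_id").eq(customer_id),
    }
    while True:
        page = table.query(**kwargs)
        pages.append(page)
        last = page.get("LastEvaluatedKey")
        if not last:
            break
        kwargs["ExclusiveStartKey"] = last  # resume where pagination stopped
    return collect_pages(pages)
```

The design choice a GSI encodes is exactly the "define your own data access patterns" point above: the same items become queryable by customer rather than by the table's own key, at the cost of maintaining the index.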
Amazon DynamoDB is a highly scalable, available, and performant NoSQL database service provided by Amazon Web Services. It offers several key features, benefits, and performance advantages that make it an ideal choice for web and mobile applications, IoT applications, eCommerce applications, gaming applications, and adtech applications. With its automatic scaling, consistent and predictable performance, and flexible data modeling, DynamoDB provides a reliable and cost-effective alternative to traditional relational databases and other NoSQL database services.
MMS • RSS
Posted on mongodb google news. Visit mongodb google news
MongoDB (NASDAQ:MDB), a general-purpose database company that also offers consulting and training, has been extremely buzzworthy recently. Analysts have continued to give the company a Strong Buy rating due to potential new clients in need of database infrastructure in the rapidly growing AI industry, despite an overall average price target of $367.37 indicating a 3.30% downside. In the past 3 months, 20 analysts have rated MDB, and it has received 17 Buy, 2 Hold, and 1 Sell ratings.
Reasons Analysts are Recommending MongoDB
There are a multitude of reasons why analysts are recommending MongoDB despite the current implied downside. AI is growing unlike ever before. According to Grand View Research, between 2023 and 2030 the AI Market is expected to grow at a CAGR of 37.3%, ultimately reaching $1.811 trillion in 2030. The rise of AI platforms has created a dire need for better software capabilities, which is where MongoDB is able to step in and provide server solutions. MongoDB has various products including Atlas, Enterprise Advanced, and Community Edition to provide server solutions to all customers and adapt to their AI needs.
For example, Concured, an AI startup focused on content personalization, has decided to use MongoDB as its database when launching new products and platforms. The MongoDB Atlas Serverless platform that MDB sells allows customers to confidently grow their businesses and take on the tech industry. There are many more opportunities similar to Concured for MongoDB to expand, causing investors to see high potential for growth despite the downside from price targets.
MongoDB vs. Competitors
After recently reporting earnings on June 1, 2023, MongoDB’s EPS soared to $0.56, which was significantly above the forecasted EPS of $0.18. Additionally, quarterly revenue came in at $368.28M for April 2023, up from the previously reported $361.31M for January 2023, showing continued growth. The company has also seen a yearly gain of 52.11%, which is significantly higher than the growth of competitors CrowdStrike (CRWD) (-5.34%), Alteryx (AYX) (-3.07%), Okta (OKTA) (-8.37%), and Salesforce (CRM) (29.71%). Furthermore, MongoDB’s recent earnings report ultimately gives analysts an indication of a positive future ahead despite the implied downside.
Conclusion
Overall, MongoDB is showing signs of positive returns despite the slight implied downside. In regard to blogger sentiment, MDB sentiment is currently at 84%, while the sector average is only 67%, meaning sentiment on the stock is more bullish than the sector average. Additionally, according to TipRanks’ technical analysis, MongoDB is considered a Buy. Moreover, this stock is continuously favored by analysts and has a high potential for future growth.
Article originally posted on mongodb google news. Visit mongodb google news
MMS • Matt Campbell
Article originally posted on InfoQ. Visit InfoQ
GitHub has moved push protection into general availability and made it free for all public repositories. Push protection helps detect secrets in code as changes are pushed. As part of the GA release, push protection is also available to all private repositories with a GitHub Advanced Security (GHAS) license.
If code is pushed that contains a secret, push protection will trigger a prompt indicating the secret type, location, and steps to remediate. These prompts occur inline with the developer experience, either in the IDE or CLI. According to Zain Malik, Senior Product Marketing Manager at GitHub, and Mariam Sulakian, Product Manager at GitHub, “push protection only blocks secrets with low false positive rates.” A full list of secrets supported by push protection is available within the GitHub docs.
Push protection can be bypassed if needed by providing a reason. The options presented include marking the secret as needed for a test, marking it as a false positive, and marking it to be fixed later. Bypassing push protection will automatically trigger an email alert to repository and organization administrators as well as defined security managers. All bypasses can be reviewed via audit logs, the alert view UI, the REST API, or via webhook events. If marked as “fix later”, an open security alert is created. In all other cases, a closed and resolved security alert is created.
Push protection can be enabled via the Code security and analysis settings. It is possible to have push protection enabled automatically for all new public and GHAS-enabled private repositories. A custom resource link can also be specified that will appear in the CLI and web UI when push protection blocks a commit.
Custom patterns can be defined for push protection to scan for and block. It is recommended to first test custom patterns using the built-in dry-run feature before publishing and enabling the pattern. The pattern is specified as a regular expression.
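Conceptually, a custom pattern is just a regular expression evaluated against pushed content; the sketch below mimics what a dry run surfaces for a made-up token format (the acme_ prefix is purely illustrative, not a real secret type supported by GitHub).

```python
import re

# Hypothetical internal token format: "acme_" followed by 32 hex characters.
CUSTOM_PATTERN = re.compile(r"\bacme_[0-9a-f]{32}\b")

def scan(blob):
    """Return (line_number, match) pairs, roughly what a dry run would surface."""
    hits = []
    for lineno, line in enumerate(blob.splitlines(), start=1):
        for match in CUSTOM_PATTERN.finditer(line):
            hits.append((lineno, match.group()))
    return hits

diff = "config = {\n  'token': 'acme_" + "0" * 32 + "'\n}\n"
print(scan(diff))
```

The dry-run step matters because an over-broad expression here would block legitimate pushes; testing against existing repository content before enabling the pattern surfaces those false positives first.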
User greysteil noted on Hacker News that they worked on this feature while at GitHub. They shared that:
This release is a repo-level setting, which is nice, but it will be even more useful when the team releases a user-level setting in June/July. That will allow you to configure GitHub to (softly) prevent you from pushing any easily identifiable secrets to any public repo. The plan is for it to be on by default.
They continued by sharing that approximately 200 new GitHub personal access tokens (PAT) are exposed in public repositories daily. User darthbanane raised a concern that if the scanner detects a secret then that implies that the secret has already left the user’s machine and has traversed the internet. User awesome_dude replied that:
The scanner has seen the credentials, yes, and it’s then up to the individual to decide if that credential should be considered “compromised” or not (seeing as the GitHub scanner has seen that credential).
In response to a query about how GitHub is performing the scan, greysteil noted that “it’s a bespoke scanning setup designed to deal with GitHub’s scale, minimise false positives, and scan fast enough to be in the `git push` request/response cycle.” They continued by sharing that it is leveraging Intel’s Hyperscan as the regex engine.
GitHub push protection is available free of charge to all public repositories. It is available for use in private repositories as part of GitHub Advanced Security.
MMS • RSS
Posted on nosqlgooglealerts. Visit nosqlgooglealerts
As businesses increasingly rely on data to drive their decision-making, there has been a surge in data-intensive applications. These applications require robust, scalable databases that can efficiently manage large quantities of data, support highly available and performant workloads, and provide real-time analysis and insights. AWS provides a suite of cloud-based database solutions that offer organizations of all sizes the flexibility, scalability, and reliability required to manage their data effectively.
AWS Database Options
AWS offers a variety of database solutions, which can be classified into two categories: Relational Database Services (RDS) and NoSQL databases.
Relational Database Services (RDS)
RDS is a cloud-based relational database service that makes it easy to set up, operate, and scale a relational database. RDS automates time-consuming administrative tasks such as maintenance, backups, and patching, and supports multiple relational database engines, including MySQL, PostgreSQL, Microsoft SQL Server, Oracle, and Aurora. RDS provides the following benefits:
Ease of Use and Management: RDS simplifies the setup of a relational database by automating the deployment, scaling and maintenance of the infrastructure. RDS takes care of all administrative tasks such as patching, backing up, and monitoring.
Scalability: With RDS, you can scale vertically or horizontally, depending on your needs. Vertical scaling adds more compute and memory resources to an instance, whereas horizontal scaling adds read replicas to spread load across more nodes.
High Availability: RDS provides multiple features, including automated backups, snapshots, and replicas. These enable you to recover data easily in the event of a disaster.
NoSQL Databases
NoSQL databases provide a flexible and scalable solution for handling huge amounts of structured and unstructured data. They are often the preferred choice for applications such as e-commerce, content management systems, social media sites, and IoT. AWS provides two databases that fit under this category: Amazon DynamoDB and Amazon DocumentDB.
Amazon DynamoDB: Amazon DynamoDB is a fully managed, highly scalable, and secure NoSQL database that supports both document and key-value data models. DynamoDB is designed to deliver fast, predictable performance for applications that need to handle large volumes of data and high request throughput. DynamoDB provides the following benefits:
• Security – DynamoDB is secure by default, providing automatic data encryption at rest and in transit.
• Scalability – DynamoDB is highly scalable and can handle millions of requests per second.
• Performance – DynamoDB offers predictable sub-millisecond latency for reads and writes.
• Integration – DynamoDB has built-in integrations with some of the most popular AWS services, such as AWS Lambda.
Amazon DocumentDB: Amazon DocumentDB is a fully-managed, highly available, and scalable document database. DocumentDB is compatible with MongoDB, which makes it an ideal choice for migrating an existing MongoDB application to the cloud. DocumentDB provides the following benefits:
• Highly Scalable – DocumentDB can horizontally scale up to petabytes of data and millions of reads and writes per second.
• Fully Managed – AWS manages the operational aspects of DocumentDB such as patching, backups, and failover.
• Compatibility with MongoDB – DocumentDB is fully compatible with MongoDB, including the query language, indexes, and drivers. This makes the migration of MongoDB-based applications to DocumentDB a seamless process.
Use Cases for AWS Databases
Organizations choose different databases depending on their specific needs and applications. Below are some common use cases of AWS databases:
Content Management Systems and eCommerce: The ultimate goal of content management systems and eCommerce sites is to provide a personalized experience to the user. For that reason, they require databases that can scale to meet the demands of a growing user base. A good database solution for these applications is Amazon Aurora, which is specifically designed for high availability and performance.
Online Gaming: Online gaming applications require databases that are highly available, fast, and scalable. These requirements can be met by using ElastiCache and DynamoDB, which are designed for high throughput and low latency.
Analytics and Big Data: Applications that require big data processing capabilities, such as data warehousing and business intelligence, need databases that can scale easily to handle massive amounts of data. Amazon Redshift provides extremely fast data ingestion and querying for large data sets, hence it’s a popular choice for big data processing use cases.
In conclusion, AWS has plenty of database services that are ideal for various use cases. AWS offers fully-managed, flexible and reliable database solutions that enable organizations to manage their data effectively. With AWS, you can easily deploy relational and non-relational databases that deliver the performance, security, scalability, and availability you need. With the variety of database solutions offered by AWS, organizations of all sizes and tech know-how, can choose the best database service for their specific needs and requirements.
2 Artificial Intelligence (AI) Growth Stocks That Are Just Getting Started – Droid Gazzete
MMS • RSS
Posted on mongodb google news. Visit mongodb google news
Looking back, 2023 may well mark a turning point for artificial intelligence (AI). The latest developments in large language models have given birth to next-generation technology, including OpenAI’s ChatGPT — but that’s likely just the beginning. The capabilities of these AI systems have sparked imaginations, and the brightest minds in business are looking for ways to employ this groundbreaking technology to automate repetitive tasks, improve customer service, and create new opportunities.
The ongoing AI revolution has investors searching high and low to profit from the massive potential afforded by this state-of-the-art technology. While estimates vary wildly, one of the most bullish forecasts comes from Cathie Wood’s Ark Investment Management, which suggests that the global AI software market will grow at 42% annually, topping $14 trillion by 2030. Even if this estimate turns out to be overly bullish, it helps show that the market for AI-enabled software could grow at a blistering pace for years to come.
Let’s look at two high-flying stocks that are well positioned to benefit from the AI revolution.
1. HubSpot
HubSpot (HUBS -0.14%) made its fortune by disrupting traditional advertising. It pioneered the concept of inbound marketing, which builds relationships with potential customers through compelling content offered online, via social media, and in blog posts.
The company has since expanded its empire to encompass the entire spectrum of customer relationship management (CRM), with a vast ecosystem of interconnected offerings. These include solutions for marketing, sales, service, content management, and operations teams, with tools that help to manage data, commerce, reporting, automation, content, messaging, and payments.
CEO Yamini Rangan laid out the case for what the latest advances in AI mean to HubSpot and its customers in the company’s first-quarter earnings call, saying, “HubSpot is [a] powerful, yet easy to use … all-in-one CRM platform powered by AI.” She noted that the company is integrating generative AI across its offerings, going on to say that the company is differentiated by its “unique data and broad distribution.”
“HubSpot CRM data is unified and cohesive, making it easier for AI to ingest and drive relevance,” Rangan said. Finally, the chief executive pointed out that HubSpot customers “don’t have to become AI experts to reap the transformational benefits” available on its platform.
HubSpot’s first-quarter results provide a glimpse of its potential. Even in the middle of a challenging economy, revenue grew 27% year over year, while adjusted earnings per share (EPS) of $1.25 more than doubled. The results were fueled by solid customer gains of 23%. Perhaps more important are the company’s expanding relationships with existing customers: 45% of its annual recurring revenue is generated by clients using three or more hubs.
The stock is currently selling for 10 times next year’s sales, so it isn’t cheap in terms of traditional valuation measures. That said, in less than nine years, HubSpot stock has gained more than 1,600% — and is still well off its peak. Given its history of strong growth, its valuation seems much more reasonable.
2. MongoDB
MongoDB (MDB -1.43%) made a name for itself by disrupting the traditional database paradigm. While most databases are limited to rows and columns, MongoDB’s cloud-native Atlas platform can handle those and much more, including video and audio files, social media posts, and even entire documents. This gives developers much greater flexibility to create robust software applications.
When announcing the company’s first quarter of fiscal 2024 results, CEO Dev Ittycheria explained what the shift to AI means to MongoDB, saying: “We believe the recent breakthroughs in AI represent the next frontier of software development. The move to embed AI in applications requires a broad and sophisticated set of capabilities, while enabling developers to move even faster to create a competitive advantage.” He went on to say the company was “well positioned to benefit from the next wave of AI applications in the years to come.”
MongoDB’s results from the first quarter of fiscal 2024 help tell the tale. Revenue of $368 million grew 29% year over year — even in the face of economic headwinds — while its adjusted EPS of $0.56 soared 180%. Fueling the results was the most net new customer additions in more than two years. The results were led by Atlas, the company’s fully managed database-as-a-service platform, which grew 40% year over year and now makes up 65% of MongoDB’s total revenue.
The stock might seem expensive at 14 times next year’s sales, but consider this: In just over five years, MongoDB stock has gained more than 1,000% — even after its downturn-induced drubbing — so its valuation shouldn’t be viewed in a vacuum.
As new customers seek out platforms offering the greatest capacity to build and run new AI applications, MongoDB’s Atlas is a top choice.
MMS • RSS
Posted on nosqlgooglealerts. Visit nosqlgooglealerts
AWS DynamoDB is a NoSQL database service that offers scalability and high performance for applications that require low-latency data access. The service was launched in 2012 and competes with other NoSQL databases such as MongoDB and Cassandra. DynamoDB is designed to handle large volumes of data and can scale horizontally without any downtime. It is used by many popular companies, including Netflix, Airbnb, and Lyft.
Features and Benefits
DynamoDB offers several features that make it a powerful tool when it comes to managing large volumes of data. These features include:
- Scalability: DynamoDB can handle millions of requests per second and can scale horizontally without any downtime. This means that as your application grows, you can easily add more capacity to handle the increased load.
- Availability: DynamoDB offers high availability with automatic failover and multi-region replication. This means that even if a region goes down, your application can continue to operate without any downtime.
- Performance: DynamoDB offers low-latency data access, which makes it an excellent fit for applications that require high performance.
- Security: DynamoDB integrates with other AWS services such as IAM and KMS to provide advanced security features such as encryption at rest and in transit.
- Flexibility: DynamoDB offers a flexible data model that allows you to store structured, semi-structured, and unstructured data.
Data Model
DynamoDB has a flexible data model that allows you to store structured, semi-structured, and unstructured data. In DynamoDB, data is organized into tables, which consist of items and attributes. An item is a set of attributes representing a single record, while an attribute is a key-value pair representing a specific data element.
Each table in DynamoDB must have a primary key, which can either be a partition key or a composite key (partition key and sort key). Partition keys are used to partition the data across multiple nodes in the DynamoDB cluster, while sort keys are used to sort the items within a partition.
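The role of the two key parts can be sketched with a toy in-memory model: the partition key is hashed to pick a storage node, and items within a partition are ordered by the sort key. This is an illustrative sketch only; the key names and helper functions below are hypothetical, and real partitioning is managed entirely by the service:

```python
import hashlib
from collections import defaultdict

NUM_NODES = 4  # pretend cluster size for illustration

def node_for(partition_key: str) -> int:
    # A hash of the partition key decides which node stores the item.
    digest = hashlib.md5(partition_key.encode()).hexdigest()
    return int(digest, 16) % NUM_NODES

# partition_key -> {sort_key: item}
table = defaultdict(dict)

def put_item(pk: str, sk: str, attrs: dict) -> None:
    table[pk][sk] = {"pk": pk, "sk": sk, **attrs}

def items_in_partition(pk: str) -> list[dict]:
    # Items within a partition come back in sort-key order.
    return [table[pk][sk] for sk in sorted(table[pk])]

put_item("user#1", "order#2023-06-15", {"total": 25})
put_item("user#1", "order#2023-06-01", {"total": 40})
# All of user#1's items hash to the same node and sort by order date.
```

The takeaway is that all items sharing a partition key live together, which is why a well-chosen partition key spreads traffic evenly across the table.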
Querying Data
DynamoDB offers two ways of querying data: Query and Scan.
- Query: Query allows you to retrieve data by specifying a partition key and an optional sort key condition. Query returns the subset of items that match the specified key condition expression. Query is a more efficient way of retrieving data than Scan, as it reads only the items that match the key condition expression.
- Scan: Scan allows you to retrieve all the items in a table or a subset of items by specifying a filter expression. Scan reads all the items in a table or a subset and applies the filter expression to return only the items that match the expression. Scan can be inefficient when dealing with large amounts of data since it reads the entire table or a subset of items.
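The cost difference between the two can be illustrated with a small pure-Python sketch (not the DynamoDB API itself; the item shapes and helper names are made up for the example). Query touches only one partition, while Scan reads every item and filters afterwards:

```python
# Toy table: three items across two partitions.
items = [
    {"pk": "user#1", "sk": "order#1", "total": 40},
    {"pk": "user#1", "sk": "order#2", "total": 25},
    {"pk": "user#2", "sk": "order#1", "total": 90},
]

def query(pk: str) -> list[dict]:
    # Query: only items in the matching partition are read.
    return [it for it in items if it["pk"] == pk]

def scan(filter_fn) -> list[dict]:
    # Scan: every item is read; the filter is applied afterwards.
    return [it for it in items if filter_fn(it)]

print(len(query("user#1")))                      # 2
print(len(scan(lambda it: it["total"] > 30)))    # 2
```

Note that both calls return two items here, but Scan still read all three; on a table with millions of items that difference dominates cost and latency.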
DynamoDB Streams
DynamoDB Streams is a feature that captures item-level changes to a table in near-real time as an ordered sequence of events. Streams can be used to trigger actions such as sending notifications or updating other tables in real time.
Streams can also be used to replicate data across multiple tables or regions. By consuming the stream and writing the updates to other tables, you can ensure that data is replicated across multiple regions or tables in real-time.
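The replication pattern can be sketched as follows. This is a minimal illustration assuming a simplified event shape (`event`, `key`, `new_image`); real stream records carry more metadata, and the consumer would typically be a Lambda trigger rather than an in-process function:

```python
primary: dict = {}
replica: dict = {}
stream: list[dict] = []  # ordered sequence of change events

def put_item(key: str, value: dict) -> None:
    # Every write to the primary table emits a stream event.
    primary[key] = value
    stream.append({"event": "INSERT", "key": key, "new_image": value})

def replicate(events: list[dict]) -> None:
    # A consumer applies each event, in order, to the replica table.
    for ev in events:
        if ev["event"] == "INSERT":
            replica[ev["key"]] = ev["new_image"]

put_item("user#1", {"name": "Ada"})
replicate(stream)
# The replica now mirrors the primary table.
```

Because events arrive in order per item, applying them sequentially keeps the replica consistent with the source table.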
DynamoDB Accelerator (DAX)
DynamoDB Accelerator (DAX) is a caching service that sits between your application and DynamoDB to improve the performance of read-intensive applications. DAX caches frequently accessed data in memory, which reduces the number of requests that need to be sent to DynamoDB. This results in lower latency and better application performance.
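The read-through behavior DAX provides can be sketched in a few lines. This is an illustrative stand-in, not the DAX client: the dictionary plays the role of the table, and the `misses` counter shows how the cache absorbs repeated reads:

```python
table = {"user#1": {"name": "Ada"}}  # stands in for the DynamoDB table
cache: dict = {}
misses = 0  # counts how many reads fell through to the table

def get_item(key: str):
    global misses
    if key in cache:
        return cache[key]       # served from memory, no table read
    misses += 1
    value = table.get(key)      # fall back to the table on a miss
    if value is not None:
        cache[key] = value      # populate the cache for next time
    return value

get_item("user#1")  # miss: reads the table and fills the cache
get_item("user#1")  # hit: served from the cache
```

After the two calls only one table read has occurred; in a real deployment this is what turns repeated single-digit-millisecond reads into microsecond cache hits.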
AWS DynamoDB is a powerful and flexible NoSQL database service that provides scalability, high availability, and low-latency data access. Its ability to store structured, semi-structured, and unstructured data, combined with its scalability, enables you to build applications that can handle large volumes of data with ease. With features such as DynamoDB Streams and DynamoDB Accelerator (DAX), DynamoDB is well-suited for real-time, read-intensive applications. If you’re looking for a NoSQL database that can handle large volumes of data and provides high performance, then AWS DynamoDB is definitely worth considering.