Mobile Monitoring Solutions

DigitalOcean Launches Managed MongoDB

MMS Founder
MMS RSS

Posted on mongodb google news. Visit mongodb google news

New offering combines DigitalOcean’s signature simplicity with the technical capabilities of MongoDB’s popular database service

DigitalOcean Holdings, Inc., the cloud for developers, startups and SMBs, today announced DigitalOcean Managed MongoDB, a new, fully managed database-as-a-service (DBaaS) offering in partnership with MongoDB, Inc., the leading, modern general purpose database platform. Launched at deploy, DigitalOcean’s virtual conference, Managed MongoDB helps developers of all skill levels easily spin up MongoDB clusters on DigitalOcean. It also simplifies database administration by seamlessly managing, scaling, and securing clusters so developers can spend more time building their apps and growing their businesses.

“DigitalOcean’s managed services are a key way we help eliminate the complexity of the cloud so customers can easily turn their ideas into powerful applications that help change the world,” said Jeff Guy, Chief Operating Officer, DigitalOcean. “Our partnership with MongoDB expands our portfolio of highly curated, managed offerings tailored specifically to the developer, startup and SMB markets. By adding support for MongoDB, we can extend the benefits of this powerful technology to our customers around the world — even those without database experience or expertise.”

Expanding on DigitalOcean’s Managed Databases portfolio, Managed MongoDB automates the experience of managing databases in the cloud. Customers get access to the latest releases of the MongoDB document database and benefit from key features like built-in security and point-in-time recovery, which allows customers to restore to any point within a seven-day backup history window. The solution also provides high availability with automated failovers and gives users the ability to scale up on demand to handle traffic spikes. Additionally, Managed MongoDB follows DigitalOcean’s simple, predictable pricing model to help developers, startups and SMBs confidently scale their production applications.

While databases are a critical component of any application, building, maintaining, and scaling them can be complex and time consuming. Managed MongoDB helps simplify the end-to-end management of MongoDB, the world’s most popular document database. As a result, customers can get up and running quickly without having to worry about the setup or administration of their database.
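As a rough illustration of what that looks like from the application side, connecting to a managed cluster typically amounts to a single connection string. The sketch below uses pymongo with a placeholder URI; the hostname, credentials and database name are hypothetical, not a real DigitalOcean endpoint.

```python
from pymongo import MongoClient

# Placeholder connection string -- a real DigitalOcean Managed MongoDB cluster
# supplies its own TLS-enabled URI in the control panel.
URI = "mongodb+srv://doadmin:<password>@<cluster-host>/admin?tls=true&authSource=admin"

client = MongoClient(URI)
db = client["app"]                      # hypothetical application database
db.events.insert_one({"type": "signup", "user": "alice"})
print(db.events.count_documents({"type": "signup"}))
```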

“DigitalOcean and MongoDB are both focused on driving developer productivity and helping more businesses run in the cloud,” said Alan Chhabra, Senior Vice President, Worldwide Partners, MongoDB.  “As part of our commitment to give developers the freedom to run anywhere, we’re excited to partner with DigitalOcean to enable customers to use MongoDB to transform their applications and grow their businesses in even more regions than previously available.”

“DigitalOcean’s certified offering provides a simple, managed way to run MongoDB in the cloud that customers can grow with over time,” said Carl Olofson, Research Vice President, Data Management Software, IDC. “Because this solution was developed in partnership with MongoDB, users benefit from the full power of the database solution while realizing the cost and ease-of-use benefits of DigitalOcean.”

Managed MongoDB is available in beta now in DigitalOcean’s New York, Frankfurt and Amsterdam data center regions. Read more about pricing here: https://www.digitalocean.com/pricing#managed-databases.

About DigitalOcean

DigitalOcean simplifies cloud computing so developers and businesses can spend more time building software that changes the world. With its mission-critical infrastructure and fully managed offerings, DigitalOcean helps developers, startups and small and medium-sized businesses (SMBs) rapidly build, deploy and scale applications to accelerate innovation and increase productivity and agility. DigitalOcean combines the power of simplicity, community, open source, and customer support so customers can spend less time managing their infrastructure and more time building innovative applications that drive business growth. For more information, visit digitalocean.com or follow @digitalocean on Twitter.

Article originally posted on mongodb google news. Visit mongodb google news



NoSQL Software Market Size, Trends, Growth, Scope, Overall Analysis and Forecast by 2028

MMS Founder
MMS RSS

Posted on nosqlgooglealerts. Visit nosqlgooglealerts

New Jersey, United States – The NoSQL Software Market report is a research study of the market along with an analysis of its key segments. The report is created through extensive primary and secondary research. Informative market data is generated through interviews and data surveys conducted with experts and industry specialists. The study is a comprehensive document covering key aspects of the market, including trends, segmentation, growth prospects, opportunities, challenges, and competitive analysis.

The report will be updated with the impact of the current evolving COVID-19 pandemic. The pandemic has had a dynamic impact on key market segments, changing the growth pattern and demand in the NoSQL Software market. The report includes an in-depth analysis of these changes and provides an accurate estimate of the market growth as a result of the effects of the pandemic.

The report provides a comprehensive overview of the competitive landscape along with an in-depth analysis of the company profiles, product portfolio, sales, and gross margin estimates, as well as market size and share. Additionally, the report examines the companies’ strategic initiatives to expand their customer base, market size, and generate revenue. In addition, important industry trends, as well as sales and distribution channels, are assessed.

The report covers extensive analysis of the key market players in the market, along with their business overview, expansion plans, and strategies. The key players studied in the report include:

The report offers a comprehensive analysis of the NoSQL Software market insights along with a detailed analysis of the market segments and sub-segments. The report includes sales and revenue analysis of the NoSQL Software industry. In addition, it includes a detailed study of market drivers, growth prospects, market trends, research and development progress, product portfolio and market dynamics.

NoSQL Software Market Segmentation

NoSQL Software Market Report Scope 

  • Market size available for years: 2021 – 2028
  • Base year considered: 2021
  • Historical data: 2015 – 2020
  • Forecast period: 2021 – 2028
  • Quantitative units: Revenue in USD million and CAGR from 2021 to 2028
  • Segments covered: Types, Applications, End-Users, and more
  • Report coverage: Revenue forecast, company ranking, competitive landscape, growth factors, and trends
  • Regional scope: North America, Europe, Asia Pacific, Latin America, Middle East and Africa
  • Customization scope: Free report customization (equivalent to up to 8 analyst working days) with purchase; addition or alteration to country, regional and segment scope
  • Pricing and purchase options: Customized purchase options available to meet your exact research needs

Geographical Analysis of the NoSQL Software Market:

The latest Business Intelligence report analyzes the NoSQL Software market in terms of market size and consumer base in major market regions. The NoSQL Software market can be divided into North America, Asia Pacific, Europe, Latin America, Middle East and Africa based on geography. This section of the report carefully assesses the presence of the NoSQL Software market in key regions. It determines the market share, the market size, the sales contribution, the distribution network and the distribution channels of each regional segment.

Geographic Segment Covered in the Report:

 • North America (USA and Canada)
 • Europe (UK, Germany, France and the rest of Europe)
 • Asia Pacific (China, Japan, India, and the rest of the Asia Pacific region)
 • Latin America (Brazil, Mexico, and the rest of Latin America)
 • Middle East and Africa (GCC and rest of the Middle East and Africa)

Summary of the Report:

  • The report offers a comprehensive assessment of the NoSQL Software market including recent and emerging industry trends.
  • In-depth qualitative and quantitative market analysis to provide accurate industry insight to help readers and investors capitalize on current and emerging market opportunities
  • Comprehensive analysis of the product portfolio, application line and end users to provide readers with an in-depth understanding.
  • In-depth profiling of key industry players and their expansion strategies.

Visualize the NoSQL Software Market using Verified Market Intelligence:

Verified Market Intelligence is our BI-enabled platform for narrative storytelling of this market. VMI offers in-depth forecasted trends and accurate insights on 20,000+ emerging and niche markets, helping you make critical revenue-impacting decisions for a brilliant future.

VMI provides a holistic overview and global competitive landscape with respect to region, country, segment, and the key players of your market. Present your market report and findings with an inbuilt presentation feature, saving over 70% of your time and resources for investor, sales and marketing, R&D, and product development pitches. VMI enables data delivery in Excel and interactive PDF formats with 15+ key market indicators for your market.

About Us: Verified Market Research™

Verified Market Research™ is a leading global research and consulting firm that has been providing advanced analytical research solutions, custom consulting and in-depth data analysis for 10+ years to individuals and companies looking for accurate, reliable and up-to-date research data and technical consulting. We offer insights into strategic and growth analyses, the data necessary to achieve corporate goals, and help in making critical revenue decisions.

Our research studies help our clients make superior data-driven decisions, understand market forecasts, capitalize on future opportunities and optimize efficiency by working as their partner to deliver accurate and valuable information. The industries we cover span a large spectrum, including technology, chemicals, manufacturing, energy, food and beverages, automotive, robotics, packaging, construction, and mining & gas.

We at Verified Market Research help clients understand holistic market-indicating factors as well as current and future market trends. Our analysts, with their deep expertise in data gathering and governance, use industry techniques to collate and examine data at all stages. They are trained to combine modern data collection techniques, superior research methodology, subject expertise and years of collective experience to produce informative and accurate research.

Having served 5,000+ clients, we have provided reliable market research services to more than 100 Global Fortune 500 companies such as Amazon, Dell, IBM, Shell, Exxon Mobil, General Electric, Siemens, Microsoft, Sony and Hitachi. We have also co-consulted with some of the world’s leading consulting firms, such as McKinsey & Company, Boston Consulting Group, and Bain and Company, on custom research and consulting projects for businesses worldwide.

Contact us:

Mr. Edwyne Fernandes

Verified Market Research™

US: +1 (650)-781-4080
UK: +44 (753)-715-0008
APAC: +61 (488)-85-9400
US Toll-Free: +1 (800)-782-1768

Email: [email protected]

Website: https://www.verifiedmarketresearch.com/



What is Data Literacy and How is it Playing a Vital Role in Today’s World?

MMS Founder
MMS RSS

Article originally posted on Data Science Central. Visit Data Science Central

What literacy was for the past century is what data literacy is for the twenty-first century. Most employers now prefer people with demonstrated data abilities over those with higher education, even data science degrees. According to one report, only 21% of businesses in the United States consider a degree when hiring for any position, compared to 64% who look for applicants who can demonstrate their data skills. When data is viewed as a company’s backbone, it is critical that corporations help their staff use data properly.

What is Data Literacy?
The capacity to understand, work with, analyze, and communicate with data is known as data literacy. It is a skill that requires workers at all levels to ask the right questions of data and machines, create knowledge, make decisions, and communicate meaning to others. It isn’t only about comprehending data: to be data literate, you must also have the confidence to challenge evidence that isn’t behaving as it should. Literacy aids the analysis process by allowing the human element of critique to be considered. Organizations are looking for data literacy not only in data and analytics professions but in all occupations. Companies that rigorously invest in data literacy programs will outperform those that don’t.

Why is it Important?
There are various components to achieving data literacy. Tools and technology are important, but employees must also learn how to think about data to understand when it is valuable and when it is not. When employees interact with data, they should be able to view it, manipulate it, and share the results with their colleagues. Many people reach for Excel because it is a familiar tool, but confining data to a desktop application is restrictive and leads to inconsistencies: employees get conflicting results even when looking at the same statistics, because the information becomes outdated. It is beneficial to have a single platform for viewing, analyzing, and sharing data. It provides a single source of truth, ensuring that everyone has access to the most up-to-date information. When data is kept and managed centrally, it is also much easier to implement security and governance regulations. Another vital aspect of data culture is having strong analytical, statistical, and data visualization capabilities. Data visualization can make complex data accessible, letting non-specialists drill into the data to find answers to their questions.
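As a small, generic sketch of the kind of shared, repeatable analysis described above (the file name and column names here are hypothetical), a few lines of Python can replace an ad-hoc spreadsheet calculation and give every colleague the same answer:

```python
import pandas as pd

# Hypothetical shared dataset; in practice this would live in a central
# platform or database rather than on one person's desktop.
sales = pd.read_csv("monthly_sales.csv", parse_dates=["month"])

# One aggregation, written down once, is reproducible for everyone.
by_region = sales.groupby("region")["revenue"].sum().sort_values(ascending=False)
print(by_region)

# A simple chart lets non-specialists drill into the same numbers
# (requires matplotlib to be installed).
by_region.plot(kind="bar", title="Revenue by region")
```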

Should Everyone be Data Literate?
A prevalent misconception regarding data literacy is that only data scientists should devote time to it; in fact, these skills should be developed by all employees. According to a Gartner Annual Chief Data Officer (CDO) Survey, poor data literacy is one of the main roadblocks to the CDO’s success and a company’s ability to grow. To combat this, Gartner predicted that 80% of organizations would have specific initiatives to overcome their employees’ data deficiencies by 2020. Companies with teams that are literate in data and its methodologies can keep up with new trends and technologies, stay relevant, and leverage this skill as a competitive advantage, in addition to reaping financial benefits.

How to Build Data Literacy
1. Determine your company’s existing data literacy level.
Start by assessing your organization’s current level of data literacy. Can your managers propose new projects based on data? How many people today genuinely make decisions based on data?

2. Identify data speakers who are fluent in the language and data gaps.
You’ll need “translators” who can bridge the gap and mediate between data analysts and business groups, in addition to data analysts who can speak naturally about data. Identify any communication barriers that are preventing data from being used to its full potential in the business.

3. Explain why data literacy is so important.
Those who grasp the “why” behind efforts are more willing to support the necessary data literacy training. Make sure to explain why data literacy is so important to your company’s success.

4. Ensure data accessibility.
It’s critical to have a system in place that allows everyone to access, manipulate, analyze, and exchange data. This stage may entail locating technology, such as a data visualization or management dashboard, that will make this process easier.

5. Begin small when developing a data literacy program.
Don’t go overboard by conducting a data literacy program for everyone at the same time. Begin with one business unit at a time, using data to identify “lost opportunities.” What you learn from your pilot program can be used to improve the program in the future. Make your data literacy workshop enjoyable and engaging. Also, don’t forget that data training doesn’t have to be tedious!

6. Set a good example.
Leaders in your organization should make data insights a priority in their own work to demonstrate to the rest of the organization how important it is for your team to use data to make decisions and support everyday operations. Insist that any new product or service proposals be accompanied by relevant data and analytics to back up their claims. This reliance on data will eventually result in a data-first culture.

So, how is your organization approaching data literacy? Is it one of the strategic priorities? Is there a plan to get a Chief Data Officer? Feel free to share your thoughts in the comments section below.



LinkedIn Open Sources Greykite

MMS Founder
MMS Olimpiu Pop

Article originally posted on InfoQ. Visit InfoQ

LinkedIn open sourced Greykite, a Python library that promises to provide accurate forecasts in an interpretable manner, allowing visualizations of the trend, seasonality, and other effects. Built to be flexible, intuitive and fast, the library, according to the LinkedIn team, runs roughly four times faster than Facebook’s Prophet while providing more accurate results for 1-day and 7-day forecasts.

Written in Python, it provides mechanisms for both short- and long-term forecasting. Because it is fast, accurate and intuitive, Silverkite, the library’s main algorithm, is suitable for interactive and automated forecasting at scale. Time series forecasts provide future expectations for metrics and other quantities that are measurable over time.

These models allow businesses to optimize and better prepare for the future from any perspective.

For instance, LinkedIn has used it for resource planning, performance management, optimization and ecosystem insight generation. More concretely, here are a few scenarios in which it is used at LinkedIn:

  1. To provision sufficient infrastructure to handle peak traffic.
  2. To set business metric targets and track progress for operational success.
  3. To optimize budget decisions by forecasting growth of various markets.
  4. To understand which countries are recovering faster or slower after a shock like the COVID-19 pandemic.

With the help of forecasts, LinkedIn’s site reliability engineering (SRE) team ensures site availability in a cost-effective manner: they forecast peak minute-level QPS (queries per second) and service QPS for the next year in order to provision sufficient capacity without excessive buffers and costs. More accurate insights regarding future traffic, corroborated with careful site capacity measurements, enable confident decision making. Every minor cost saving reduces the total cost, so precise forecasts have a big business impact.

Forecasting is also applied within LinkedIn’s Marketing Solutions, where short-term forecasts of budgets, clicks, revenue and other metrics feed into a health dashboard that helps point out potential issues. The forecasts flag deviations and also provide context about which metric dimension or related metric may help explain anomalies. Long-term forecasts allow teams to set metric targets and run routine checks to ensure they are on track to meet them.

The output is interpretable, allowing visualizations of the trend, seasonality, and other effects, along with their statistical significance. The Silverkite algorithm works well on time series with (potentially time-varying) trends and seasonality, repeated events/holidays, and/or short-range effects. At LinkedIn, it was successfully applied to a wide variety of metrics in different time frequencies (hourly, daily, weekly, etc.), as well as various forecast horizons, e.g., 1 day ahead (short-term) or 1 year ahead (long-term).
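As a minimal sketch of what a Silverkite forecast looks like in code, the example below follows the pattern of Greykite’s published quickstart on a synthetic daily series; the exact module paths and parameter names reflect the library at the time of writing and may differ in later versions.

```python
import numpy as np
import pandas as pd
from greykite.framework.templates.autogen.forecast_config import (
    ForecastConfig, MetadataParam)
from greykite.framework.templates.forecaster import Forecaster

# Synthetic daily series with a mild trend and weekly seasonality.
ts = pd.date_range("2019-01-01", periods=730, freq="D")
y = 100 + 0.05 * np.arange(730) + 10 * np.sin(2 * np.pi * ts.dayofweek.to_numpy() / 7)
df = pd.DataFrame({"ts": ts, "y": y})

forecaster = Forecaster()
result = forecaster.run_forecast_config(
    df=df,
    config=ForecastConfig(
        model_template="SILVERKITE",          # Greykite's main algorithm
        forecast_horizon=7,                   # 7-day-ahead forecast
        coverage=0.95,                        # 95% prediction intervals
        metadata_param=MetadataParam(time_col="ts", value_col="y", freq="D"),
    ),
)

# Forecast values with lower/upper bounds for the final 7 days.
print(result.forecast.df.tail(7))
```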

Some key benefits:

  • Flexible: provides time series regressors for trend, seasonality, holidays, changepoints, and autoregression.
  • Intuitive: provides exploratory plots, templates for tuning, and explainable forecasts with clear assumptions.
  • Fast: allows for quick prototyping and deployment at scale.

A benchmark conducted by Greykite’s development team concluded that Silverkite’s out-of-the-box configuration performs better for 1-day and 7-day forecast horizons in comparison with Auto-Arima and Prophet. In terms of average runtime, both Greykite and Auto-Arima performed about 4 times faster than Prophet (as shown in a comparison table published by LinkedIn).

Besides Silverkite, Greykite also supports Facebook Prophet, and there are plans to enable other open-source algorithms in the future.

LinkedIn’s open sourcing of Greykite provides a tool for anybody who wants to be better prepared for the future. It continues the series of tools the company has released to date: Dagli, an ML library for Java; LiFT, a library for measuring fairness in AI models; GDMix, a framework for training AI personalization models; Ambry, an object store for media files; and others. Greykite is available on GitHub and PyPI.

Greykite promises to provide accurate forecasts over both short-term and long-term horizons.



Digital delinquent deletes developer's database during disastrous Docker deployment, defaults …

MMS Founder
MMS RSS

Posted on mongodb google news. Visit mongodb google news

NewsBlur, an RSS news reading app for the web and mobile devices, recently had one of its databases deleted thanks to an insecure default setting that has dogged developers using Docker since 2014.

In a blog post this week, Samuel Clay, founder of NewsBlur, recounted how an unknown vandal deleted a database from his app’s dockerized MongoDB cluster using a “Docker footgun” – something set up in a way that invites shooting oneself in the foot, so to speak.

The incident happened as Clay was in the process of moving NewsBlur, which currently relies on PostgreSQL, MongoDB, Redis, and Elasticsearch databases, to Docker containers in preparation for a redesign. He switched the app’s MongoDB cluster over to the new servers and shut down the original server, intending to delete it once the new setup proved stable.

Clay explains that the Uncomplicated Firewall (ufw) he enabled on his internal servers didn’t work as expected on a new server because of an insecure Docker default.

“When I containerized MongoDB, Docker helpfully inserted an allow rule into iptables, opening up MongoDB to the world,” he explained. “So while my firewall was ‘active,’ doing a sudo iptables -L | grep 27017 showed that MongoDB was open to the world.”
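A quick way to check for this particular exposure is to test the port from a machine outside the network. The sketch below is a generic reachability probe (the hostname is a placeholder), not part of NewsBlur’s setup.

```python
import socket

HOST = "db.example.com"   # placeholder -- substitute your server's public address
PORT = 27017              # MongoDB's default port

# If this connection succeeds from an external machine, the database port is
# reachable from the internet despite any host-level (ufw) firewall rules.
try:
    with socket.create_connection((HOST, PORT), timeout=5):
        print(f"{HOST}:{PORT} is reachable -- MongoDB is exposed to the world")
except OSError as exc:
    print(f"{HOST}:{PORT} is not reachable ({exc})")
```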

The exposed database appears to have been spotted and deleted by an automated ransomware script after about three hours. Clay said he was alerted to the disaster when he received an error message from NewsBlur on his phone; it included “drop”, the database command for deleting databases.

Upon examining his MongoDB installation, he found a new empty database named “READ__ME_TO_RECOVER_YOUR_DATA” that included a demand for 0.03 BTC (~US$1,094).

Clay had no reason to pay the ransom, however, because he determined no data had actually been stolen and he had a backup of the erased database.

Looking through his MongoDB access logs, he was able to spot two connections that occurred right before the deletion that came from a Tor exit node. While some site owners block IP addresses associated with Tor exit nodes, Clay said NewsBlur has not done so to allow people in internet-censored countries to bypass content restrictions and to promote free speech.

The Docker footgun – installing Docker on Ubuntu Linux, where published container ports silently bypass ufw firewall rules – has been a matter of concern among developers for the past seven years. The problem is widely known enough that various online posts offer workaround advice.

Lack of secure defaults is also an issue for various databases. Last year, several thousand inadequately secured databases were deleted in what’s been referred to as a “Meow” attack.

The Register asked Docker why it hasn’t implemented a more secure default but we’ve not heard back. Docker’s documentation does warn that it manipulates iptables, the command-line utility used to configure IP packet filtering rules.

We put the same question to Clay, who responded, “Your guess is as good as mine. It’s sort of like the trade off between convenience and security. Here convenience won out.” ®

Article originally posted on mongodb google news. Visit mongodb google news



MongoDB Announces Proposed Public Offering of Class A Common Stock

MMS Founder
MMS RSS

Posted on mongodb google news. Visit mongodb google news

NEW YORK, June 29, 2021 /PRNewswire/ — MongoDB, Inc. (“MongoDB”) (Nasdaq: MDB), the leading, modern general purpose database platform, today announced that it has commenced an underwritten public offering of 2,300,000 shares of its Class A common stock. All of the shares in the proposed offering will be sold by MongoDB. The proposed offering is subject to market and other conditions, and there can be no assurance as to whether or when the proposed offering may be completed, or as to the actual size or terms of the offering.


MongoDB expects to use the net proceeds from the offering for general corporate purposes.

Morgan Stanley and Goldman Sachs & Co. LLC are acting as lead book-running managers for the proposed offering. Siebert Williams Shank, Guzman & Company, Tigress Financial Partners and Drexel Hamilton are acting as co-managers for the proposed offering.

The proposed public offering is being made pursuant to a shelf registration statement on Form S-3 that was filed by MongoDB with the U.S. Securities and Exchange Commission (the “SEC”) and automatically became effective upon filing. A preliminary prospectus supplement and accompanying prospectus relating to and describing the terms of the offering have been filed with the SEC and are available on the SEC’s website at www.sec.gov. Copies of the final prospectus supplement and accompanying prospectus, when available, may be obtained from Morgan Stanley & Co. LLC, Attention: Prospectus Department, 180 Varick Street, 2nd Floor, New York, NY 10014; or Goldman Sachs & Co. LLC, Prospectus Department, 200 West Street, New York, NY 10282, telephone: 1-866-471-2526, facsimile: 212-902-9316 or by emailing Prospectus-ny@ny.email.gs.com.

This press release shall not constitute an offer to sell or the solicitation of an offer to buy, nor shall there be any sale of these securities in any state or jurisdiction in which such offer, solicitation or sale would be unlawful prior to registration or qualification under the securities laws of any such state or jurisdiction.

About MongoDB

MongoDB is the leading modern, general purpose database platform, designed to unleash the power of software and data for developers and the applications they build. Headquartered in New York, MongoDB has more than 26,800 customers in over 100 countries. The MongoDB database platform has been downloaded over 175 million times and there have been more than 1.5 million registrations for MongoDB University courses.

Forward-Looking Statements

This press release contains forward-looking statements within the meaning of the Private Securities Litigation Reform Act of 1995. Words such as “may,” “will,” “expect,” “plan,” “anticipate,” “estimate,” “intend” and similar expressions (as well as other words or expressions referencing future events, conditions or circumstances) are intended to identify forward-looking statements. Forward-looking statements contained in this press release include statements relating to MongoDB’s expectations regarding the completion, timing and size of the proposed public offering, and MongoDB’s planned use of the proceeds from the proposed public offering. These forward-looking statements are based on MongoDB’s expectations and assumptions as of the date of this press release. Actual results may differ materially from these forward-looking statements. Each of these forward-looking statements involves risks and uncertainties. These risks and uncertainties include, without limitation, risks and uncertainties related to market conditions and satisfaction of customary closing conditions related to the proposed public offering. There can be no assurance that MongoDB will be able to complete the offering on the anticipated terms, or at all. Other factors that may cause actual results to differ from those expressed or implied in the forward-looking statements in this press release are discussed in MongoDB’s filings with the SEC, including under the heading “Risk Factors” contained therein, as well as the risks identified in the registration statement and the preliminary prospectus supplement relating to the offering. Except as required by law, MongoDB assumes no obligation to update any forward-looking statements contained herein to reflect any change in expectations, even as new information becomes available.

Contacts

Investor Relations Contact:
Brian Denyeau
ICR for MongoDB
646-277-1251
ir@mongodb.com

Media Relations Contact:
Matt Trocchio
MongoDB
communications@mongodb.com


View original content to download multimedia:https://www.prnewswire.com/news-releases/mongodb-announces-proposed-public-offering-of-class-a-common-stock-301322511.html

SOURCE MongoDB, Inc.

Article originally posted on mongodb google news. Visit mongodb google news



DSC Weekly Digest 29 June 2021

MMS Founder
MMS RSS

Article originally posted on Data Science Central. Visit Data Science Central


Heatwave

Seattle, Washington, where I live, cracked 112°F at the peak of a heatwave that moved through the region this week. For this time of year, the average high temperature is about 65°F, with a standard deviation of 11°F. This means that the heatwave was roughly a 4.3 sigma event ((112 – 65)/11 ≈ 4.3). An event like this should occur only about once in 128 years, and while this shattered the record books, Seattle has seen more 100°F+ days in the last decade than it has in all the years since weather records began in the 1880s.
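A quick back-of-the-envelope check of that sigma figure, assuming daily highs are normally distributed (itself a simplification; the implied return period also depends on whether a one- or two-tailed probability is used, so the figure above should be read as a rough estimate):

```python
from scipy.stats import norm

peak, mean, sd = 112, 65, 11
z = (peak - mean) / sd        # about 4.3 standard deviations above the mean
p = norm.sf(z)                # one-tailed probability of a day at least this hot
print(f"z = {z:.2f}, one-tailed daily probability = {p:.1e}")
```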

While such heat is certainly no stranger to Texas, this week saw another record – the largest hailstone to hit the state. At 20in x 6in x 6in, had it hit a person, it would have killed them. In Miami, an apartment complex collapsed, after having sunk in ground that was becoming increasingly swamplike from ocean water undermining it.

The world has seen more category 5 storms in the last five years than it had in the fifty years prior, and in the Arctic 90+°F temperatures were recorded at several stations, even as the Arctic Ocean has become completely circumnavigable for much of the year.

The meteorology and climatology fields employ as many data scientists as healthcare, and the models that they create are some of the largest and most complex in the world. Not surprisingly, insurance companies (also big data science employers) work very closely with meteorologists, as weather trends impact health (including pandemics), agriculture, real estate, and sectors of the economy. Those insurance companies are becoming increasingly vocal about climate change because they are having to bear the costs of the growing risks from heat, severe storms, drought, and so forth, costs that are making their way into the goods and services that consumers pay for.

What makes this an area that all data scientists should be watching closely (and gaining more domain expertise in) is that as we refine our models of what is happening right now, what emerges is that not only is anthropogenic global warming real, but the impacts the models predict are becoming more dire, not less. If it can hit 112°F in Seattle, temperatures can hit 140°F in Florida and Arizona (and Mumbai and Dubai), the point at which the human body starts shutting down because the brain is overheating. The weather is now entering into our modeling in ways that we never expected.

The last few years have seen a significant portion of the population sticking their heads in the (increasingly hot) sand, trying to deny the potential dangers of climate change, but that’s a narrative that becomes harder and harder to justify when Seattle streets are melting, Florida apartments are collapsing and insurance costs are making business as usual too expensive to continue. It is time for data scientists to speak out as well, even when the analysis they bring is one that business and social leaders find to be an inconvenient truth.

In media res,
Kurt Cagle
Community Editor,
Data Science Central

Data Science Central Editorial Calendar

DSC is looking for editorial content specifically in these areas for July, with these topics having higher priority than other incoming articles.

  • MLOps and DataOps
  • Machine Learning and IoT
  • Data Modeling and Graphs
  • AI-Enabled Hardware (GPUs and similar tools)
  • Javascript and AI
  • GANs and Simulations
  • ML in Weather Forecasting
  • UI, UX and AI
  • Jupyter Notebooks
  • No-Code Development




Mongodb Inc (MDB) COO and CFO Michael Lawrence Gordon Sold $2.9 million of Shares

MMS Founder
MMS RSS

Posted on mongodb google news. Visit mongodb google news

Motley Fool

Dump FAANG and Buy These 3 Cheaper Growth Stocks Instead

Buying shares of top tech stocks Facebook, Apple, Amazon, Netflix, and Alphabet (Google) — otherwise known as FAANG — would have earned great returns in the past. If you are looking to make the most of your investment dollars, you should consider buying shares of Pfizer (NYSE: PFE), Oracle (NYSE: ORCL), and UPS (NYSE: UPS) instead of FAANG.

Article originally posted on mongodb google news. Visit mongodb google news



Google Open-sources Fully Homomorphic Encryption Transpiler

MMS Founder
MMS Sergio De Simone

Article originally posted on InfoQ. Visit InfoQ

Google has open-sourced a general-purpose transpiler able to convert high-level code to be used with Fully Homomorphic Encryption (FHE).

While FHE is attracting a lot of interest from several companies, including IBM and Microsoft, Google is attempting a novel approach here by creating a transpiler that transforms a program written in a high-level language and operating on non-encrypted data into an FHE-ready version.

Google’s transpiler will enable developers to write code for any type of basic computation such as simple string processing or math, and run it on encrypted data.

Google’s transpiler has two major components. On the one hand, it uses Google’s open-source XLS SDK to leverage its compilation pipeline and convert higher-level language operations into lower-level boolean operations as required by FHE.

XLS implements a High Level Synthesis (HLS) toolchain which produces synthesizable designs from flexible, high-level descriptions of functionality. It is fully Open Source: Apache 2 licensed and developed via GitHub. XLS is used inside of Google for generating feed-forward pipelines from “building block” routines / libraries that can be easily retargeted, reused, and composed in a latency-insensitive manner.

On the other hand, it uses Google’s TFHE fully homomorphic encryption library to go from the intermediate representation provided by XLS to an FHE computation.

TFHE is a C/C++ library which implements a very fast gate-by-gate bootstrapping […]. The library allows to evaluate an arbitrary boolean circuit composed of binary gates, over encrypted data, without revealing any information on the data.

This modular design has a number of advantages, according to Google. First, a number of different high-level languages are supported out of the box thanks to XLS. At the moment, XLS supports C++ and DSLX, a DSL that mimics Rust. Likewise, the output FHE-ready code can be in any language with an FHE library that exposes logical gates as part of its API.

It should be noted that XLS does not fully support all C++ features. In particular, variable-length arrays, while-loops and for-loops with a variable end condition, and floating point data are not supported. Additionally, both XLS and TFHE are still in an experimental stage and bound to change significantly.

Homomorphic encryption is an approach to secure computation that does not require decrypting your data in order to process them. Instead, homomorphic encryption enables processing ciphertexts with the guarantee that the decrypted result matches the one that would be produced by processing the unencrypted input data directly. Among its applications, FHE could be used to train machine learning models on sensitive data, says Google.
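The homomorphic property described above can be illustrated with a toy example. The sketch below uses textbook RSA, which is only multiplicatively homomorphic and insecure at these parameters; it shows the general idea of computing on ciphertexts without decrypting them, not Google’s transpiler or the TFHE scheme.

```python
# Toy, insecure parameters -- for illustration only.
p, q, e = 61, 53, 17
n = p * q
d = pow(e, -1, (p - 1) * (q - 1))      # private exponent (requires Python 3.8+)

def encrypt(m):
    return pow(m, e, n)

def decrypt(c):
    return pow(c, d, n)

a, b = 7, 6
# Multiply the ciphertexts without ever seeing the plaintexts...
c_product = (encrypt(a) * encrypt(b)) % n
# ...and the decrypted result equals the product of the plaintexts.
assert decrypt(c_product) == a * b
print(decrypt(c_product))               # 42
```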
