Mobile Monitoring Solutions


Article: InfoQ 2020 Recap, Editor Recommendations, and Best Content of the Year

MMS Founder
MMS RSS

Article originally posted on InfoQ. Visit InfoQ

As 2020 comes to an end, we created this article listing some of the best posts published this year. This collection was hand-picked by nine InfoQ editors recommending the greatest posts in their domains. It’s a great piece to make sure you don’t miss out on some of InfoQ’s best content.

By Leandro Guimarães, Arthur Casals, Charles Humble, Daniel Bryant, Johan Janssen, Manuel Pais, Renato Losio, Shane Hastie, Steef-Jan Wiggers, Thomas Betts

Subscribe for MMS Newsletter

By signing up, you will receive updates about our latest information.

  • This field is for validation purposes and should be left unchanged.


4 Key Principles of Data Collection

MMS Founder
MMS RSS

Article originally posted on Data Science Central. Visit Data Science Central

In today’s data-driven world, companies across the globe are plunging into the enormous pool of online customer information made available through different modes of data collection. More than 2.5 quintillion bytes of data are created each day, and this number just keeps growing. Moreover, one study found that 67% of customers were willing to share their data in exchange for benefits such as discounts.

Online customer data such as social media activities, search engine tracking, and purchase history are being generated on a regular basis. This data gives eCommerce companies a peek into customer demographics, preferences, and interests, creating a customer profile useful for targeting specific products to customer needs at just the right time. Hence, data collection should be a priority for these companies to create customized marketing strategies.

“Analytical insights from data are only as useful as the quality of data collected.”

This brings us to the simple truth that accurate data collection plays a pivotal role in creating prolific marketing strategies yielding results for eCommerce companies.

Here are four reasons why collecting accurate customer data is useful for eCommerce companies:

  1. Promoting relevant products through multiple communication channels like email, SMS, push & pull notifications, etc.
  2. Segmenting customers more accurately based on their behavior, preferences, and interests.
  3. Developing customized marketing campaigns with a personal touch, using information such as birthday, locality, age group, etc.
  4. Increasing customer lifetime value by predicting customer requirements based on purchase history information.
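The segmentation idea in point 2 can be sketched in a few lines of code. The thresholds, field names, and segment labels below are invented for illustration only, not taken from any real system:

```python
# Minimal customer-segmentation sketch: bucket customers by how often
# and how recently they purchase. Thresholds are illustrative only.
def segment(customer):
    orders = customer["orders"]
    days_since_last = customer["days_since_last"]
    if orders >= 10 and days_since_last <= 30:
        return "loyal"       # frequent and recent buyers
    if orders >= 3:
        return "active"      # repeat buyers
    if days_since_last > 180:
        return "lapsed"      # long-inactive customers
    return "new"

customers = [
    {"id": 1, "orders": 12, "days_since_last": 7},
    {"id": 2, "orders": 1, "days_since_last": 200},
]
segments = {c["id"]: segment(c) for c in customers}
```

A real system would derive such segments from behavioral data (transactions, online activity) rather than hand-picked thresholds, but the shape of the logic is the same.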

To ensure these strategies pan out as desired, you need to follow the basic principles of data collection.

“The basic principles of data collection include keeping things as simple as possible; planning the entire process of data selection, collection, analysis, and use from the start; and ensuring that any data collected is valid, reliable, and credible. It is also important that ethical issues are considered.”

The process and principles of data collection have been around for a long time; remember creating questionnaires and asking customers to fill them in? Be it offline or online data collection, the principles applied remain the same.

Let’s have a look at the four principles of data collection one should follow from an online data collection perspective.

Principle 1: Identify the data you need to collect (Get the right data) 

This principle simply states that the data needed for analysis should be decided at the beginning of the data collection process. Do not collect data without figuring out how it will be analyzed or be useful; in such scenarios, it is highly likely you will end up with loads of data that is not useful. For example, auto-part dealers need information about customers’ car models, whereas fashion store owners need information about customers’ ages.

How to go about it?

The customer data you need depends on the type of business you run and the products you are selling. Hence, it is critical to understand what sort of data you need. Businesses usually collect the following types of data based on their requirements:

  • Behavioral data

This data tells you how customers behave when they interact with your company. Because it is measurable, analyzing it helps you improve your customers’ interactions with your company and increase conversion rates.

Data types: Transactional data, communication data, online activity, customer relations

  • Identity data

This data usually consists of customer demographics, helping you create an ideal customer profile that plays a vital role in forming customized marketing strategies.

Data types: Name, email, phone number, date of birth, gender, social media profile, company name, position

  • Descriptive data

This data consists of personal information beyond identity data. It helps you understand the customer at a personal level and create appropriate strategies aimed at offering additional products and services; for example, knowing that a person has bought a new car and is interested in music can help you cross-sell car speakers.

Data types: marital status, number of kids, lifestyle, hobbies, pets

  • Qualitative data

This type of data is usually acquired through surveys. It includes customer preferences, desirability, and sentiments, and provides insight into the customer’s perception of your company.

Data types: Any kind of customer survey.

Principle 2: Authenticate your source of data (Get data the right way)

Getting data the right way encompasses the source of the data. If the source is flawed, the data is bound to give you insignificant or irrelevant analytical insights, ultimately leading to a flawed marketing strategy. Hence, you need to choose the source of the required data very wisely.

How to go about it?

Here are a few sources you may need:

  • Website analytical tools

If you are looking to analyze the behavioral data of your customers, a web analytics tool is the answer. Analyzing parameters such as bounce rates, page views, etc. can give you better insights into your customers’ interests.

  • Online surveys

The requirement of consent and approval from the participant makes this the most authentic source of data collection. It gives you an in-depth analysis of customers’ opinions and sentiments, and is an ideal source for qualitative and descriptive data.

  • Customer interviews & feedback

This can be done through post-purchase feedback forms, call or video-call interviews, or focus group meetings. It gives critical insight into customers’ preferences and opinions, and is ideal for identity, descriptive, and qualitative data.

  • Social media engagements

Customers present their unbiased views and preferences on social media platforms. Social listening can help you identify and analyze consumers’ interests, opinions, and experiences. It is ideal for identity and qualitative data.

Based on the data that you need to gather, a mixed-use of all these sources can help you create a successful marketing strategy.

Principle 3: Validate the data you collect for errors and reduce them (Get the data right)

No matter how authentic the source, there is still a possibility of errors depending on the tools used for data collection. As the success of your marketing efforts depends on accuracy, following a proper data validation process is essential. For example, a customer may mark a preference for a red shoe over a blue shoe in an online survey, while social media listening suggests that their favorite color for outfits is blue.

How to go about it?

Especially when it comes to identity data like name, email, and phone number, accuracy and completeness are highly critical, so validation is non-negotiable. Data validation and enrichment call for the implementation of data cleansing techniques.
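As a rough sketch of what such validation might look like, the following deliberately simple checks flag incomplete or malformed identity records before they enter a marketing database. The patterns and field names are illustrative only, not production-grade validators:

```python
import re

# Simplified identity-data validation: check completeness and basic format
# of name, email, and phone. The patterns are loose sketches, not
# RFC-complete validators.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
PHONE_RE = re.compile(r"^\+?\d{7,15}$")

def validate_record(record):
    errors = []
    if not record.get("name", "").strip():
        errors.append("name missing")
    if not EMAIL_RE.match(record.get("email", "")):
        errors.append("email invalid")
    if not PHONE_RE.match(record.get("phone", "").replace(" ", "")):
        errors.append("phone invalid")
    return errors  # an empty list means the record passed validation

clean = validate_record({"name": "Ada", "email": "ada@example.com", "phone": "+14155550123"})
dirty = validate_record({"name": "", "email": "not-an-email", "phone": "123"})
```

In practice this sits at the start of a cleansing pipeline; records with a non-empty error list are routed to enrichment or manual review rather than discarded.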

Principle 4: Get current and updated data (Get the data right away)

Data degrades over time and leads to inaccurate and incorrect insights. Data such as customers’ emails, phone numbers, companies, positions, and addresses change over time. Even preferences and interests are associated with factors such as age and marital status, and hence are bound to change. It is therefore essential to ensure that the data collected is current and up to date.

How to go about it?

Select data collection techniques and sources that can provide you with the latest information on your consumers. For example, government records have a high probability of containing obsolete data, whereas customer interviews provide you with the latest information. Data degradation can also be avoided through thorough cleansing and efficient data management.

Conclusion

Today, the data-oriented approach has left us with a massive amount of information at our disposal. Hence, the need to select the right data to collect, from an authentic source, with high accuracy is indispensable. As a business person or marketer, it is your prime responsibility to facilitate high-quality data collection, gain critical insights, and create impeccable marketing strategies customized according to the needs and wants of the customer.  



Airbnb Showkase: a Browser for Your JetPack Compose Library

MMS Founder
MMS Sergio De Simone

Article originally posted on InfoQ. Visit InfoQ

Airbnb Showkase aims to help developers organize, discover, and visualize their Jetpack Compose UI elements by synthesizing a browser activity based on specific code annotations.

By Sergio De Simone



Blockchain emerging as Next-Generation Data and Model Governance Framework

MMS Founder
MMS RSS

Article originally posted on Data Science Central. Visit Data Science Central

Introduction and Motivation

Blockchain technology has laid a strong foundation for applications in asset management, medical/health, finance, and insurance. The data analytics provided by a blockchain network enables efficient data management, analysis, privacy, quality assurance, access, and integration in heterogeneous environments.
The role of blockchain in data privacy is becoming stronger as breakthroughs in quantum computing threaten to render present encryption technologies ineffective and susceptible to brute-force attacks. As the volume of data that blockchain networks store is also rapidly increasing over time, let’s explore how blockchain technology can play a dominant role in Data Governance.
Its ability to store failed transactions also points to bitcoin’s huge potential in financial and banking services.

In this blog, let’s explore the key functionalities of blockchain that allow it to serve as a data governance framework, and see an example using Google Cloud Platform.

The blockchain architecture promotes a data store with a range of governance functionalities at a time when regulatory mechanisms like GDPR have come into force. It is known as a unique data processing platform that removes the need for a centralized authority. Moreover, as an append-only, permanent data store, it comes with characteristic features that ensure high-quality data across organizational units, data security, consistency, and regulatory-risk management, as discussed below:
  • Transparency – Data stored on a blockchain is accessible to all participants with internet access.
  • Immutability – This feature comes from the distributed consensus process, which supports a public audit trail. As all transactions in the blockchain network are stored as immutable records, it ensures durability: nothing can be deleted or modified.
  • Consistency – Data stored under a distributed consensus protocol yields a single source of truth across the blockchain network, ensuring consistency, one of the essential characteristics of any governance framework.
  • Equal rights – The disintermediation feature allows every participant of the network to have the same rights to manipulate and access the blockchain. Access rights are governed by the computation power or stake owned by the participant, depending on the consensus protocol.
  • Availability – Every node in the blockchain network preserves a full replica of the blockchain data, which remains available as long as the nodes do.
  • Data Provenance – The blockchain serves as a historic record, storing all transactions and data lineage. The history of data and its migrations is tracked, with each change tied to its author, and any rollbacks to a previous state are accounted for. The framework further provides access to data lineage where data needs to be moved and operated upon across organizational lines.
  • Traceability – The audit trail and data lineage add to the traceability of data.
  • Compliance – Blockchain provides sufficient attributes to follow standards, conventions, regulations, and similar rules relating to data quality.
  • Confidentiality – The framework allows access only to authorized users, thereby providing confidentiality.
  • Credibility – The framework provides credibility, in that its data is taken as true and believable by users.
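The immutability, provenance, and traceability properties above come from hash chaining. As a minimal illustration (a toy sketch, not a real blockchain with consensus or networking), each block commits to the previous block's hash, so tampering with any historical record invalidates every later hash:

```python
import hashlib
import json

# Toy hash-chained ledger: each block stores the hash of its predecessor,
# so any change to history breaks verification of the whole chain.
def block_hash(body):
    payload = json.dumps(body, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, data):
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "data": data, "prev_hash": prev}
    block["hash"] = block_hash({k: block[k] for k in ("index", "data", "prev_hash")})
    chain.append(block)
    return chain

def verify(chain):
    for i, block in enumerate(chain):
        body = {k: block[k] for k in ("index", "data", "prev_hash")}
        if block["hash"] != block_hash(body):
            return False  # block contents were altered after hashing
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False  # chain linkage broken
    return True

chain = []
append_block(chain, {"tx": "alice->bob", "amount": 5})
append_block(chain, {"tx": "bob->carol", "amount": 2})
assert verify(chain)
chain[0]["data"]["amount"] = 500   # tamper with history...
assert not verify(chain)           # ...and verification fails
```

A production blockchain adds distributed consensus and replication on top of exactly this structure, which is what turns the tamper-evidence shown here into the governance properties listed above.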
In fact, a bitcoin network resembles a database in terms of providing Data Governance functionalities, embedding the sequence of data-transition steps within itself. As illustrated in the figure below, the bitcoin network, like a database, preserves consistency and validity with complete state and history, and maintains governing expectations that can be shared among multiple stakeholders and operated on independently.

Applications that utilize blockchain technology include:

  • Currency (e.g., Bitcoin and micro-payments)
  • Contracts (e.g., escrow and automated insurance processes based on agreed terms)
  • Asset management tools (e.g., land registry and digital coupons)

Given the need for Data Governance in cryptocurrencies, particularly for transaction fees (e.g., smart contract executions) and tokenized assets (value or equity), organizations using blockchain as a Data Governance framework should ensure that the content of a smart contract is GDPR compliant. In addition, the public key of a blockchain account, which is Personally Identifiable Information (PII), should be protected using any of:
  • Key mixing (especially in UTXO-based blockchains)
  • Value-transfer protections (e.g., zero-knowledge proofs/arguments)
  • Blind signatures and data payloads (e.g., encryption and read permissions as assets)
While several studies have addressed a range of governance issues for data on blockchains, there is no comprehensive approach that deals with those issues and effectively orchestrates data management processes. This points to the need for a novel governance framework covering both blockchain platforms and blockchain-based applications.

Blockchain As Data Governance Accelerator

The blockchain serves global financial services, not limited to insurance and investments; through its cost-competitive operational models, it can also accelerate and enhance businesses’ risk-management functions.
  • MDM and blockchain can reap benefits from mutual integration.
  • MDM can utilize blockchain for data distribution and Data Governance, delegating access to master data to blockchain technology.
Data Governance with blockchain works as a unit to offer insurance customers more secure products. For instance, Coinbase, a large Bitcoin wallet, is known for protecting people by providing insurance against employee theft and hacking.
Along with MDM, the foremost capabilities of blockchain lie in building an ecosystem of increased trust as part of a Data Governance framework.
  • Access to third-party controls, with enhanced risk management
  • Decentralized ledger systems with full auditing features
  • Timely alerts and notifications informing changes
  • Acts as a guarantor for the integrity of a digital representation of a physical entity.
  • Acts as a Data Chain of Custody
  • Helps in the reinterpretation of Events
  • Source of Truth and Doubt
  • Ability to create blockchain-based data escrow by a combination of cryptographic techniques like secret sharing with smart contracts, where the encrypted and published data is available to a critical number of stakeholders.
One use of blockchain as Data Governance in the supply-chain industry is certifying food products to ensure their authenticity. The metadata about the certificates can be stored on a blockchain, and a buyer can verify a purchased product by checking its certificates against the metadata stored on the blockchain.

BlockChain Data Governance with GCP

The following figure illustrates a traditional Data Governance architecture with Google Cloud.
The Data Catalog API supports the ingestion of technical metadata from non-Google Cloud data assets as well. 
In addition, its integration with Cloud Data Loss Prevention (Cloud DLP) enables users to run Cloud DLP inspection jobs on BigQuery. This, in turn, helps to automatically create Data Catalog tags for identifying PII data.
Hybrid Blockchains
Blockchain.com’s tech team began hosting some of its IT infrastructure within GCP Compute Engine instances and added Google Cloud Platform’s Managed Services:
GCP makes it easy to get the basics of security right. Google Cloud goes above and beyond to protect data, infrastructure, and services from external threats, while internally, the permission model integrated with Google Workspace gives granular control over access rights.
Public blockchain data are freely available in BigQuery through the Google Cloud Public Datasets Program for eight different cryptocurrencies which are referred to here as Google’s crypto public datasets.
Cloud Spanner – Cloud Spanner allows fast scaling (with no downtime) and provides high availability and strong consistency with low operational overhead by leveraging globally distributed databases. This cost-effective solution helps ingest raw blocks from Ethereum nodes in real time, transform that data, and persist it in Cloud Spanner. It is also capable of restoring huge databases in just hours.
Cloud Identity Access Management [Cloud IAM] and VPC firewall allow Blockchain to lock down access to resources according to the least privilege principle and implement defense in depth.
Stackdriver – Its logging and monitoring capability alerts you to any unusual activity in real time.
Cloud Identity-Aware Proxy (Cloud IAP) – Verifies user identity within the customer-facing part of the platform and within its back-office application environments.
Authentication – An easy authentication mechanism allows applications to be activated based on G Suite/Google Workspace accounts.
Further, watch this YouTube video to build a “Blockchain on GCP using Hyperledger Fabric and Composer“.

Blockchain limitations

  • Latency in establishing consensus.
  • Sensitive data may need to be forgotten, which conflicts with immutability.
  • Due to high storage volumes, features like auto-deletion beyond a certain volume of data have to be incorporated; however, these make other blockchain functions more complex.
  • Network governance could break down.
  • A monopoly, or a single organization controlling the data ecosystem (the generation, access, and regulation of all data on a blockchain), limits its value; the network should therefore encourage the involvement and participation of peers.
  • Reading blockchain transactions involves receipt-based, transient, synchronous communication that does not directly return results or indicate whether the transaction was successful.

Future Work & Unanswered Questions

Blockchain technology looks promising for next-generation Data Governance Framework with the following enhanced business functionalities:
  • Better decision making, with consistency, completeness, and accuracy
  • Operational efficiencies, with fact-based decisions made on real-time events
  • Improved data understanding and lineage (removing confusion, adding clarity and meaning)
  • Data alignment leading to regulatory compliance
  • Increased revenue from added data confidence and insight sharing
Nevertheless, much work and research remains to be done on:
  • Strategizing end-to-end machine learning pipelines, with online and batch/stream processing of blockchain events
  • Whether we can integrate AI/ML algorithms with blockchain events (e.g., a live cryptocurrency feed) along with events from disparate data sources (say, IoT)
  • How we ensure the privacy and decentralization of non-blockchain events in the same pipeline
  • Whether, in doing so, we should plan to add Data Catalog and DLP to the architecture
  • How we assemble and ensure fair data for blockchain (cryptocurrency exchange) events and build fair ML models

Role of BlockChain in Machine Learning Model Governance

Note: Blockchain for Data and Model Governance (US20200082302A1, Application US16/128,359)
There are key processes in building ML models and helping others in the team or organization understand them and collaborate. Some of the crucial requirements for model understanding and feature sharing across teams within an organization include:
  • Features causing bias, model sensitivity, or target leaks.
  • The steps to build the model, and the datasets used for training, validation, and testing.
  • Model interpretability explaining the factors behind the model’s behavior.
  • Model explainability and accountability for all use cases, which could alleviate risk and save time and effort in model re-development.
  • How blockchain can help to codify accountability.
To know more, check out Going Beyond Bitcoin: Critical Applications for Blockchain in AI and its associated patent, where the inventor describes techniques to codify the analytical and granular steps of machine learning model development using blockchain technology. This helps to associate the chain of entities, work tasks, and requirements with a model, including testing and validation checks.
In addition, putting the model development process on the blockchain helps the governance process by:
  • Providing the analytic model with its own entity, life, structure, and description, with detailed structure and documentation
  • Helping to create models with more explainability and less bias, with an increased focus on delivering ethical and explainable AI technology solutions
  • Increasing the scope for future work by creating essential assets for the organization
  • Facilitating the analytic tracking document (ATD) and agile model development processes, which can be used by parties outside the development organization; this also helps regulatory bodies in model audits

Conclusion

Blockchain techniques are uniquely suited to data governance systems.
Blockchain technology comes with built-in Data Governance: its capability to maintain history can honor privacy and rightful usage. In general, blockchain networks are particularly well suited to peer organizations or business units that cooperate in a mutually beneficial manner, or where the aforementioned regulations control the movement or sharing of data among them.
The role of blockchain in Model Governance can transform the entire AI/ML pipeline, with not only model-level information and cause-effect relations but also accounting details, for data scientists and big data, analytics, and QA professionals, of each change, transformation, re-training, validation, and test undertaken at each sprint within each story.



How Cyber Attacks Have Affected Businesses in 2020

MMS Founder
MMS RSS

Article originally posted on Data Science Central. Visit Data Science Central

Cyber-attacks have been bothering people lately; however, 2020 shows us that there are no limits for the people behind these crimes. Hackers used to crack people’s websites and personal accounts, but now it is more about catching a bigger fish. The spheres under significant stress from cyberattacks at the moment include small businesses, the world economy, and even politics.

Meanwhile, malicious programs penetrate the systems of governmental and business institutions and the personal computers of average citizens. It is time to think about your protection in advance. Consider this article to see how security software can fit your needs, and which features you need right now.

These are the options aimed at protecting against the common risks that the business and political spheres experience nowadays. In this regard, let’s take a more detailed look.

Small Businesses Are at Risk

Returning to the attacked spheres, let us see what significant cybercrimes have happened lately and how they affected the world. Nearly every organization, whether a corporation or a small company, is a target for cyberattacks. Yet if you own a small business, you should realize that this sector is the most vulnerable, and whether you want it or not, you will have to spend some money on durable antivirus software.

The most important thing to keep in mind is that cybercriminals are not standing still; the attacks become more sophisticated and sly with every month. Modern antiviruses can detect a damaging file, yet not all programs can protect you from a newly created virus. For that reason, it is fair to conclude that sooner or later, every small company is likely to be attacked.

The devastating effects of the hacking process are also related to the fact that online criminals foresee a particular company’s risks. In other words, they study the organization from the inside, focus on the spots that can be a perfect breach target, and do whatever it takes to hack the security system. Besides, keep in mind that there is always a probability of an insider threat.

Why is the Segment so Luring?

As of 2020, 28% of attacks involve small businesses. That is a considerable percentage, and it makes you wonder what makes the segment so vulnerable against the background of other businesses. The answer is the level of protection: big corporations cannot close their eyes and make do with popular consumer antiviruses, given the scope of the possible consequences.

Cybercriminals tend to diversify and broaden the field of their activity, so you should be ready to guard your company, your employees, and, of course, your customers. Interestingly, the motives of the criminals are financial (83%), espionage-related (8%), and fun-related (3%). Some of the typical attacks are as follows:

  • Spyware – 46%
  • Captures of stored data – 34%
  • Stolen credit cards usage – 30%
  • Export data – 29%
  • Backdoor – 28%
  • Backdoor and C2 usage – 26%
  • Phishing – 22%, etc.

Note that weak passwords are not good for businesses either, and hackers will exploit your neglect of the rules. Weak passwords are often an issue that comes from employees’ negligence. All actions that come from inside the organization – the activity of employees, associates, or business contractors – comprise a potential insider threat.
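As a hedged illustration of the weak-password problem, a minimal strength heuristic might look like the following. The length and character-class thresholds are arbitrary choices for the example; a real policy would also check breach lists and common-password dictionaries:

```python
import string

# Rough password-strength heuristic: minimum length plus variety of
# character classes (lowercase, uppercase, digits, punctuation).
def is_weak(password):
    classes = [
        any(c.islower() for c in password),
        any(c.isupper() for c in password),
        any(c.isdigit() for c in password),
        any(c in string.punctuation for c in password),
    ]
    # Weak if shorter than 12 characters or using fewer than 3 classes.
    return len(password) < 12 or sum(classes) < 3

assert is_weak("password123")
assert not is_weak("Tr%9vb!qLm2x")
```

Even a crude check like this, enforced at account creation, removes the lowest-hanging fruit that attackers rely on when employees pick their own credentials.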

An insider threat appears when contributors access sensitive data and use it for their own benefit. On the other hand, a data leak may happen due to a person’s inattentiveness or unprofessionalism. You should be aware of such cases to recognize the root of the problem in situations where an insider brings down an entire company.

Economic Impact

With growing interest in poorly protected small businesses, global security risks grow as well, eventually posing serious problems for the economy of the entire world. The Global Risks Report 2020 suggests that by 2030, cybercrime will be the second-biggest risk in global commerce. That sounds odd, considering the technologies people already have; the trouble is that hackers are as smart as whips.

Sure, small companies suffer the most, yet big corporations also face impactful attacks, and this problem will not disappear. Big organizations have branch offices and sub-companies around the world, and if one of them is affected, then the revenue, reputation, and profits of the entire organization will experience tough times.

Public Impact

Another indicator of small businesses’ vulnerability is the attacks on the public sector and governmental entities. It may seem obvious that cybercriminals would be interested in governmental organizations and the accounts of politicians. The reasons for such breaches vary from personal benefit, like extortion over confidential information, to political views.

Lately, the entire world watched the contest between Donald Trump and Joe Biden; one of the discussed topics was the possibility of a ransomware attack during the elections, especially considering how high emotions ran during the candidates’ debates.

The reason for such concern relates to an incident back in 2016, when ransomware hit election systems in the state of Florida. Despite the recovery, the event left an unpleasant aftertaste, and people realized that it was no joke: cyber attacks were genuine and could undermine the system. If such a public sector is vulnerable, then small businesses are under threat as well.

In this regard, the reaction created by previous ransomware was telling. Beyond political institutions, Americans saw the issue from a broader perspective: if elections were that easy to compromise, it would be even easier to compromise banks, hospitals, and so on.

Bottom Line

As you can see, cybercriminals are not something to ignore, and the number of attacks has grown rapidly over the last decade. There are three sectors in the limelight: small businesses, the global economy, and politics. In most cases, the reason for a breach was a human mistake, weak security measures, or unawareness of the problem’s seriousness.



A simple way to understand the statistical foundations of data science

MMS Founder
MMS RSS

Article originally posted on Data Science Central. Visit Data Science Central

Introduction

There are six broad questions that can be answered in data analysis, according to an article called “What is the question?” by Jeffery T. Leek and Roger D. Peng. These questions help to frame our thinking about data science problems. Here, I propose that they also provide a unified framework for relating statistics to data science.

The six questions according to Leek and Peng are:

A descriptive question seeks to summarize a characteristic of a dataset; you do not interpret the results. For example: the number of fresh fruits and vegetables served in a day.

An exploratory question is one where you try to find a pattern or a relationship between variables, i.e. you aim to generate a hypothesis. At this stage, you do not test the hypothesis; you are merely generating one. More generally, you are proposing a hypothesis that could hold in a new sample from the population.

An inferential question restates the proposed hypothesis in the form of a question that can be answered by analyzing the data. You are validating the hypothesis, i.e. asking whether the observed pattern holds beyond the data at hand. Most statistical problems are inferential problems.
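To make the inferential question concrete, here is a small sketch (standard library only, with made-up conversion numbers) of a permutation test asking whether an observed difference between two groups plausibly holds beyond the sample at hand:

```python
import random

# Permutation test: could the observed mean difference between two groups
# have arisen by chance if group labels were irrelevant?
def perm_test(a, b, n_iter=5000, seed=0):
    rng = random.Random(seed)
    observed = sum(a) / len(a) - sum(b) / len(b)
    pooled = a + b
    extreme = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)           # randomly reassign group labels
        pa, pb = pooled[:len(a)], pooled[len(a):]
        diff = sum(pa) / len(pa) - sum(pb) / len(pb)
        if abs(diff) >= abs(observed):
            extreme += 1
    return extreme / n_iter           # approximate two-sided p-value

group_a = [12, 14, 11, 15, 13, 16]    # e.g. daily conversions, new layout
group_b = [9, 8, 10, 7, 9, 8]         # e.g. daily conversions, old layout
p = perm_test(group_a, group_b)
```

A small p-value suggests the pattern is unlikely to be an artifact of this particular sample, which is exactly the step that separates an inferential answer from a merely exploratory one.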

 

A predictive question would be one where you predict the outcome for a specific instance.

 

A causal question: Unlike predictive and inferential questions, causal questions relate to averages in a population, i.e., how changing the average of one measurement would affect the average of another. Causal questions apply to data in randomized trials and statistical experiments, where you try to understand the cause behind an observed effect by designing a controlled experiment and changing one factor at a time.

 

A mechanistic question asks what is the mechanism behind an observation i.e. how a change of one measurement always and exclusively leads to a deterministic behaviour in another. Mechanistic questions apply typically to engineering situations.
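The first three question types can be illustrated with a small, self-contained sketch in Python. The dataset and numbers below are hypothetical, chosen only to show how a descriptive summary differs from an exploratory observation and an inferential (permutation) test:

```python
import random
import statistics

# Hypothetical data: daily servings of fresh fruits and vegetables
# for two groups of people (illustrative numbers only).
group_a = [3, 5, 4, 6, 5, 4, 7, 5]
group_b = [2, 3, 4, 3, 2, 4, 3, 3]

# Descriptive question: summarize a characteristic, no interpretation.
mean_a = statistics.mean(group_a)
mean_b = statistics.mean(group_b)
print(f"Mean servings: A={mean_a:.2f}, B={mean_b:.2f}")

# Exploratory question: notice a pattern (group A seems higher) and
# propose a hypothesis -- without testing it yet.
observed_diff = mean_a - mean_b

# Inferential question: would this pattern hold beyond the data at hand?
# A simple permutation test: shuffle the group labels many times and see
# how often a difference at least this large arises by chance alone.
random.seed(0)
pooled = group_a + group_b
n_a = len(group_a)
trials = 10_000
extreme = 0
for _ in range(trials):
    random.shuffle(pooled)
    diff = statistics.mean(pooled[:n_a]) - statistics.mean(pooled[n_a:])
    if diff >= observed_diff:
        extreme += 1
p_value = extreme / trials
print(f"Permutation p-value: {p_value:.4f}")
```

A small p-value here suggests the difference is unlikely to be a sampling accident, which is exactly the step from an exploratory observation to an inferential claim.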

Implications for Data Science

The six-questions framework raises awareness of which question is being asked and aims to reduce confusion in discussions and in the media. It also provides a single framework for relating statistics problems to data science.

Of the six questions:

  • Descriptive and exploratory techniques are often considered together
  • Predictive and inferential questions can also be combined

So we could consider four questions:

  • Exploratory
  • Inferential
  • Causal
  • Mechanistic

 

Why does this framework matter?

Because it surfaces types of questions that may not have been considered before.

 

Conclusion

The six questions bring rigor and simplicity to analysis. I also find that they form a comprehensive set of questions linking statistics to data science. They help you think beyond the norm, i.e., beyond the problems you routinely encounter, to consider the full range of possible questions.

 

References

Jeffrey T. Leek and Roger D. Peng, “What is the question?”, Science, 2015.

Image source: Pixabay

GitHub Removes All Non-Essential Cookies

MMS Founder
MMS Bruno Couriol

Article originally posted on InfoQ. Visit InfoQ

GitHub recently announced that it has removed all cookie banners from GitHub.com. GitHub additionally commits to using only cookies that are essential to serving GitHub.com going forward.

By Bruno Couriol

Presentation: Continuous Resilience

MMS Founder
MMS Adrian Cockcroft

Article originally posted on InfoQ. Visit InfoQ

Adrian Cockcroft talks about how to build robust systems by being more systematic about hazard analysis, and by including the operator experience in the hazard model.

By Adrian Cockcroft

Presentation: User Stories: Re-Explained – You Think You Know until You Realise You Don't

MMS Founder
MMS Antony Marcano

Article originally posted on InfoQ. Visit InfoQ

Antony Marcano discusses using User Stories, and spotting tasks and features in disguise, to release more value sooner, with more flexibility and without dependencies.

By Antony Marcano

AWS Introduces Batch Support for AWS Fargate

MMS Founder
MMS Steef-Jan Wiggers

Article originally posted on InfoQ. Visit InfoQ

During the first week of the annual re:Invent, AWS introduced the ability to specify AWS Fargate as a compute resource for AWS Batch jobs. With AWS Batch support for AWS Fargate, customers have a way to run jobs on serverless compute resources, fully managed from job submission to completion.
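A minimal sketch of what this looks like in practice, using boto3: the job definition below declares FARGATE as a platform capability and sizes the container through resourceRequirements. The job definition name, container image, and role ARN are hypothetical placeholders, not values from the announcement.

```python
import json

# Hypothetical AWS Batch job definition targeting Fargate.
job_definition = {
    "jobDefinitionName": "fargate-hello",     # hypothetical name
    "type": "container",
    "platformCapabilities": ["FARGATE"],      # run on Fargate rather than EC2
    "containerProperties": {
        "image": "public.ecr.aws/amazonlinux/amazonlinux:latest",
        "command": ["echo", "hello from Fargate"],
        # Fargate jobs are sized via resourceRequirements.
        "resourceRequirements": [
            {"type": "VCPU", "value": "0.25"},
            {"type": "MEMORY", "value": "512"},
        ],
        # Fargate tasks need an execution role to pull images and write logs
        # (placeholder ARN below).
        "executionRoleArn": "arn:aws:iam::123456789012:role/ecsTaskExecutionRole",
        "networkConfiguration": {"assignPublicIp": "ENABLED"},
    },
}

print(json.dumps(job_definition, indent=2))

# With AWS credentials configured, registering it would look like:
#   import boto3
#   batch = boto3.client("batch")
#   batch.register_job_definition(**job_definition)
```

After registration, jobs submitted against this definition would run on serverless Fargate capacity instead of a managed EC2 compute environment.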

By Steef-Jan Wiggers
