Table of contents
- What is data analytics?
- Types of data analytics
- 7 key models for data analytics
- Automation and artificial intelligence: Key maturity stages
- Top 10 best practices for data analytics in 2023
Data analytics is defined as the capability to apply quantitative analysis and technologies to data to find trends and solve problems. As volumes of data grow exponentially, data analytics allows enterprises to analyze data to improve and expedite decision-making.
Within the technical and business realms, however, "data analytics" has taken on a narrower and more specific meaning. It has come to describe the newer, algorithmic analysis of "big" and often unstructured datasets that go beyond, for example, the financial and entity-based business records that have long informed traditional business intelligence (BI) and analysis.
A recent International Data Corporation (IDC) survey found that companies that best use digital analytics tools and processes see business outcome improvements that average 2.5 times those for lagging organizations for six of 12 top business outcomes studied. Not surprisingly, IDC also reports that enterprises spend heavily on their big data and analytics capabilities, finding that global spending, broadly defined, reached $215.7 billion in 2021.
What is data analytics?
Data analytics tends to be predictive, and it enables many new capabilities, including the iterative refinement of algorithms for the machine learning (ML) that drives much artificial intelligence (AI). It is also significantly augmenting BI and decision-making across organizations.
Companies are bringing in data managers, setting new policies and using solutions like Snowflake to collect huge amounts of information (structured, semi-structured or unstructured) flowing in from sources within and beyond their organizations.
The goal is to drive value from these growing volumes of data, but collection alone is not enough to do that. Data is often compiled in a raw form (tables, graphs, log files) that doesn’t provide any value without processing. This is where data analytics comes in. Raw data collected from various sources is analyzed to pull out insights that are useful to companies and can help drive critical business decisions.
Data analytics is usually performed by data analysts (and sometimes data analytics engineers). They look at the entire jigsaw puzzle of data, make sense of it (through cleaning, transforming, modeling) and eventually identify relevant patterns and insights for use by the company. They may also create dashboards and reports that less technically trained business analysts use in their work. (In larger organizations, data engineers and data analytics engineers may assemble and support the data systems used by these analysts.)
Data analytics is widely applied within the healthcare sector, for example. Large amounts of actual patient data are compiled and crunched to identify:
- The frequency of medical diagnoses and treatments and procedures
- The efficacy of such treatments and procedures
- The profitability of treatments and procedures by demographics, region and type of facility
For each area studied, findings may be generated to:
- Describe the past
- Predict the future
- Recommend approaches for optimizing outcomes
Types of data analytics
Depending on the level of implementation, data analytics can be classified into four types:
1. Descriptive analytics
Descriptive analytics enables organizations to understand their past. It gathers and visualizes historical data to answer such questions as “what happened?” and “how many?” This gives enterprise users a way to measure the result of decisions that have already been made at the organizational level.
2. Diagnostic analytics
While descriptive analytics provides a baseline of what has happened, diagnostic analytics goes a step further and explains why it happened. It explores historical data points to identify patterns and dependencies among variables that could explain a particular outcome.
3. Predictive analytics
Predictive analytics builds on the findings of descriptive and diagnostic analytics to tell what is likely to happen in the future. For example, predictive analysts can use historical trends to forecast what might be the business outcome of increasing the price of a product by 30%. It largely involves predictive modeling, statistics, data mining and advanced analysis.
4. Prescriptive analytics
Prescriptive analytics, as the name suggests, goes one step further and uses machine learning to empower enterprises with suitable recommendations to drive desired results. It can help a company operate more efficiently, increase sales and drive more revenue.
For example, these types of analytics could be deployed in a corporate finance department in the following ways:
- Descriptive analytics (also known in this context as “business intelligence”) might inform internal monthly and quarterly reports of sales and profitability for divisions, product lines, geographic regions, etc.
- Diagnostic analytics might dissect the impacts of currency exchange, local economics and local taxes on results by geographic region.
- Predictive analytics could incorporate forecasted economic and market-demand data by product line and region to predict sales for the next month or quarter.
- Prescriptive analytics could then generate recommendations for relative investments in production and advertising budgets by product line and region for the coming month or quarter.
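The first two of these layers, descriptive and (naive) predictive, can be sketched in a few lines of code. The quarterly sales figures below are hypothetical, and the forecast method (projecting the average quarter-over-quarter change) is deliberately simplistic; real predictive analytics would use proper statistical or ML models.

```python
# Hypothetical quarterly sales figures for one product line (illustrative only)
sales = {"Q1": 120_000, "Q2": 135_000, "Q3": 128_000, "Q4": 150_000}

# Descriptive analytics: summarize what happened
total = sum(sales.values())
average = total / len(sales)
best_quarter = max(sales, key=sales.get)

# Naive predictive analytics: project the next quarter from the
# average quarter-over-quarter change in the historical data
values = list(sales.values())
changes = [b - a for a, b in zip(values, values[1:])]
forecast = values[-1] + sum(changes) / len(changes)

print(f"Total: {total}, average: {average:.0f}, best quarter: {best_quarter}")
print(f"Naive forecast for next quarter: {forecast:.0f}")
```

Diagnostic and prescriptive analytics would then ask, respectively, *why* Q3 dipped and *what to do* about next quarter's budget.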
7 key models for data analytics
When it comes to actually analyzing data to identify trends and patterns, analysts can use multiple models. Each one works differently and each provides insights for better decision-making.
- Regression analysis: This model determines the relationship between a given set of variables (dependent and independent) to identify crucial trends and patterns between them. For example, an analyst can use the technique to correlate social spending (an independent variable) with sales revenue (a dependent variable) and understand what the impact of social investments on sales has been so far. This information can ultimately help management make decisions regarding social investments.
- Monte Carlo simulation: Also known as multiple probability simulation, a Monte Carlo simulation estimates the possible outcomes of an uncertain event. It provides enterprise users with a range of possible outcomes and the likelihood of each one happening. Many organizations use this mathematical method for risk analysis.
- Factor analysis: This technique involves taking a mass of data and shrinking it to a smaller size that is more manageable and understandable. Organizations often reduce variables by extracting all their commonalities into a smaller number of factors. This helps uncover previously hidden patterns and shows how those patterns overlap.
- Cohort analysis: Under cohort analysis, instead of inspecting data as a whole, analysts break it down into related groups for analysis over time. These groups usually share some common characteristics or experiences within a defined timespan.
- Cluster analysis: Cluster analysis involves grouping data into clusters in such a way that items within a cluster are similar to one another but dissimilar to those in other clusters. It provides insight into data distribution and can easily help reveal patterns behind anomalies. For instance, an insurance company can use the technique to determine why more claims are associated with certain specific locations.
- Time-series analysis: Time-series analysis studies the characteristics of a variable with respect to time, and identifies trends that could help predict its future behavior. Imagine analyzing sales figures to predict where the numbers will go in the next quarter.
- Sentiment analysis: This technique identifies the emotional tone behind a dataset, helping organizations identify opinions about a product, service or idea.
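Two of the models above, regression and Monte Carlo simulation, can be sketched with nothing beyond the standard library. The spending and revenue figures here are hypothetical, chosen only to show the mechanics: an ordinary-least-squares fit with one independent variable, followed by a simulation that propagates uncertainty in spending through the fitted model.

```python
import random
import statistics

# --- Regression analysis (ordinary least squares, one independent variable) ---
# Hypothetical monthly figures: social spending (x) vs. sales revenue (y), in $k
x = [10, 20, 30, 40, 50]
y = [120, 150, 195, 210, 260]

n = len(x)
mean_x, mean_y = sum(x) / n, sum(y) / n
slope = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
         / sum((xi - mean_x) ** 2 for xi in x))
intercept = mean_y - slope * mean_x
print(f"revenue ~= {slope:.2f} * spending + {intercept:.2f}")

# --- Monte Carlo simulation ---
# Estimate the range of next-month revenue when spending itself is uncertain
random.seed(42)
simulated = []
for _ in range(10_000):
    spend = random.gauss(35, 8)    # uncertain spending: mean 35, std dev 8
    noise = random.gauss(0, 12)    # residual business noise around the fit
    simulated.append(slope * spend + intercept + noise)

simulated.sort()
low, high = simulated[250], simulated[-251]   # roughly a 95% interval
print(f"Expected revenue: {statistics.mean(simulated):.1f} "
      f"(95% range {low:.1f} to {high:.1f})")
```

The regression gives a point estimate of the relationship; the Monte Carlo step turns that point estimate into a distribution of outcomes, which is what makes it useful for risk analysis.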
Automation and artificial intelligence: Key maturity stages
While most organizations realize the value of data analytics, many have yet to achieve full implementation maturity. To help understand this, Gartner has detailed five levels in its maturity model for data and analytics.
- Basic: This is the initial stage of maturity, where data and analytics efforts are managed in silos, focusing largely on backward-looking events (e.g., last quarter's revenue) using transactional data and logs. Analytical processes are performed on an ad hoc basis, with little to no automation and governance, and analysts must wrangle spreadsheets and large volumes of raw information.
- Opportunistic: At this level, organizations begin to focus on meeting broader information-availability requirements for business units (departmental marts) and setting up parameters to ensure data quality. However, all these efforts remain in silos and are affected by culture, lack of suitable leadership, organizational barriers and slow proliferation of tools. The data strategy also lacks business relevance.
- Systematic: In organizations at this third stage, executives become data and analytics champions. They bring a clear strategy and vision to the table and focus on agile delivery. As part of this, data warehousing and business intelligence (BI) capabilities are adopted, leading to more central data handling. However, even at this level, data is not a key business priority.
- Differentiating: At this stage, data starts becoming a strategic asset. It is linked across business units, serving as an indispensable fuel for performance and innovation. A CDO (chief data officer) leads the entire analytical effort, measuring ROI, while executives champion and communicate best practices. Notably, the system still carries governance gaps, and AI/ML’s use is limited.
- Transformational: An organization at the transformational level has implemented data and analytics as a core part of its business strategy, with deeper integration of AI/ML. Data also influences the organization’s key business investments.
According to former Gartner VP and analyst Nick Heudecker, “Organizations at transformational levels of maturity enjoy increased agility, better integration with partners and suppliers, and easier use of advanced predictive and prescriptive forms of analytics. This all translates to competitive advantage and differentiation.”
Additionally, through multiple 2022 surveys, IDC has charted organizations’ data analytics capabilities and benefits within a four-stage maturity model.
Top 10 best practices for data analytics in 2023
Focus on these best practices to implement a successful analytics project:
1. Improve how people and processes are coordinated
Before bringing in novel tools and technologies for analytics, you should focus on better coordinating people and processes within your organization. Part of this is breaking down silos and promoting a culture where data is central to business goals and readily accessible. There should be a single source of truth and no fighting over information.
2. Start small with a clear objective
After coordinating people and processes, you should determine what you want to achieve with the available information. There can be multiple goals, but prioritizing is important to make sure resources are deployed in the best possible way, for maximum ROI. Also, with a clear goal, users can steer clear of data types and tools that are not needed.
3. Audit critical capabilities
Organizations should also conduct an audit of analytics-critical capabilities, including: the ability to measure performance metrics as per set goals, the ability to create predictive models, and the quality and completeness of the data needed.
4. Focus on scalability
When selecting a data analytics tool, make sure to consider scalability. This will ensure that your tool continues to deliver even when your data volumes, depth of analysis and number of concurrent users grow exponentially.
5. Tie in compliance
It’s also important to connect compliance with data analytics. This can help you make sure your users are following government rules and industry-specific security standards when dealing with confidential business information.
6. Refine models
Since business data is continuously changing, the models used to analyze the information should also be refined over time. This way, a company can make sure to keep up with the dynamic market environment.
7. Standardize reporting
Focus on standardizing report-producing tools across the organization. This ensures that the reports and visualizations produced after analysis look consistent to all users, regardless of their department. Multiple reporting formats often lead to confusion and incorrect interpretation.
8. Data storytelling
While visualizations can provide sufficient information, organizations should also focus on making things more accessible through data storytelling. This can help every business user, including those who don’t have analytical skills, use insights for decision-making. Tableau is one vendor providing data storytelling capabilities for analytics consumption.
9. Set up training and upskilling
To drive maximum value from data, maintain your data culture across the organization. You can do this through two-way communication, and through educating employees about data's value and how they can use it to drive better results.
10. Monitor model performance
Data can get stale over time, leading to degradation in a model's performance. This can be avoided if the organization monitors that performance on a regular basis. To exploit current capabilities and maintain competitiveness, however, this increasingly requires systems and support from your enterprise's data science and data and AI engineering teams.
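A minimal version of such monitoring compares recent prediction errors against the errors measured at deployment time and flags the model for retraining when they diverge. The error values, metric (mean absolute error) and threshold below are all hypothetical; production systems would use metrics and alerting suited to the specific model.

```python
import statistics

def check_model_drift(baseline_errors, recent_errors, tolerance=1.5):
    """Flag a model for retraining when its recent mean absolute error
    exceeds the baseline MAE by more than `tolerance` times.
    (Metric and threshold are illustrative, not prescriptive.)"""
    baseline_mae = statistics.mean(abs(e) for e in baseline_errors)
    recent_mae = statistics.mean(abs(e) for e in recent_errors)
    return recent_mae > tolerance * baseline_mae, recent_mae

# Hypothetical forecast errors (actual minus predicted)
baseline = [2.0, -1.5, 1.0, -2.5, 2.0]   # errors measured at deployment
recent = [5.0, -6.5, 7.0, -4.5, 6.0]     # errors observed last month

drifted, mae = check_model_drift(baseline, recent)
print(f"Recent MAE: {mae:.2f} -- retrain: {drifted}")
```

Scheduled as a recurring job, a check like this gives the data science team an early signal that stale data is eroding model quality.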
VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact.