Future of Data Analysis: Taking Visualizations to the Next Level

Written by JC Gonzalez

There is no denying the future of data analysis is bright. The industry, valued at $23 billion in 2019, is expected to reach $132 billion by 2026. Advances in technology promise to finally move meetings away from static spreadsheet presentations and two-dimensional bar graphs. Let’s take a look at where data analytics is today and where it is heading.

https://www.youtube.com/watch?v=RrgrLFZD7jE

Before we explore the future, we’ll take a look at how data analytics began and how it got to where it is today in this era of big data. We will finish with examples of companies already pushing data visualization further so you can see what is possible. You may find yourself looking to take your company’s data out of the spreadsheet and into your shared space. 

The Beginning

John Tukey

The year is 1962. John Tukey helps give birth to the field of data analytics when he publishes “The Future of Data Analysis.” In it, Tukey predicts the role technology will play as the field – and our world – evolves.

John Tukey photographed for Life Magazine. Tukey is credited with starting the field of data analytics.

Who Will Use Data?

Tukey writes of the importance of the average person being able to interpret data, and of automation as necessary to help people who are not experts:

Most data analysis is going to be done by people who are not sophisticated data analysts and who have very limited time; if you do not provide them tools the data will be even less studied. Properly automated tools are the easiest to use for a man with a computer.

He follows that statement by addressing the needs of the experts: 

If sophisticated data analysts are to gain in depth and power, they must have both the time and the stimulation to try out new procedures of analysis; hence the known procedure must be made as easy for them to apply as possible. Again, automation is called for.

Tukey understands that different people need different things from their technology and that the majority of people using data in their decisions would not be data analysis experts. Somewhere along the way, as we collected more data, some companies forgot that.

This is why we now have companies with incredible amounts of data they don’t know what to do with and often don’t even see. It is estimated that up to 73% of data collected goes unused for analytical purposes. 

How Will We Teach Data Analysis?

Tukey also speaks of a new way of teaching data analytics: “We would teach it like biochemistry, with emphasis on what we have learned, with some class discussion of how such things were learned perhaps, but with relegation of all questions of detailed methods to the ‘laboratory work.’” 

Tukey believed you should first learn what the data tells you and then learn how to get to that point through practice. It’s a learn-by-doing approach that many schools refused to accept until, ironically, the data showed them the benefits of this style of learning. 

In his work, Tukey explains that the best way of teaching data analysis is what we now call learning by doing.

Tukey closes by challenging the 1962 reader as follows: 

The future of data analysis can involve great progress, the overcoming of real difficulties, and the provision of a great service to all fields of science and technology. Will it? That remains to us, to our willingness to take up the rocky road of real problems in preference to the smooth road of unreal assumptions, arbitrary criteria, and abstract results without real attachments.

International Association for Statistical Computing (IASC)

The International Association for Statistical Computing (IASC) was founded in 1977. Its goal is to “promote the theory, methods, and practice of statistical computing” and to foster interest in the field.

The IASC gave Tukey’s claims a higher level of validity. The association’s existence also reinforced early data analysis theories from pioneers in the field like Peter Naur. 

In 1974, Naur published the book Concise Survey of Computer Methods, in which he outlines data processing methods used in a wide range of applications.

Knowledge Discovery in Databases (KDD) and Data Mining

The Knowledge Discovery in Databases process, or KDD, was developed by 1989. The term “data mining” is often used in addition to or in place of KDD, but data mining is actually just one step in the KDD process.

The KDD process comprises the five steps that take data analysis from raw data to the point where people can make decisions based on what the data shows.

The steps of the KDD process are as follows, with a short code sketch after the list:

  1. Selection: This is when you identify the target data and the variables used to assess it. 
  2. Pre-processing: This crucial step is data cleanup. The goal is to identify faulty or mismatched data here to correct it before it distorts results. 
  3. Transformation: At this point, analysts organize, sort, and consolidate the data into a common format suitable for mining.
  4. Data mining: This is when you apply analytical methods to the prepared data to extract patterns, which are often presented as graphs and charts.
  5. Interpretation or evaluation: Decision-makers interpret the results and determine a path forward.
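
As a concrete illustration, here is a minimal Python sketch of the five KDD steps using pandas. The file name and column names are hypothetical, and a real pipeline would be far more involved at every step.

```python
import pandas as pd

# 1. Selection: load the raw data and pick the target variables
#    ("sales.csv" and its columns are hypothetical).
df = pd.read_csv("sales.csv", usecols=["region", "units", "revenue"])

# 2. Pre-processing: identify and remove faulty or mismatched data.
df = df.dropna()            # drop incomplete records
df = df[df["units"] >= 0]   # drop impossible values

# 3. Transformation: unify types and reshape for analysis.
df["revenue"] = df["revenue"].astype(float)
by_region = df.groupby("region")["revenue"].sum()

# 4. Data mining: extract a pattern -- here, the top-revenue regions.
top_regions = by_region.sort_values(ascending=False).head(3)

# 5. Interpretation/evaluation: present the result to decision-makers.
print(top_regions)
```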

International Federation of Classification Societies (IFCS)

Founded in 1985, the International Federation of Classification Societies (IFCS) is a “non-profit, non-political scientific organization” focused on classification research. Among other activities, the IFCS supports the journal Advances in Data Analysis and Classification (ADAC).

The ADAC publishes articles on structural, quantitative, or statistical approaches for the analysis of data. It also reviews strategies for mining large data sets, methods for knowledge extraction from data, and applications of advanced methods in the field. 

BusinessWeek Publishes “Database Marketing”

BusinessWeek’s 1994 cover story “Database Marketing” detailed the data companies were collecting.

In 1994, BusinessWeek magazine published the cover story “Database Marketing.” The article details how companies were collecting massive amounts of data on competitors and customers. The issue behind the story was that the compiled data actually overwhelmed most companies.

Developments of the 2000s

The ’90s marked the point when companies started to collect data at scale. As a result, many of the products released in the 2000s, right up to today, have data collection built in as a primary function. Products released in the first decade of the 2000s included the following:

  • Google AdWords (2000)
  • Android (founded in 2003; OS released in 2008)
  • Facebook (2004)
  • YouTube (2005)
  • Amazon Web Services (2006)
  • Apple iPhone (2007)
  • Amazon Kindle (2007)

Developments of the 2010s and Big Data

Enter the 2010s. By 2010, the world had created 2 zettabytes of data. By 2019, annual data volumes reached 41 zettabytes. The era of big data had arrived.
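
For a sense of scale, that jump works out to roughly 40% compound annual growth. A quick back-of-the-envelope calculation using the two figures above:

```python
# Compound annual growth rate (CAGR) implied by the jump from
# 2 ZB in 2010 to 41 ZB in 2019 (nine years of growth).
start_zb, end_zb, years = 2, 41, 9
cagr = (end_zb / start_zb) ** (1 / years) - 1
print(f"{cagr:.1%}")  # -> 39.9%
```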

Product development continues the trend of data collection. The difference is that by this point, data collection products take the form of software or applications, while hardware products support specific applications. For example, the Amazon Echo is the device that supports the Alexa digital assistant.

The future of data analysis depends heavily on artificial intelligence to make sense of the large amounts of data collected today.

Leading the Wave of Change

Artificial Intelligence

With so much data today, it would be nearly impossible to make sense of it on our own. AI and machine learning are vital to interpreting data. Take Excel, for example. The ubiquitous spreadsheet application comes equipped with AI features now available to the average person. 

Here are three examples of AI in Microsoft Excel (Office 365 version):

  • Flash Fill: Flash Fill recognizes patterns in data as you enter it. Enter a few values in a column, and if you agree with Excel’s suggestion, hit Enter to accept the remaining values. The limitation of Flash Fill is that it fills in values, not formulas: if you go back and change the source data afterward, the filled values stay the same (see the toy sketch after this list).
  • Column from Examples: This is a command you give Excel. When you activate it, you get a new column. In the top cell, you give an example of the output you want. Excel then creates a formula based on your example and applies it to the entire column. This means the column remains dynamic after you accept it: if you change data in a related cell, the new column updates accordingly.
  • Analyze Data: This is Excel’s AI showing off what it can now do. Once your data is in place, one click gives you a sidebar with a long list of reports Excel creates by analyzing your data. It even presents these reports in the chart types it believes best fit the data.
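
Microsoft’s Flash Fill is built on sophisticated program-synthesis research, but the core intuition can be sketched in a few lines of Python. The toy below is not Microsoft’s algorithm, and the function and data are invented for illustration; it infers a transformation from a single example and then applies it to the rest of a column, filling in values rather than formulas, just like Flash Fill:

```python
# Toy illustration of the Flash Fill idea: infer a simple
# transformation from one example, then apply it to the rest
# of the column. (Invented for illustration.)
def infer_token_pick(example_in: str, example_out: str):
    """Learn which whitespace-separated token of the input
    the example output corresponds to."""
    index = example_in.split().index(example_out)
    return lambda s: s.split()[index]

names = ["Ada Lovelace", "Alan Turing", "Grace Hopper"]
first = infer_token_pick(names[0], "Ada")   # learn from one example
print([first(n) for n in names])            # ['Ada', 'Alan', 'Grace']
```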

Almost every industry you interact with today uses AI. The era of big data would not be possible without help from computers that can make sense of it all and get it to a more manageable place.

The future of data analysis includes bringing data to the average person. Cars are using augmented reality to present precise data to drivers.

Augmented Reality

Some believe we are more likely to live in a world where AR is the norm while VR is used at specific times based on need. Rather than disappearing into a virtual world, we can interact with AR in the space we already navigate, with the added benefit of not running into a wall.

The best example of data in AR is probably in cars. Automakers like Mercedes-Benz are adding performance metrics to an AR dashboard. Directions to your destination and key road information, along with vitals such as speed, fuel, and mileage, can now be displayed at eye level. It’s real-time data exactly where you need it.

One thing to consider is that when it comes to data, the environment where it is seen matters. AR is designed to present digital information over a real space using your phone, tablet, or AR glasses. The trouble is, not all data plays well with real space.

Presenting data graphs in AR can get confusing because the data points float over our real environment. One way to minimize confusion is to have the data appear only when a tablet or phone is held over a set surface, like a conference room table.

One way to solve real-space limitations is to define a surface as the AR space. For example, a table in your main conference room can be the place where AR data is always presented. In this way, you can interact with the data, but it isn’t floating in thin air or getting lost against a busy office or window backdrop.

Virtual Reality

Data visualization, even when carefully organized, can be daunting. A single presentation can include several graphs with many plot points and variables. While AR allows you to remain in your natural space, data visualization in today’s world of big data may require a deeper level of immersion.

Virtual reality brings you into a controlled environment. You now have a captive audience blind to any distractions outside the presentation. In this space, you can focus all attention on the data. 

Since you can now walk around, go inside, float above, or even shrink the data so that it fits in your hand, you can add multiple dimensions to it. A single bar graph can now include several comparison points based on its color, brightness, height, width, and proximity to other bars. 
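
As a flat analogue of that idea, here is a minimal matplotlib sketch that loads a single bar chart with three comparison points at once: height, width, and color each encode a different measure. The product names and numbers are invented; a VR tool would extend the same mapping to depth, brightness, and position.

```python
import matplotlib.pyplot as plt
import numpy as np

# Invented example data: three measures for four products.
products = ["A", "B", "C", "D"]
revenue = np.array([40, 65, 30, 80])        # encoded as bar height
volume  = np.array([0.4, 0.9, 0.5, 0.7])    # encoded as bar width
margin  = np.array([0.2, 0.5, 0.8, 0.35])   # encoded as bar color

colors = plt.cm.viridis(margin)  # map margin onto a colormap
plt.bar(range(len(products)), revenue, width=volume, color=colors)
plt.xticks(range(len(products)), products)
plt.xlabel("Product")
plt.ylabel("Revenue")
plt.title("Height = revenue, width = volume, color = margin")
plt.show()
```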

The future of data analysis seems to be fully immersive. This complete VR data presentation was created within D6 VR.
Courtesy of D6 VR

Another benefit is the ability to use the same presentation when presenting to different parties. Is this an executive overview? No need to go into the individual data points; you can zoom out of the data and show an overview.

If the presentation is for the sales staff, you can dive deeper to show top products and where they sold well. Salespeople can then determine the areas requiring an additional push or areas better served by a different product category.

The Future is Immersive

Between the amounts of data being compiled and analyzed by AI and the advancements in XR, it’s clear the future of data visualization is immersive. The technology many dismissed as a mere gaming novelty is now showing everyone its professional possibilities.

Acceptance of VR

If one positive can be drawn from the ways in which COVID-19 changed our world, it is the acceptance of remote work. In the earliest days of companies working remotely, employees and managers used communication tools like Slack and Microsoft Teams. 

Zoom fatigue became a very real condition as the pandemic forced all meetings into a desktop space.

Businesses quickly realized chat messages would not be enough to keep projects going, and the world shifted to video meetings. Zoom became synonymous with “meeting,” and later, “Zoom fatigue” became a widely reported condition.

As social beings, we thrive when we are in the same space as others. Sitting in front of a computer screen is not the same, even if we can see the other person. Virtual reality lets you once again use certain body cues: hand gestures, turning to face the speaker, and even respecting others’ personal space.

Training in VR

Embracing gamification, companies have noticed the benefits of training in VR. The classic VR training experience is the flight simulator, but why stop there? Today, companies are using VR to teach employees soft skills, customer service, auto manufacturing, law enforcement, firefighting, and much more. We have covered several companies already using VR in training in a different blog post you can check out here.

The future of data analysis in VR provides trainers with engagement data not available with other methods. One example is eye tracking.

A key benefit of training in VR for the companies that adopt it is the data they receive from it. When training on a computer, all you know is how long your employees spend on each session. With VR, however, employers have access to far richer engagement metrics.

Thanks to VR eye tracking, employers can get data on what elements of the training held employees’ attention longer. Another metric employers can track is body movement. If the training involves employees learning physical processes, body movement is key. 
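
As a sketch of what such a metric might look like under the hood, here is a minimal Python example that turns timestamped gaze samples into dwell time per training element. The data format and element names are hypothetical, not from any particular headset vendor.

```python
from collections import defaultdict

# Hypothetical gaze log: (seconds_since_start, element_looked_at),
# sampled at a fixed interval by the headset's eye tracker.
gaze_samples = [
    (0.0, "safety_panel"), (0.5, "safety_panel"),
    (1.0, "valve"),        (1.5, "valve"),
    (2.0, "valve"),        (2.5, "valve"),
]
SAMPLE_INTERVAL = 0.5  # seconds between samples

# Accumulate total dwell time per element.
dwell = defaultdict(float)
for _, element in gaze_samples:
    dwell[element] += SAMPLE_INTERVAL

# Report elements in order of attention held.
for element, seconds in sorted(dwell.items(), key=lambda kv: -kv[1]):
    print(f"{element}: {seconds:.1f}s")  # valve: 2.0s, safety_panel: 1.0s
```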

Meetings in VR

The next step for VR in the professional space is meetings. Companies built to be remote are more likely to adopt VR meetings early. Some Edstutia meetings already take place inside the Edstutia virtual campus.

If your company is eager to get everyone back in the office and continues to use statements like “back to normal,” adopting VR may take a little longer. This decision may prove detrimental in the long run. Robert Lambrechts, chief creative officer at ad agency Pereira O’Dell, said the following: “I don’t think there’s ‘going back’ to anything. That world, whatever we did in January 2020, doesn’t exist anymore.”

As more companies take their meetings into virtual reality spaces, presenting data in VR will become the norm.

The move to VR meetings is only starting, but it is not going away. As Web 3 technology develops, moving into virtual spaces for professional purposes will become more commonplace.

Immersive Data Analytics Businesses

D6 VR

Founded in 2017, D6 VR offers an immersive environment in which to present data, plus all the tools necessary to create full 3D presentations in that space. Their mission is to apply cutting-edge technologies such as VR/AR and AI to revolutionize the way we visualize, present, and collaborate.

Si Yang, general manager at D6 VR, said the following: “D6 is an innovative and inevitable next-gen solution to visualize and tell your data story beyond the limits of reality. It is your secret weapon to win customers and effectively deliver your message to your organization leaders like never before.”

Aroaro

Aroaro defines immersive analytics as the use of engaging, embodied analysis tools to support data understanding and decision making. Their mission is to develop a better immersive world to reimagine data-driven innovation.

Aroaro is one of the many companies working on the future of data analysis.

Immersion Analytics

Immersion Analytics began as a search for an effective way to make sense of relationships between more than a few columns of data. They’ve developed a technology that enables visualization of up to 18 dimensions for each data point on a single graph. 

BadVR

Located in Los Angeles, California, BadVR looks to empower people of all technical skill levels by creating the tools they need to make the right decisions. The company is dedicated to “revolutionizing the field of data analytics by building an immersive platform that makes it easy for everyone to work with data.”

Virtualitics

Focused on the AI necessary to create meaningful data reports, Virtualitics provides an “advanced analytics solution that empowers everyone with faster ready-to-use AI that can be understood–by analysts and business leaders alike.”

Virtualitics is another example of companies working on the future of data analytics, providing an analytics solution driven by ready-to-use AI.

Future of Data Analysis Wrap-Up

Data analysis can get overwhelming. Each year, companies collect more data than the year before, with no sign of the pattern slowing. One way to ease the complexity of interpreting and presenting data is to take it off the 2-D page. The future of data analysis is immersive. Edstutia’s upcoming track on Integrated Marketing & Analytics will highlight immersive data visualization in marketing.


In order for your team to move toward immersive data analysis, they must understand what it’s like to work, present, and teach in VR. Edstutia’s VR certification can give employees the confidence to take a company into Web 3 and take advantage of its benefits. Contact us to learn more about upcoming modules offered on the Edstutia virtual campus.