
Two Megatrends Driving the Coming Data Tsunami (Are You Ready?)

Tim Morrow  |  
Oct 29, 2019
Data is being generated at an unprecedented rate. According to a 2016 report from IBM, "90% of the data in the world today has been created in the last two years alone" - and that was three years ago. Imagine how much more data exists today.

One definition of a tsunami is "an arrival or occurrence of something in overwhelming quantities or amounts." That describes the current state of the data our organizations produce and have access to, and it’s only going to grow exponentially in the coming years.
The advent and growth of two key megatrends will usher in an even greater flood of data to be harnessed for customer insights and strategic differentiation.

Data Megatrend #1: 5G Mobile Networks

The data traffic generated by smartphones has been rising exponentially over the past several years. However, with the first 5G phones just now hitting the market, the data tsunami is just starting to gather momentum.
Industry estimates suggest that between 2016 and 2022, the traffic generated by smartphones will increase roughly tenfold. By 2022, 5G will still be in its early stages, and its full impact will just be getting started; we likely won’t see the real effect for another five years after that.
Estimates also suggest that by 2020, 80% of our content consumption will be video rather than text. Everything about the way we interact with the digital world will quickly change.

Data Megatrend #2: Internet of Things (IoT)

The other megatrend driving the data tsunami is the explosion of connected devices – commonly referred to as the "Internet of Things" (IoT). In roughly five years, estimates say there will be an average of 10 connected devices per person on this planet. 
Data will come from many places and in many forms. The two most obvious data drivers are mobile devices and connected devices. It’s crucial that this data be turned into useful information and insights.
Traditional databases and data warehouses are not built to support this explosion of data. Implementing a modern cloud-based data warehouse is critical to capitalizing on the insights this data can yield.

Cloud data warehouses have unlimited resources and can scale up or down on demand. Organizations only pay for what they use. There is no need for up-front capacity planning and large infrastructure investments when organizations can make capacity decisions on the fly by utilizing cloud resources.
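The economics behind that pay-for-what-you-use model can be illustrated with a back-of-the-envelope calculation. Here is a minimal Python sketch; the rates and workload figures are purely hypothetical, not quotes from any cloud provider:

```python
# Illustrative comparison: fixed on-premises capacity vs. pay-per-use cloud compute.
# All rates and workload figures below are hypothetical.

HOURS_PER_MONTH = 730

def fixed_capacity_cost(peak_units: int, rate_per_unit_hour: float) -> float:
    """On-prem style: you provision for peak load and pay for it around the clock."""
    return peak_units * rate_per_unit_hour * HOURS_PER_MONTH

def on_demand_cost(hourly_usage: list, rate_per_unit_hour: float) -> float:
    """Cloud style: you pay only for the compute units actually used each hour."""
    return sum(units * rate_per_unit_hour for units in hourly_usage)

# A bursty workload: 8 peak-load hours per day, near-idle the rest.
usage = ([10.0] * 8 + [1.0] * 16) * 30  # 30 days of hourly usage
rate = 2.0  # dollars per compute unit per hour

fixed = fixed_capacity_cost(peak_units=10, rate_per_unit_hour=rate)
elastic = on_demand_cost(usage, rate)
print(f"Fixed capacity: ${fixed:,.0f}/month")
print(f"Pay-per-use:    ${elastic:,.0f}/month")
```

For a bursty workload like this, paying only for hours actually used costs a fraction of provisioning for peak, which is exactly the capacity-planning burden elastic cloud warehouses remove.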

Azure Cloud Migration Project

Skyline recently helped a Fortune 500 franchise migrate their mission-critical Point-of-Sale (POS) and eCommerce websites to Microsoft's Azure Cloud. (Watch our webinar on website Azure migrations)
As a result of this migration, the client now enjoys advanced monitoring and actionable insights into the performance of these critical applications. They receive real-time alerts and notifications when performance falls below certain thresholds, so they can scale up these critical applications when needed to better support their franchise owners.
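The pattern behind that kind of alerting is straightforward to sketch. The following hypothetical Python version shows the core idea only; the metric names, thresholds, and alert messages are illustrative, not the client's actual monitoring configuration:

```python
# Hypothetical threshold-based monitor: compare observed performance metrics
# against minimum acceptable values and raise an alert for any that fall short.

from dataclasses import dataclass

@dataclass
class Threshold:
    metric: str
    min_ok: float  # alert when the observed value drops below this

def evaluate(metrics: dict, thresholds: list) -> list:
    """Return an alert message for every metric that is below its threshold."""
    alerts = []
    for t in thresholds:
        value = metrics.get(t.metric)
        if value is not None and value < t.min_ok:
            alerts.append(f"{t.metric}={value} below {t.min_ok}: scale up")
    return alerts

thresholds = [
    Threshold("requests_per_second", min_ok=500.0),
    Threshold("checkout_success_rate", min_ok=0.99),
]
observed = {"requests_per_second": 340.0, "checkout_success_rate": 0.995}
for alert in evaluate(observed, thresholds):
    print(alert)  # a real system would page on-call or trigger autoscaling here
```

In a production Azure deployment, this evaluation loop is handled by managed monitoring services rather than hand-written code, but the compare-and-act logic is the same.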

Outdated Manual Data Management Practices 

Many companies still rely on outdated, time-consuming approaches to managing their data manually. IDC estimates that 6 billion hours per year are spent working in spreadsheets - roughly 26 hours per data worker per week, about 8 of which go to needlessly repetitive tasks. This translates to approximately $60 billion per year in lost productivity due to manual, repetitive work.
The pervasive use of copy/paste is a key contributor to errors and inefficiencies in data preparation. There is tremendous opportunity to employ modern data management practices that streamline and automate many of these repetitive tasks.

Data Modernization Project

Skyline was recently engaged by a large adhesives company to automate and streamline their U.S. Sales Reporting practices. Our Data Analytics and Data Platform team created a tabular/dimensional model of their U.S. sales data to load daily files into an enterprise data warehouse.

Utilizing SQL Server Integration Services (SSIS), we were able to automatically produce month-end sales reports that typically took a data worker at this organization 8-10 hours to produce each month. We were also able to create custom dashboards in Power BI that helped reveal actionable insights for the business across several critical metrics. (Watch our Power BI Dashboard-in-an-Hour training webinar)
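The client project used SSIS for this, but the heart of the automation, collapsing a month of daily sales records into a single summary, can be sketched in a few lines of Python. The file layout, region names, and amounts below are hypothetical:

```python
# Hypothetical sketch: roll daily sales records up into a month-end summary.
# In the client project this was done with SSIS; the aggregation logic is the same.

from collections import defaultdict

def month_end_summary(daily_rows: list) -> dict:
    """Sum sales amounts by region across all daily records for the month."""
    totals = defaultdict(float)
    for row in daily_rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)

# Each dict stands in for one line of a daily extract file.
daily_rows = [
    {"date": "2019-09-01", "region": "Midwest", "amount": 1250.00},
    {"date": "2019-09-01", "region": "South", "amount": 980.50},
    {"date": "2019-09-02", "region": "Midwest", "amount": 1410.25},
]
print(month_end_summary(daily_rows))
```

A scheduled job running logic like this replaces the 8-10 hours of manual copy/paste work each month, and it produces the same totals every time it runs.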

It’s Critical to Create and Implement a Data Culture

One of the most overlooked reasons big data projects fail (an estimated 85% of them, per Gartner) is that organizations don’t pay proper attention to creating and implementing a data culture. Lack of management understanding, poor organizational alignment, and general resistance to change are often the culprits behind failed data management projects. If only people were as malleable as data!
We also spent a lot of time cross-training key users at the adhesives company on the technology we had implemented so they could support and spread it throughout their organization. Our Business Analyst on the project worked as a “translator” to make sure the business users understood the new functionality being implemented and that it delivered the maximum business value. Plus, our Scrum/Agile approach allowed us to frequently demonstrate the data’s powerful insights to gain organizational buy-in and greater adoption.


Data modernization is at the core of digital transformation efforts and will be one of the key differentiators separating companies in today's digital economy. Data may be your organization’s most underutilized asset, and your ability to better leverage it can be a huge strategic advantage. If you want ideas for how you can maximize your data’s potential, we’d love to talk.

