Analytics Magazine

Viewpoint: Understanding the challenges and opportunities of big data

July/August 2012

By Dan-Joe Barry

You’ve probably heard a lot about big data, largely because the technology – in the shape of ultra-fast processors and data interfacing systems – has come of age, meaning that companies can now harness the power that big data brings to the table.

However, plenty of confusion persists about what big data is and how it helps the average hard-pressed company professional.

At its most basic, big data is an umbrella term for any pro-active use of available data for the purposes of improving services and customer satisfaction. In this context, a major focus has been placed on data warehousing and data mining for better analytics on a company’s customers, as well as their product or service consumption.

The underlying premise is that the data required for analysis is available to the company concerned and is in a format that is easily accessible. The data should also, of course, be reliable enough to support analytics.

But wait – as the TV advert says – there’s more: big data generates information that analysts call business intelligence (BI), which, unlike the raw materials used in manufacturing processes, can be used and re-used again and again.

BI is now a must-have feature of modern management. A 2011 IBM survey found that 83 percent of chief information officers view BI as their top priority for enhancing competitiveness.

Until just a few years ago, businesses tended to limit, and even block, the data they supplied to people outside their day-to-day environment, preferring to bring information inside. But the arrival of big data – and the raw BI it generates – allows companies to do the reverse: share their inside data with customers and see what they do with it.

This is, in essence, how the more efficient businesses communicate with their customers on social networking sites and services such as Facebook and LinkedIn.

As a result, many organizations are finding that a high percentage of BI now resides outside the structured environment, meaning that businesses have to change the methodology by which they get data, which can pose significant technical challenges. Assuming these challenges can be overcome, and the underlying big data supporting the organization’s information resource is reliable, we can start to crunch the available information.

For most applications, historical data meets the reliability criterion, but technical limitations remain, caused by the fact that a lot of data is being exchanged across networks at lightning speed, with service lifetimes often reduced to the time it takes to download an app.

And here is where it gets interesting, as our observations suggest that the optimum level of customer satisfaction occurs at the “moments of truth” where the customer interacts with the service, a concept made famous by Jan Carlzon of Scandinavian Airlines.

When it comes to communication networks, these moments of truth are occurring in real time and at very high speed. Put simply, this means that, whilst a great deal of effort can be expended on analyzing and understanding a customer’s service consumption history, the real measure of customer satisfaction is how well the service provider can satisfy customer needs at the moment of truth.

Let’s think about what this means for the underlying IT system. While the concept of big data is relatively easy to understand, the very term itself is likely to send shivers down the spine of the IT professional, for the simple reason that moving large volumes of data in real time means that one or more technology bottlenecks will be encountered.

These bottlenecks differ between organizations, but the central requirement is the same: real-time analysis of customer service usage must be available to management so that they can assemble the key performance indicators (KPIs) that modern business planning now thrives on.

Questions that need to be answered include: Did the customer get the service they wanted, and was it provided satisfactorily? Were there any delays or resends? Were there any congestion issues that prevented the customer from getting the service when they needed it, as fast as they needed it?

The only way to collect and analyze this information is to complete the process in real time as the moment of truth unfolds.
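As a rough illustration of what assembling such KPIs in real time might look like, consider the following minimal Python sketch. The event fields (latency_ms, retransmits, congested) and the SLA threshold are hypothetical assumptions for the example, not a reference to any specific vendor’s implementation.

```python
from dataclasses import dataclass

@dataclass
class ServiceEvent:
    """One customer interaction captured from the network (hypothetical schema)."""
    latency_ms: float   # time taken to deliver the service
    retransmits: int    # resends needed to complete delivery
    congested: bool     # whether congestion was detected on the path

class MomentOfTruthKPIs:
    """Running KPIs updated as each event arrives, rather than in batch."""

    def __init__(self, latency_sla_ms: float = 200.0):
        self.latency_sla_ms = latency_sla_ms
        self.total = 0
        self.within_sla = 0
        self.retransmit_events = 0
        self.congested_events = 0

    def update(self, event: ServiceEvent) -> None:
        """Fold a single event into the running counts as it arrives off the wire."""
        self.total += 1
        if event.latency_ms <= self.latency_sla_ms:
            self.within_sla += 1
        if event.retransmits > 0:
            self.retransmit_events += 1
        if event.congested:
            self.congested_events += 1

    def snapshot(self) -> dict:
        """KPIs management can read at any moment: SLA, resend and congestion rates."""
        if self.total == 0:
            return {"sla_rate": None, "resend_rate": None, "congestion_rate": None}
        return {
            "sla_rate": self.within_sla / self.total,
            "resend_rate": self.retransmit_events / self.total,
            "congestion_rate": self.congested_events / self.total,
        }

# Usage: feed events as they occur and read KPIs on demand.
kpis = MomentOfTruthKPIs(latency_sla_ms=150.0)
for ev in [ServiceEvent(120.0, 0, False), ServiceEvent(310.0, 2, True)]:
    kpis.update(ev)
print(kpis.snapshot())  # {'sla_rate': 0.5, 'resend_rate': 0.5, 'congestion_rate': 0.5}
```

The point of the design is that the KPIs are updated incrementally as each event arrives, so a snapshot is available at any moment – during the moment of truth – rather than only after a batch run completes.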

The bottom line here is that capturing this information on customer service usage and network performance is the crucial front-end to understanding whether service delivery is living up to expectations. It’s important to understand that this information is not only useful for understanding the current situation; it can also be used to enhance the historical information that KPI projections are often based on.

By historical information, we mean data on which services customers are using, as well as when and for how long, allowing pro-active service providers to change their service offerings to better suit customer behavior. From a technology perspective, this is the back-end we traditionally understand as supporting big data, but the essential front-end is real-time data collection on those crucial “moments of truth,” which, in the end, determine customer satisfaction.
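To make the front-end/back-end split concrete, here is a minimal, purely illustrative Python sketch of the back-end roll-up: real-time usage events are aggregated into per-service, per-hour historical records of who used what, when and for how long. The event schema and field names are assumptions for the example, not a description of any particular system.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical real-time usage events: (customer, service, start time, duration in seconds).
events = [
    ("alice", "video", datetime(2012, 7, 1, 20, 15), 1800),
    ("bob",   "video", datetime(2012, 7, 1, 21, 0),  3600),
    ("alice", "music", datetime(2012, 7, 2, 8, 30),  900),
]

# Back-end roll-up: per service and hour of day, session count and total usage time.
history = defaultdict(lambda: {"sessions": 0, "total_seconds": 0})
for customer, service, start, duration in events:
    key = (service, start.hour)
    history[key]["sessions"] += 1
    history[key]["total_seconds"] += duration

# A provider could use such a table to spot, say, that video usage peaks in the
# evening and adjust capacity or service offerings accordingly.
for (service, hour), stats in sorted(history.items()):
    print(f"{service} @ {hour:02d}:00 -> {stats['sessions']} sessions, {stats['total_seconds']}s")
```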

Dan-Joe Barry is vice president of marketing with Napatech.
