Analytics Magazine

Viewpoint: Understanding the challenges and opportunities of big data

July/August 2012

By Dan-Joe Barry

You’ve probably heard a lot about big data, largely because the technology – in the shape of ultra-fast processors and data interfacing systems – has come of age, meaning that companies can now harness the power that big data brings to the table.

However, plenty of confusion persists about what big data actually is and how it helps the average hard-pressed business professional.

At its most basic, big data is an umbrella term for any pro-active use of available data for the purposes of improving services and customer satisfaction. In this context, a major focus has been placed on data warehousing and data mining for better analytics on a company’s customers, as well as their product or service consumption.

The underlying premise is that the data required for analysis is available to the company concerned and is in a format that is easily accessible. The data should also, of course, be reliable enough to support analytics.

But wait – as the TV advert says – there’s more: big data generates information that analysts call business intelligence (BI), which, unlike the raw materials used in manufacturing processes, can be used, re-used and re-used again.

BI is now a must-have feature of modern management. A 2011 IBM survey found that 83 percent of chief information officers view BI as their top priority for enhancing competitiveness.

Until just a few years ago, businesses tended to limit, and even block, the data they supplied to people outside their day-to-day environment, preferring to bring information inside. But the arrival of big data – and the raw BI it generates – allows companies to do the reverse: share their inside data with customers and see what they do with it.

This is, in essence, how the more efficient businesses communicate with their customers on social networking sites and services such as Facebook and LinkedIn.

As a result, many organizations are finding that a high percentage of BI now resides outside the structured environment, meaning that businesses have to change the methodology by which they gather data – a shift that can pose significant technical challenges. Assuming those challenges can be overcome, and the underlying big data supporting the organization’s information resource is reliable, we can start to crunch the available information.

For most applications, historical data meets the reliability criterion, but technical limitations remain, caused by the fact that a great deal of data is being exchanged across networks at lightning speeds, with service lifetimes often reduced to the time it takes to download an app.

And here is where it gets interesting, as our observations suggest that the optimum level of customer satisfaction occurs at the “moments of truth” where the customer interacts with the service – a concept made famous by Jan Carlzon of Scandinavian Airlines.

When it comes to communication networks, these moments of truth occur in real time and at very high speed. Put simply, this means that, while a great deal of effort can be expended on analyzing and understanding a customer’s service consumption history, the real measure of customer satisfaction is how well the service provider can satisfy customer needs at the moment of truth.

Let’s think about what this means for the underlying IT system. While the concept of big data is relatively easy to understand, the term itself is likely to send shivers down the spine of the IT professional, for the simple reason that moving large volumes of data in real time means one or more technology bottlenecks will be encountered.

These bottlenecks differ between organizations, but the central requirement is the same: management needs real-time analysis of customer service usage in order to assemble the key performance indicators (KPIs) that modern business planning now thrives on.

Questions that need to be answered include: Did the customer get the service they wanted, and was it provided satisfactorily? Were there any delays or resends? Were there any congestion issues that prevented the customer from getting the service when they needed it, as fast as they needed it?

The only way to collect and analyze this information is to complete the process in real time as the moment of truth unfolds.
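
As a rough illustration of what collecting and analyzing at the moment of truth might look like, here is a minimal Python sketch. The event fields (session_id, latency_ms, retransmits) and the alert thresholds are invented for the example, not drawn from any particular monitoring product.

from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Event:
    # One hypothetical measurement taken at a customer interaction.
    session_id: str
    latency_ms: float  # how long the request took to serve
    retransmits: int   # resends caused by loss or congestion

@dataclass
class SessionKPI:
    # Running per-session indicators, updated as events arrive.
    requests: int = 0
    total_latency_ms: float = 0.0
    retransmits: int = 0

    @property
    def avg_latency_ms(self) -> float:
        return self.total_latency_ms / self.requests if self.requests else 0.0

kpis = defaultdict(SessionKPI)

def on_event(event: Event) -> None:
    # Update the KPIs at the moment of truth, not in a nightly batch.
    kpi = kpis[event.session_id]
    kpi.requests += 1
    kpi.total_latency_ms += event.latency_ms
    kpi.retransmits += event.retransmits
    # Flag a degrading session while it is still live (thresholds invented).
    if kpi.retransmits > 3 or kpi.avg_latency_ms > 200:
        print(f"Degraded service on {event.session_id}: "
              f"avg latency {kpi.avg_latency_ms:.0f} ms, "
              f"{kpi.retransmits} retransmits")

The point of the sketch is structural: each event updates the indicators as it arrives, so the provider can react while the session is still live rather than discover the problem in the next day’s report.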

The bottom line here is that capturing this information on customer service usage and network performance is the crucial front-end to understanding if the service delivery is living up to expectations. It’s important to understand that this information is not only useful for understanding the current situation, but can also be used to enhance the historical information that KPI projections are often based on.

By historical information, we mean data on which services customers are using, as well as when and for how long, allowing pro-active service providers to change their service offerings to better suit customer behavior. From a technology perspective, this is the back-end we traditionally understand as supporting big data, but the essential front-end is real-time data collection on those crucial “moments of truth,” which, in the end, determine customer satisfaction.
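
To make this front-end/back-end split concrete, the following sketch (again Python, using the standard sqlite3 module; the table and column names are invented) shows real-time session records being appended to a historical store that back-end analytics then aggregates into the kind of usage profile described above.

import sqlite3

conn = sqlite3.connect("usage.db")  # stand-in for the historical store

conn.execute("""
    CREATE TABLE IF NOT EXISTS service_usage (
        customer_id TEXT,
        service     TEXT,
        started_at  TEXT,  -- ISO-8601 timestamp
        duration_s  REAL
    )
""")

def record_session(customer_id, service, started_at, duration_s):
    # Front-end: append each session as its moment of truth completes.
    conn.execute("INSERT INTO service_usage VALUES (?, ?, ?, ?)",
                 (customer_id, service, started_at, duration_s))
    conn.commit()

def usage_profile():
    # Back-end: which services are used, how often and for how long.
    return conn.execute("""
        SELECT service,
               COUNT(*)        AS sessions,
               AVG(duration_s) AS avg_duration_s
        FROM service_usage
        GROUP BY service
        ORDER BY sessions DESC
    """).fetchall()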

Dan-Joe Barry is vice president of marketing with Napatech (www.napatech.com).
