
Analytics Magazine

Interaction analytics: Form from chaos

March/April 2015

Executive Edge
Gaining insights from interaction-based unstructured data.

By Ryan Pellet

For a long time, businesses were in the dark when it came to their customers. The small amount of “row and column” data they had didn’t provide nearly enough to understand customers’ needs, much less align products and supporting business processes to them. These days, the opposite is true. Businesses are flooded with huge amounts of data about their customers. Phone calls, emails, text messages, chats and even social media posts ensure that companies are inundated with information.

Customers are talking – a lot – and they are telling companies exactly what’s going right and exactly what’s going wrong. On the whole, companies are storing this information and are creating a very rich source of data waiting to be monetized. Today’s issue isn’t a lack of data; the issue is separating the customers’ intentions from the collected data’s noise to make sound business decisions.

The vast majority of data a company collects is unstructured. In fact, only 10 percent of an organization’s available data is in any sort of “row and column” form. The rest resides in the customer’s interactions with the company’s product, service functions or social ecosystem. Until recently this data has been too expensive to collect and manage, and has been largely inaccessible for pragmatic analysis at scale. Interaction data was collected, but then fell on the floor.

Advancements in neural phonetic speech analytics are solving this problem. Unstructured speech data is now being effectively generated, stored, organized, analyzed and actively used – sometimes in real time – in enterprise data analysis functions. The predominant reason this form of data is being used is the business context it contains. Once a pattern is found in the data using modeling, statistical testing, clustering and the like, it is easier to bridge the gap between the data and the necessary business decision. For everything found in the models, there are contextual examples that a business person can understand by simply accessing the actual conversations.

Interaction Analytics: Providing Much Needed Context

Unstructured data consists largely of interactions; it’s all those angry or pleasant customer phone calls, the thousands of emailed questions, the Facebook wall posts and tweets directed toward a company. Despite being told otherwise by big data experts, most business practitioners see unstructured data as largely useless. It’s a byproduct of engaging with customers, and little else. Most companies are armed with very few solutions for understanding the customer: business people have typically looked to surveys to understand interactions after the fact, or worked with their IT teams to create disposition codes in CRM systems to track basic call types.

The problem is that they were missing one of the most important parts of the calls: why the customer was calling. The pertinent point, and this is worth repeating, is that during those periods of engagement, customers are telling companies the exact portions of the product and/or supporting business processes that are working, and which parts are broken.

In order to make sense of all these interactions, the key is not just to listen, but also to listen intelligently. That’s where neural phonetic speech analytics comes into play. Advancements in technology have created the ability to effectively and efficiently codify conversations into machine-readable data. Using these advancements, it’s possible to see patterns in the interactions, to predict what is likely to happen and to assign contextual actions. Interaction data differentiates itself by always being one degree away from the root data source: the actual conversations. Data analysts and business practitioners now have the common language of interactions to utilize when deciding on what actions to take to improve the business. This common language speeds the process of turning insights into actions.

While it’s important to know the nuts and bolts of interaction analytics, it’s certainly helpful to see how companies are applying neural phonetic speech analytics in the real world. What follows are three real-world examples of companies structuring the seemingly unstructurable data in ways that are positively affecting business practices and bottom lines.

Listening for the Canary: Understand Emerging Trends and Conversational Patterns

In the early days of the mining industry, a common practice was for a miner to take a caged canary into the mine with him. The idea was that the bird, being more susceptible to harmful gases, would register danger and become sick before the humans, thus giving the humans a chance to escape. Unstructured yet context-rich data provides the same, albeit more humane, early warning system. Hidden in all those interactions are conversational patterns that in turn point toward emerging trends in a company’s customer base. By listening intelligently to the customer interactions, a company can proactively head off the worst of a situation instead of reacting too late to an issue that has already occurred.

A proactive response can be valuable in a variety of situations. During a company crisis such as a data breach, a negative/viral social media post or a newly marketed competitive offer, a company that is able to quantitatively and intelligently listen to early calls from customers will be able to pick up on a variety of information. The proactive company can learn what specific concerns are being generated, whether the concerns are significant enough to cause customers to take their business elsewhere, and which resolutions have worked effectively to counteract the situation. Catching the concerns early allows business practitioners to form a data-driven plan that effectively fields future calls as well as works toward solving the overarching issue at hand.

The early warning signs of an impending issue can be found not only in what customers are saying, but in how they are saying it. When people are agitated, they tend to speak faster and interrupt others. On the other hand, customers who are laughing during a call tend to be more relaxed and less likely to have an issue. In this sense, listening intelligently goes beyond simply understanding what customers are saying and includes how they are feeling about their interaction. By using the overall sentiment of the call, a company can begin to predict levels of customer satisfaction and proactively build a strategy to fix issues before they become full-blown disasters.

A big part of being proactive is, in essence, working to figure out what issues might be around the corner. Interaction analytics provides that glimpse into the future that so many companies need.

The Future Context: Using Interaction Analytics to Predict Events

Interaction data is highly predictive when it is consistently and accurately generated. Companies are using interaction analytics to find how seemingly disparate words and phrases across hundreds of millions of conversations fit together in broader contextual patterns, and are using that context to begin looking toward the future. Simply put, customers who go on to buy, leave or stay tend to say the same things as others who took the same action. Having interactions in data form allows businesses to explore the patterns and trends related to outcomes. As companies become more in tune with how their conversations with customers are going, they will be able to better anticipate and head off future issues.

To begin using interaction data predictively, leading companies first look to their past to develop their predictive models through the combined use of two important variables – independent and dependent. Independent variables are the words that people say in their interactions with a company while dependent variables are the actions the customers take. To predict future churn, companies are using all of the data contained in the interactions and finding the links between the independent and dependent variables.

As a result, conversational patterns surface that show the relationships between people who stay and leave. These patterns are being used by business practitioners to identify customers that are following the same pattern. Once the model is up and running, companies are using the ongoing data to further mitigate future churn and to strengthen their model’s accuracy.
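The link between the two variable types can be sketched with a toy example: given a handful of labeled call transcripts (the transcripts and phrases below are invented, not real interaction data), compute how often a phrase co-occurs with the churn outcome.

```python
# Toy link between independent variables (phrases customers say) and a
# dependent variable (whether they later churned). Transcripts are invented.

calls = [
    ("i found a better deal elsewhere", True),
    ("can you explain my bill", False),
    ("i want to cancel my service", True),
    ("thanks this was really helpful", False),
    ("a competitor offered a better deal", True),
]

def churn_rate_given_phrase(calls, phrase):
    """Share of calls containing `phrase` whose customer later churned."""
    outcomes = [churned for text, churned in calls if phrase in text]
    return sum(outcomes) / len(outcomes) if outcomes else 0.0

print(churn_rate_given_phrase(calls, "better deal"))  # 1.0
print(churn_rate_given_phrase(calls, "bill"))         # 0.0
```

At production scale the same idea runs over millions of transcripts and feeds a proper statistical model, but the principle is identical: historical outcomes label the conversations, and the words become predictors.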

It’s through 100 percent interaction analysis that companies see patterns arise. A healthcare organization begins to recognize which calls most often lead to fraudulent claims, a communications company begins to see the interactions that end in cancellation, and so on. Once the patterns are understood, companies can put processes into place that head off the issues before they materialize. Understanding the importance of all that unstructured data allows companies to become far more proactive instead of fearfully waiting for the next fire to start before scrambling to put it out.

Getting Real: Reacting to Situations in Real Time

One of the more exciting advances in interaction analytics is the ability of neural phonetic speech analytics systems to “listen” to conversations in real time and react with necessary information. These systems structure the data as it’s being created and, by comparing it to the predictive models mentioned above, allow companies to handle situations in real time. This real-time analysis is achieved through a two-part process: triggers and workflow.

Triggers are a combination of words, phrases and acoustic properties – the words being used, how quickly someone is speaking or at what volume. As an example, companies are establishing triggers to alert supervisors every time a caller mentions a predictive phrase related to churn, such as “better deal.” Others are using sentiment triggers to escalate calls that are increasing in negative sentiment properties. Since these systems have been set up to “listen” for triggers, they prompt the appropriate action once they “hear” issues emerge.
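A trigger of this kind can be sketched as a simple rule over the live transcript and acoustic measurements. The phrase list and speech-rate threshold below are illustrative assumptions, not any vendor's configuration.

```python
# Hedged sketch of a churn trigger: fire when a predictive phrase appears
# or an acoustic property crosses a threshold. Values are assumptions.

CHURN_PHRASES = ("better deal", "cancel my service", "switch providers")

def fires_trigger(transcript_so_far, words_per_minute):
    """Return True when the call should be flagged for escalation."""
    text = transcript_so_far.lower()
    phrase_hit = any(p in text for p in CHURN_PHRASES)
    acoustic_hit = words_per_minute > 190   # fast speech as an agitation proxy
    return phrase_hit or acoustic_hit

print(fires_trigger("They offered me a better deal", 120))  # True
print(fires_trigger("Just checking my balance", 120))       # False
```

In a live system the rule would be evaluated continuously as the transcript grows, so the alert reaches a supervisor while the call is still in progress.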

That appropriate action is determined by the second part of the process, the workflow. Once the aforementioned system identifies the “better deal” trigger, the workflow kicks in to ensure the call is resolved in the best possible way by providing the agent with the most appropriate course of action; for example, prompting an offer or an explanation of the benefits over the competition. In some cases it recognizes that the customer is trying to game the system and has no intention of actually leaving. But how does the agent know what the best course of action is? It’s here that the combined power of interaction analytics really comes to light. While the call is in process, the system not only analyzes what the customer is saying in real time, but it also identifies the customer’s history, buying power and next best actions, all by actively using current and historical interaction data in real time.
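The workflow half of the process can be pictured as a lookup from the fired trigger and the customer's history to a recommended action. The trigger names, customer fields and actions below are invented for illustration.

```python
# Illustrative workflow step: map a fired trigger plus customer history to
# a next best action. Trigger names, fields and actions are invented.

def next_best_action(trigger, customer):
    if trigger == "better_deal":
        # Long-tenure customers may warrant a retention offer; others get
        # an explanation of the plan's benefits over the competition.
        if customer.get("tenure_years", 0) >= 2:
            return "offer_loyalty_discount"
        return "explain_plan_benefits"
    if trigger == "negative_sentiment":
        return "escalate_to_supervisor"
    return "continue_call"

print(next_best_action("better_deal", {"tenure_years": 5}))  # offer_loyalty_discount
print(next_best_action("better_deal", {"tenure_years": 0}))  # explain_plan_benefits
```

The real decision logic would draw on far richer history (buying power, prior offers, predicted churn risk), but the shape is the same: trigger in, contextual action out, while the agent is still on the line.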

Most companies are sitting on a largely untapped goldmine of information. Leading companies are using the data contained in their customer interactions to shape their companies’ futures. Left in its raw, unstructured form, the data yields little usable context or relevance. However, using advancements in neural phonetic speech analytics, the customer-recording asset provides a pivotal key to measured understanding. By mining, interpreting and structuring all those conversations, interaction analytics can reveal evolving patterns, sentiments and relationships that are critical to success.

Ryan Pellet is chief strategy officer for Nexidia, a leading provider of customer interaction analytics solutions for business transformation. As an officer within Nexidia, Pellet is responsible for the creation, communication, execution and sustained alignment of Nexidia’s strategic growth plans. He is responsible for understanding market trends in the analytics market space, connecting these trends to Nexidia’s business initiatives and assisting in execution against these initiatives, globally.
