

Analytics Magazine

Executive Edge: Cognitive Computing: Five “I wish I would haves” to Avoid


By Paul Roma

Let’s talk about cognitive computing. After all, everybody else is, right? In fact, there’s so much chatter about cognitive, in the worlds of both academia and business, that your instincts probably tell you to ignore it as much as possible – to let everything cool down a bit so that we can recognize it for what it really is.

But there’s just one problem with that approach.

Cognitive computing is already truly huge, and it’s only expected to get bigger. I don’t say that lightly. I’ve seen plenty of next-big-thing flameouts, just like anyone who’s been engaged in the analytics world for years. But today, when it comes to cognitive, in my work with businesses around the world, I’m seeing signals that remind me not of the flameouts, but of the truly monumental advances made in technology. Remember that moment when we all realized that there were as many mobile devices in the world as there were people? Or, going way back, when it became clear that the Internet was not solely the domain of government researchers, and had serious commercial applications? That’s the kind of moment we’re experiencing right now with cognitive computing.

Think about it. Computing capabilities are unbelievably strong today. There’s greater discipline in algorithms than we’ve ever seen. Data storage costs, what, around 3 cents to store a gig of data today? Put it all together, and you realize that whatever we’ve done in cognitive computing today will soon look like a quaint early indicator of the seismic changes to follow. We are heading down an exponential change curve.

Because cognitive computing is a burgeoning reality among the businesses I work with every day, I’ve already observed a few seriously risky views of it. Why are they risky? Because if they take hold, they’re likely to lead many to say, “I wish I would have” in the not-so-distant future. And in this case, the implications of getting it wrong, or simply not getting on board fast enough, could be serious. Don’t let yourself get caught saying these things a year from now:

“I wish I would have known what cognitive can really do.” What if you realized too late that cognitive can enable the quantification of historically qualitative domains? Or that it could amplify your analysis of existing problems? You’d miss out on a ton of potential. Cognitive capabilities should be able to combine hard facts – how much something costs, how long it took to manufacture, when it was delivered – with adjacent interaction data from social media, customer service, surveys, you name it, in order to generate scores for sentiment, buying patterns, bundling patterns, context for change and more.

When it comes to amplifying analysis, consider that text, voice and video data (for example) can cast sentiment analysis, behavior patterns and human decision-making in an entirely new – and more accurate – light.
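As a rough illustration of that combination – hard facts plus adjacent interaction data rolled into a score – here is a minimal Python sketch. The keyword lists, weights and field names are all hypothetical, invented for illustration rather than drawn from any particular platform:

```python
# Illustrative sketch only: a toy scoring pass that joins a hard
# transaction fact (delivery delay) with adjacent, unstructured
# interaction data (free-text comments). Weights are hypothetical.

POSITIVE = {"great", "love", "fast", "recommend"}
NEGATIVE = {"slow", "broken", "refund", "disappointed"}

def sentiment_score(comments):
    """Crude keyword-based sentiment in [-1, 1] from free-text comments."""
    words = [w.strip(".,!?").lower() for c in comments for w in c.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

def customer_score(record):
    """Blend a hard delivery fact with sentiment from interaction text
    into one composite score."""
    sent = sentiment_score(record["comments"])
    delay_penalty = min(record["delivery_days"] / 30.0, 1.0)
    return round(0.6 * sent - 0.4 * delay_penalty, 3)

order = {
    "delivery_days": 3,
    "comments": ["Great product, fast shipping!", "Would recommend."],
}
print(customer_score(order))  # → 0.56
```

A real system would swap the keyword lookup for a trained language model, but the shape – structured facts and unstructured signals feeding one score – is the point.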

“I wish I would have known cognitive wasn’t 10 years away.” Cognitive computing is here now. Let me repeat: It has arrived. It’s not 10 years away. It’s not even 10 days away. It’s now. Maybe you’re not doing anything with it today – and maybe that’s OK. But at the same time, you have to account for its presence. That’s where I see clients at risk of missing the boat – they assume that because they don’t need a tangible cognitive strategy today, they can just sit this one out until they do. In reality, they should be planning for it. Can you imagine knowing that virtually everyone in your organization owned a mobile phone, while continuing with business plans as if mobile phones basically didn’t exist? Or ignoring web and cloud capabilities once their potential was clear? Of course not. But that’s exactly what some are doing when it comes to cognitive.

“I wish I would have known cognitive wasn’t an ‘edge’ technology.” What do you do when a new technology appears on the scene quickly? You likely deploy it in the margins – give it a test run in some pesky part of the business, see how it works, expand, move to another area, and so on. Wash, rinse, repeat. That’s the definition of “edge” technology. Unfortunately, an edge mindset is exactly wrong in this environment. Cognitive computing is no more of an edge technology than mobility or cloud computing – it is (or should be) a pervasive technology embedded at the core of the business.

Today, I’m seeing this realization slowly take hold among CTOs and others who have approached cognitive as a stand-alone investment – off-property, handled by a third party, just a piece of a larger machine. They’re increasingly concerned about the potential for a competitor to emerge with cognitive computing capabilities driving their ability to provide better service, higher performance, smarter supply chains, you name it. The good news is that we’re still at the front end of cognitive; there’s time to change direction. But that window will probably close faster than anyone expects.

“I wish I would have known how to start – and when to scale.” Train and learn, or build and develop? Specific purpose or general purpose? These are the types of getting-started questions that will ultimately help shape an organization’s entire approach to cognitive. They should not be taken lightly, which is why many feel a sort of paralysis at the moment it’s time to get underway.

Begin with problems of perceived accuracy – not technical accuracy. These are problems rooted in opinion rather than technical or statistical correctness. From there, move on to apprentice roles, the next step up in scoring: tasks typically defined by guiding principles and rules of thumb rather than fact-based rules set in stone.

Rule-based problems are also ripe targets for cognitive efforts. In this case, the problems are knowable, but the task of actually maintaining the rules is too hard, so it’s possible to enable computing models to learn the rules along the way.
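A toy sketch of that last idea: rather than hand-maintaining a cutoff rule, a few lines of Python can learn it from labeled history. The data and the "flag if the amount exceeds a threshold" rule are invented for illustration:

```python
# Hypothetical sketch: instead of hand-maintaining a rule like
# "flag any order over $X", let a tiny model learn the cutoff from
# labeled history. The rule stays knowable; its upkeep moves to data.

def learn_threshold(examples):
    """examples: list of (amount, flagged) pairs.
    Return the cutoff that best separates flagged from unflagged."""
    best_t, best_correct = 0.0, -1
    candidates = sorted({amount for amount, _ in examples})
    for t in candidates:
        correct = sum((amount > t) == flagged for amount, flagged in examples)
        if correct > best_correct:
            best_t, best_correct = t, correct
    return best_t

history = [(20, False), (35, False), (60, False), (120, True), (300, True)]
cutoff = learn_threshold(history)
print(cutoff)  # → 60; the rule "flag if amount > cutoff" is now data-derived
```

When the history changes, you retrain instead of renegotiating the rulebook – which is exactly the maintenance burden the paragraph above describes.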

As you progress down these paths, extending into new domains and complexities can make the models more useful in a cognitive environment. In a more traditional system, meanwhile, adding new dimensions means a rewrite. Cognitive systems are capable of creating new relationships and learning how the next domain correlates to older ones, which strengthens the model.
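One minimal way to picture that contrast, using assumed toy data: a model whose observations are open-ended feature dicts can absorb a new data domain without a rewrite, where a fixed-column system would need schema and code changes. Every name and number here is hypothetical:

```python
# Sketch of the contrast drawn above: per-feature class averages kept in
# open-ended dicts, so a new domain (here, a "sentiment" feature) needs
# no schema change. All data is invented for illustration.

from collections import defaultdict

class OpenModel:
    """Running per-feature averages for two classes; scoring works over
    whatever feature keys have been seen so far."""
    def __init__(self):
        self.sums = {True: defaultdict(float), False: defaultdict(float)}
        self.counts = {True: 0, False: 0}

    def learn(self, features, label):
        self.counts[label] += 1
        for k, v in features.items():
            self.sums[label][k] += v

    def score(self, features):
        # Squared distance to each class centroid over seen feature keys.
        def dist(label):
            n = max(self.counts[label], 1)
            return sum((features.get(k, 0) - s / n) ** 2
                       for k, s in self.sums[label].items())
        return dist(False) - dist(True)  # positive → closer to the True class

m = OpenModel()
m.learn({"cost": 1.0}, True)
m.learn({"cost": 0.0}, False)
# A new domain appears; no schema change, just richer observations:
m.learn({"cost": 1.0, "sentiment": 0.9}, True)
print(m.score({"cost": 1.0, "sentiment": 0.8}) > 0)  # → True
```

The new "sentiment" dimension starts correlating with the old "cost" dimension the moment it is learned – a crude stand-in for how a cognitive model strengthens as domains are added.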

“I wish we would have planned for future changes.” Cognitive capabilities are typically delivered in the form of flexible technologies that can adapt and learn, but they tend to learn in “straight lines,” following the example of the humans who guide them. After all, we are creatures of habit, and our habits are linear. In a cognitive environment, it’s possible to break this paradigm by combining adjacent data domains to give your models perspective, in much the same way that you might guide a child learning about the world. This is important because effectively designing for the future requires a multi-dimensional approach – one that incorporates a multitude of perspectives. This sort of future-proofing isn’t simply a challenge of computer science or engineering. It demands a fuller depth of understanding and context – the ability to view a situation through many dimensions at the same time. In many cases, these are dimensions that humans can’t even see. And that is the beauty of what’s next in cognitive.

Are these the only questions we may wish we had better answers for in the future? Almost certainly not. Expect more twists and turns in the road to cognitive ahead. But at the same time, these are all legitimate, known issues. It’s just that they’re clearer to some leaders than to others – a fact that will become painfully obvious as cognitive continues down its restless, surprising, high-stakes path.

Paul Roma, chief analytics officer of U.S. Deloitte, directs the company’s analytics offerings across all businesses.


