
Analytics Magazine

Forum: Neural networks and the case for efficient modeling

By Jim Theologes

High-tech companies first introduced the idea of artificial intelligence (AI) into the mainstream, and the term is now a rapidly growing part of the everyday business lexicon. Within the AI space, deep learning and neural networks are receiving the most attention. Much like “big data,” the past decade’s tech buzzword du jour that left companies scrambling to understand and apply it, neural networks and their cousin, deep learning, are said to be the “new bleeding edge” that can deliver companies to technological salvation.

Yet with all the fervor and fanfare, many are still left wondering what a neural network actually is. Neural networks, or at least the idea of them, have been around since the 1940s.

These mathematical models were inspired by our biological understanding of how the brain functions, using layers of neurons to process and pass information, and the algorithms underlying neural networks are designed to mimic those functions. The recent resurgence of interest in neural networks is due to a number of factors, including dramatic advances in fundamental research in the field, exponential increases in computing power, and the emergence of AI in popular consumer devices such as Apple’s Siri and Amazon’s Alexa. This, along with the recent blast of marketing hype surrounding AI, has driven the exploration of neural networks and deep learning for more traditional business questions – a new set of tools in the professional data scientist’s arsenal.

Recently, I came across an article discussing how analytical models are leveraged in the context of customer science. The article compared several types of models built on IBM Watson Telco data to predict customer churn, and the most powerful model described was a logistic regression – a type of statistical model commonly used to predict a binary outcome, such as a “yes” or a “no.” I then found another article that used the exact same customer churn data set but instead built a multilayer perceptron, one of the simplest forms of a deep learning model. That second article set out to show that even a simple multilayer perceptron could outperform the models from the first article – and indeed, the deep learning model achieved an accuracy of about 82 percent, versus 80 percent for the logistic regression. Statistical performance had improved. But which model would a business stakeholder choose?
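
For readers who want to see what such a head-to-head comparison looks like in practice, here is a minimal sketch in Python using scikit-learn. It is not the code from either article; the file name and the “Churn” column are assumptions made purely for illustration.

```python
# Minimal sketch (not the original articles' code): comparing a logistic
# regression and a small multilayer perceptron on a churn-style data set.
# The file name and the "Churn" column are assumptions for illustration.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

df = pd.read_csv("telco_churn.csv")                    # assumed local copy of the churn data
y = (df["Churn"] == "Yes").astype(int)                 # binary target: did the customer churn?
X = pd.get_dummies(df.drop(columns=["Churn"]), drop_first=True)  # one-hot encode categoricals

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# Candidate 1: logistic regression, the simple and explainable baseline.
logit = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
logit.fit(X_train, y_train)

# Candidate 2: a multilayer perceptron with a single hidden layer.
mlp = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=42),
)
mlp.fit(X_train, y_train)

print("logistic regression accuracy:", accuracy_score(y_test, logit.predict(X_test)))
print("multilayer perceptron accuracy:", accuracy_score(y_test, mlp.predict(X_test)))
```

On the actual IBM Watson Telco data the two articles reported roughly 80 percent and 82 percent accuracy, respectively; exact numbers from a sketch like this will vary with preprocessing choices and random seeds.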

We can start by asking whether a 2 percentage-point difference is meaningful, given the business context of the model. In fields where accuracy is critical, such as medicine, that difference could be consequential. For a business question like customer churn, however, many would argue it isn’t as big a difference. But statistical accuracy is not the only way to measure models against each other. I’d argue that we need at least two other evaluation criteria: the ease of deploying the model into a production environment, and the ease of explaining the model (“explainability”) and its results to business stakeholders. We need to consider the balance of all three criteria when selecting a “best” model to deploy in a business.
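
To make the explainability criterion concrete, here is a hypothetical continuation of the sketch above: each logistic-regression coefficient can be exponentiated into an odds ratio that a stakeholder can read directly, whereas the perceptron’s hidden-layer weights offer no comparably direct interpretation. The pipeline and variable names carry over from the earlier sketch and remain assumptions.

```python
# Hypothetical continuation of the sketch above: the fitted logistic
# regression can be read as odds ratios, the kind of direct explanation
# the MLP's hidden-layer weights do not provide.
import numpy as np

coefs = logit.named_steps["logisticregression"].coef_[0]  # one coefficient per feature
odds_ratios = np.exp(coefs)                               # multiplicative change in churn odds
                                                          # per unit increase in the scaled feature

# Show the five features with the strongest effect in either direction.
top5 = sorted(zip(X.columns, odds_ratios),
              key=lambda pair: abs(np.log(pair[1])), reverse=True)[:5]
for name, ratio in top5:
    print(f"{name}: odds ratio ≈ {ratio:.2f}")
```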

Finding the right combination brings to mind the concept of efficiency in economics – balancing the optimization of each measure to the point where further improving one measure would begin to harm another. Perhaps an appropriate term for this approach to model evaluation is “efficient modeling.” A modeler should start with the simplest model appropriate for the business question being asked, and then iteratively improve that model until it reaches “efficiency” – an optimal balance of predictive power, explainability and ease of production.
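
The economics concept invoked here is essentially Pareto efficiency, and a small illustration can make it concrete: score each candidate model on the three criteria and keep only the candidates that no other candidate beats on every dimension. In the sketch below, the accuracy figures come from the two articles, while the explainability and deployability scores are invented placeholders, not measurements.

```python
# Illustration only: "efficient modeling" as a Pareto check across the three
# criteria. Accuracy comes from the two articles; the explainability and
# deployability scores are made-up placeholders.
candidates = {
    "logistic regression":   {"accuracy": 0.80, "explainability": 0.9, "deployability": 0.9},
    "multilayer perceptron": {"accuracy": 0.82, "explainability": 0.4, "deployability": 0.6},
}

def dominates(a, b):
    """True if model a is at least as good as b on every criterion and strictly better on at least one."""
    return all(a[k] >= b[k] for k in a) and any(a[k] > b[k] for k in a)

# Keep only the candidates that no other candidate dominates.
efficient = [
    name for name, scores in candidates.items()
    if not any(dominates(other, scores)
               for other_name, other in candidates.items() if other_name != name)
]
print("non-dominated candidates:", efficient)
```

With these placeholder numbers, neither model dominates the other – which is precisely why the deployment and explainability questions taken up in the next article matter.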

In the second part of this series on neural networks, we’ll return to our competing models and review them in more depth through our “efficient modeling” lens. We’ve already seen that, for this example, predictive power tilts in favor of the deep learning model. But what about ease of deployment? And explainability to business users? We’ll explore the implications of these criteria for model selection in the next article.

Jim Theologes is a data scientist at Elicit, a customer science and technology consultancy, where he spends his days building insights from data for a wide variety of industries including retail, software security, short-term rental and aerospace. His foremost goal is distilling simple understanding from complex data to drive actionable insights.


