

Analytics Magazine

Predictive talent acquisition strategy

As the cost of failed new hires grows, so does the importance of pre-hire assessments, so how do you find the right talent assessment vendor? Ask these questions.

By Greta Roberts

Greta Roberts is CEO of Talent Analytics, Corp., which helps to solve employee performance and attrition challenges through its predictive talent analytics software platform.

Over the past 30+ years, businesses have spent billions on talent assessments, many of which are now used to evaluate job candidates. Increasingly, businesses are asking how (or if) pre-hire assessments can fit into a predictive talent acquisition strategy. As the cost of failed new hires continues to rise, recruiters and hiring managers are looking for any pre-hire information that increases the probability of making a great hire.

Despite the marketing hype, all predictive analytics projects have three very simple steps:

  1. A system reads “input” data, perhaps assessment scores or CV information.
  2. The system does some math to apply a “predictive model” to the input data.
  3. The results of the model are delivered as “output” data, perhaps the likelihood of the candidate achieving a certain level of sales performance or another key performance indicator (KPI).

At heart, the process takes “inputs” and turns them into “outputs,” or predicted business outcomes. But to build and validate a model, you need a healthy, logical set of both input and output data for that role in your company.

If you are using a talent assessment alone, you have only input data. To be predictive, you need the other two steps as well.
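The three steps above can be sketched in a few lines of code. Everything here is hypothetical: the trait names, weights and bias stand in for a model that would, in practice, be fitted on your own employees’ assessment scores and outcome data.

```python
import math

# Step 1: "input" data -- a candidate's assessment scores (hypothetical scales in [0, 1]).
candidate = {"conscientiousness": 0.82, "numeracy": 0.67, "service_orientation": 0.74}

# Step 2: a "predictive model" -- here a toy logistic model whose weights would,
# in practice, be fitted on historical input/output data for this exact role.
WEIGHTS = {"conscientiousness": 1.9, "numeracy": 0.8, "service_orientation": 1.4}
BIAS = -2.6

def predict_top_quartile_sales(scores):
    z = BIAS + sum(WEIGHTS[trait] * value for trait, value in scores.items())
    return 1.0 / (1.0 + math.exp(-z))  # a probability between 0 and 1

# Step 3: "output" data -- the predicted likelihood of hitting the sales KPI.
print(f"{predict_top_quartile_sales(candidate):.2f}")  # prints 0.63
```

The point of the sketch is the shape of the pipeline, not the numbers: without the fitted weights in step 2 and the KPI in step 3, the assessment scores in step 1 predict nothing.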



For most companies, current pre-hire talent assessments are wasted data. Results are delivered in an individual report that often cannot be analyzed or aggregated. For most “legacy” talent assessments, it is difficult or impossible to determine what positive (or negative) business effect the assessments are having. It often comes down to how much the HR person believes the results versus how much the business can document and realize real results.

It can be daunting to figure out which offerings will actually deliver a predictive solution. To help, here are some important questions to ask your talent assessment vendor:

Consider the predictive company itself. Are you dealing with an assessment company that is trying to learn how to be predictive? Or a predictive company that also uses assessment data? How long have they been doing predictive work?

Consider the predictive team. Ideally the company will have data scientists on staff as well as industrial organizational (IO) psychologists. This is important because data scientists tend to utilize more modern and rigorous methods for prediction and validation. IO psychologists tend to be focused on the instrument, while data scientists tend to be concerned with predictive validity and business results.

Are they predicting for your company or for everyone? Some companies create “industry benchmarks,” that is, general performance predictions for general industry categories such as retail sales or customer service. These predictions are significantly less accurate, because they are based on companies different from your own, with different cultures, goals and regions. Not all “customer service” is the same. Modern computing methods enable leading providers to create and validate predictive models for your roles in your own company alone, and to continuously update the model over time.

Do they care about your outcome data? Generally, these solutions predict attrition or performance for a candidate or employee. Has the assessment company asked you for the attrition or KPI data for your employees in your target role? If they don’t know your employee outcomes, how can they predict your outcomes? They can’t. Most job roles have multiple KPIs that describe performance. Do they predict each of these separately? For KPIs that naturally contradict each other, e.g., speed vs. accuracy, how does the predictive solution resolve the contradiction? Just getting a “green light” isn’t good enough in many cases. What sample size did they ask for? Real predictions require a reasonable sample to properly validate that you aren’t being fooled by randomness. If they only ask for 15 top performers, your sample is too small to create a real prediction.
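The “fooled by randomness” point is easy to demonstrate. In the sketch below, every assessment item is a coin flip with zero real predictive power, yet with only 15 employees the single luckiest of 50 meaningless items appears to “predict” performance far better than chance; with 500 employees the illusion largely disappears.

```python
import random

def best_item_accuracy(n_employees, n_items, seed=42):
    """Every label and every assessment item is a fair coin flip, so the true
    predictive power is exactly zero; any accuracy above ~50% is pure luck."""
    rng = random.Random(seed)
    labels = [rng.random() < 0.5 for _ in range(n_employees)]  # "top performer" or not
    best = 0.0
    for _ in range(n_items):
        item = [rng.random() < 0.5 for _ in range(n_employees)]
        agree = sum(a == b for a, b in zip(item, labels)) / n_employees
        best = max(best, agree, 1.0 - agree)  # an item may "predict" either direction
    return best

# The luckiest of 50 meaningless items, measured on 15 vs. 500 employees:
print(f"n=15 : best random item 'predicts' {best_item_accuracy(15, 50):.0%}")
print(f"n=500: best random item 'predicts' {best_item_accuracy(500, 50):.0%}")
```

With a tiny sample, some item will always fit the noise; a real validation needs enough employees that luck alone cannot produce an impressive-looking score.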

Does the solution base predictions on outcome data or a job fit, job match or job blueprint survey? Data science predicts what you ask it to predict. If you want lower attrition or higher KPIs, the models must be trained and validated with those data alone. The process looks for fact-based patterns to drive your business.

Surprisingly, many solutions don’t use this approach, but fall back to managerial bias. These solutions ask well-meaning committees of managers to list competencies that they believe are needed for success in a role. The resulting criteria are not predictive at all; they just find candidates that match the laundry list of beliefs and biases held by that committee. Nowhere in this process is a connection to actual attrition or KPI outcomes. Again, if the system doesn’t know about your outcomes, how can the process predict them? Start with data, not bias.

Does the solution use machine learning to recalibrate your predictive models? How often? Business needs, role descriptions and culture change over time. Local labor conditions change. For example, service representatives may be incentivized to cross-sell related products, or new regulations may impose new compliance tasks. It is important to update and re-validate your predictive models two to four times a year to keep up with seen and unseen trends. Some solutions have not changed their models in 30 years; do you expect these to find great sales reps for you?
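The scheduling side of this is simple to operationalize. A minimal sketch, assuming a quarterly recalibration policy (the 90-day threshold is an illustrative choice, not a standard):

```python
from datetime import date, timedelta

RECALIBRATION_INTERVAL = timedelta(days=90)  # roughly quarterly; tune to your business

def needs_recalibration(last_validated: date, today: date) -> bool:
    """True when the model is overdue for retraining and re-validation."""
    return today - last_validated > RECALIBRATION_INTERVAL

print(needs_recalibration(date(2019, 1, 1), date(2019, 2, 1)))  # False: one month old
print(needs_recalibration(date(2019, 1, 1), date(2019, 6, 1)))  # True: five months old
```

The hard part, of course, is not the calendar check but making sure fresh outcome data flows in so each retraining has something new to learn from.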

The new validation question: criterion validation. HR has been taught to ask whether the assessment is validated. The first level of validation checks whether the assessment’s measures are self-consistent. Continue to ask this question.

But ultimately you care about whether the assessment feeds predictions that accurately correspond to improved business outcomes. That is, are the predictions actually working? This level is called “criterion validation” and is a high bar that is not commonly reached by vendors.

A top-tier predictive talent assessment vendor will perform criterion validation for its solutions several times a year. Criterion validation is the highest level of validation possible, and it is the level most preferred by regulatory agencies.
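Criterion validation boils down to checking that pre-hire predictions line up with post-hire outcomes. A minimal sketch, using made-up scores and KPI values and a hand-rolled Pearson correlation:

```python
import math

def pearson(xs, ys):
    """Pearson correlation between two equal-length lists of numbers."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: each employee's pre-hire model score, paired with the
# sales KPI that employee actually achieved after being hired.
pre_hire_scores = [0.31, 0.74, 0.52, 0.88, 0.45, 0.67, 0.23, 0.91, 0.58, 0.79]
actual_kpi = [42, 88, 61, 95, 50, 70, 38, 102, 64, 85]

print(f"criterion validity (Pearson r) = {pearson(pre_hire_scores, actual_kpi):.2f}")
```

A vendor performing criterion validation is running exactly this kind of check, on your data, against your KPIs; a vendor who never asks for your outcome data cannot run it at all.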

Can you easily access/download your company’s talent assessment data? Talent assessment data is a critical data set for your company. If your talent assessment vendor makes it difficult or impossible to access your talent assessment data, this is a good indication that they are using pre-predictive technology and that they don’t appreciate that this data is your asset.

True predictive solutions know that your workforce data scientists will want to use your talent assessment data to find correlations and predictions in many areas of your business. You need to insist on easy and direct access to the raw assessment scores.

How easy is it to deploy the solution into the talent acquisition process and use the predictions? How much training is required? Do your talent acquisition professionals need to read long text reports or get out a calculator to use the predictions? The complexity of a prediction should be kept out of the way of daily operations. If your team still needs to “think” about what the answer is, it is probably not a predictive solution.

Is there a different assessment for every role? Or one assessment with multiple predictive models? Multiple assessments make it impossible to predict one candidate’s performance against multiple roles. This may also be a signal that you are working with an older, legacy (less predictive) talent assessment supplier.

Is there an answer key for their solution on the web? For many assessments, there are answer keys and guides on how to fool or pass the test, which means two things: 1) the test is easily fooled, lacking internal controls to prevent spoofing, and 2) you are looking at an “industry benchmark” with one clear set of answers.

A data science-driven model would be custom to your roles in your own company and continuously evolving, making it very difficult to game with answer keys or spoofing.

Ask to see the company policy on employee predictive modeling, discrimination, disparate impact and fairness. It is important that a predictive solution has thought through the specific outcomes of its models and how they fit into creating fair opportunity for all applicants. In particular, it is vital for the solution to satisfy or exceed any government requirements for hiring and selection.

Do your own (internal) data scientists approve of the predictive solution? Ask one of your own data scientists (from HR, marketing or another area inside your own company) to accompany you in your evaluation. They should know what is a rigorous approach and what is marketing fluff.

How does the predictive solution regularly prove to you that the models are working? Ideally the company you select will be able to show you two to four times a year how your predictions are working (i.e., turnover is going down, sales are going up, calls are going up, errors are going down, etc.).

