Analytics Magazine

Predictive talent acquisition strategy

As the cost of failed new hires grows, so does the importance of pre-hire assessments, so how do you find the right talent assessment vendor? Ask these questions.

By Greta Roberts

Greta Roberts is CEO of Talent Analytics, Corp., which helps solve employee performance and attrition challenges through its predictive talent analytics software platform.

Over the past 30+ years, businesses have spent billions on talent assessments, many of which are now used to evaluate job candidates. Increasingly, businesses are asking how (or whether) a predictive talent acquisition strategy can include pre-hire assessments. As the cost of failed new hires continues to rise, recruiters and hiring managers are looking for any pre-hire information that increases the probability of making a great hire.

Despite the marketing hype, all predictive analytics projects have three very simple steps:

  1. A system reads “input” data, perhaps assessment scores or CV information.
  2. The system does some math to apply a “predictive model” to the input data.
  3. The results of the model are delivered as “output” data, perhaps the likelihood of the candidate achieving a certain level of sales performance or another key performance indicator (KPI).

At heart, the system takes “inputs” and turns them into “outputs,” or predicted business outcomes. But to build and validate a model, you need a healthy, logical set of both input and output data for that role in your company.

If you are using a talent assessment alone, this is just input data. To be predictive you need to include the other two steps.
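The three steps above can be sketched in a few lines of Python. Everything here is hypothetical: the trait names are invented, and the weights stand in for a model already fitted on historical assessment scores (inputs) and sales KPIs (outputs).

```python
import math

# Hypothetical weights standing in for a model already fitted on
# historical assessment scores (inputs) and sales KPIs (outputs).
MODEL_WEIGHTS = {"conscientiousness": 0.8, "numeracy": 0.5, "bias": -4.0}

def predict_kpi_likelihood(scores):
    """Step 2: apply the predictive model to the input data.
    Step 3: return the output -- the likelihood that this candidate
    reaches the target sales KPI."""
    z = MODEL_WEIGHTS["bias"]
    z += MODEL_WEIGHTS["conscientiousness"] * scores["conscientiousness"]
    z += MODEL_WEIGHTS["numeracy"] * scores["numeracy"]
    return 1.0 / (1.0 + math.exp(-z))  # logistic link: raw score -> probability

# Step 1: the system reads input data -- one candidate's assessment scores
# (hypothetical 1-5 scale).
candidate = {"conscientiousness": 4.0, "numeracy": 3.0}
likelihood = predict_kpi_likelihood(candidate)
```

The point of the sketch is the shape of the pipeline, not the model: an assessment alone produces only the `candidate` dictionary; without the fitted weights and the output KPI data they were trained on, nothing here is predictive.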



For most companies, current pre-hire talent assessments are wasted data. Results are delivered in an individual report that often cannot be analyzed or aggregated. For most “legacy” talent assessments, it’s difficult or impossible to determine what positive (or negative) business effect the assessments are having. It often comes down to “how much the HR person believes the results” vs. “how much the business is able to document and realize real results.”

It can be daunting to figure out which vendors will actually deliver a predictive solution. To help, here are some important questions to ask your talent assessment vendor:

Consider the predictive company itself. Are you dealing with an assessment company that is trying to learn how to be predictive, or a predictive company that also uses assessment data? How long have they been doing predictive work?

Consider the predictive team. Ideally the company will have data scientists on staff as well as industrial organizational (IO) psychologists. This is important because data scientists tend to utilize more modern and rigorous methods for prediction and validation. IO psychologists tend to be focused on the instrument, while data scientists tend to be concerned with predictive validity and business results.

Are they predicting for your company or for everyone? Some companies create “industry benchmarks,” that is, general performance predictions for general industry categories such as retail sales or customer service. These predictions are significantly less accurate, because they are based on companies different from your own, with different cultures, goals and regions. Not all “customer service” is the same. Modern computing methods enable leading providers to create and validate predictive models for your roles in your own company alone, and to continuously update the model over time.

Do they care about your outcome data? Generally, these solutions predict attrition or performance for a candidate or employee. Has the assessment company asked you for the attrition or KPI data for your employees in your target role? If they don’t know your employee outcomes, how can they predict your outcomes? They can’t. Most job roles have multiple KPIs that describe performance. Do they predict each of these separately? For KPIs that naturally contradict each other, e.g., speed vs. accuracy, how does the predictive solution resolve the contradiction? Just getting a “green light” isn’t good enough in many cases. What sample size did they ask for? Real predictions require a reasonable sample to properly validate that you aren’t being fooled by randomness. If they only ask for 15 top performers, your sample is too small to create a real prediction.
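To see why a sample of 15 is too small, consider this sketch (all data are simulated noise, not any real assessment): it searches for the best possible pass/fail cut-off on random scores and coin-flip labels. Evaluated in-sample, a tiny sample can make pure randomness look “predictive,” while a larger sample stays near chance.

```python
import random

random.seed(0)  # fixed seed so the simulation is repeatable

def best_insample_cutoff_accuracy(n):
    """Fit the best possible pass/fail cut-off on pure noise and report
    its in-sample accuracy -- a stand-in for being fooled by randomness."""
    scores = [random.random() for _ in range(n)]     # meaningless "assessment" scores
    top = [random.random() < 0.5 for _ in range(n)]  # coin-flip "top performer" labels
    best = 0.5
    for cut in scores:
        hits = sum((s >= cut) == t for s, t in zip(scores, top))
        best = max(best, hits / n, (n - hits) / n)   # allow either direction of the rule
    return best

small = best_insample_cutoff_accuracy(15)    # tiny sample: noise can look "predictive"
large = best_insample_cutoff_accuracy(1500)  # larger sample: accuracy falls toward chance
```

With 15 people, the best cut-off typically “classifies” well above chance despite there being nothing to learn; with 1,500, the same procedure stays close to 50 percent. That gap is exactly what a proper sample size and out-of-sample validation protect you from.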

Does the solution base predictions on outcome data or a job fit, job match or job blueprint survey? Data science predicts what you ask it to predict. If you want lower attrition or higher KPIs, the models must be trained and validated with those data alone. The process looks for fact-based patterns to drive your business.

Surprisingly, many solutions don’t use this approach, but fall back to managerial bias. These solutions ask well-meaning committees of managers to list competencies that they believe are needed for success in a role. The resulting criteria are not predictive at all; they just find candidates that match the laundry list of beliefs and biases held by that committee. Nowhere in this process is a connection to actual attrition or KPI outcomes. Again, if the system doesn’t know about your outcomes, how can the process predict them? Start with data, not bias.

Does the solution use machine learning to recalibrate your predictive models? How often? Business needs, role descriptions and culture change over time. Local labor conditions change. For example, service representatives may be incentivized to cross-sell related products, or new regulations may introduce new compliance tasks. It is important to update and re-validate your predictive models two to four times a year to keep up with seen and unseen trends. Some solutions have not changed their models for 30 years; do you expect these to find great sales reps for you?
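Operationally, that recalibration cadence might look like the following sketch. The constants are assumptions for illustration, not vendor specifics: a roughly quarterly retrain, on a rolling two-year window of hire records so that old roles, incentives and labor conditions age out of the model.

```python
from datetime import date, timedelta

RECALIBRATION_DAYS = 90      # "two to four times a year" -> roughly quarterly
TRAINING_WINDOW_DAYS = 730   # assumed: train on the last two years of hires

def needs_recalibration(last_trained, today):
    """True when the model is stale under the quarterly schedule."""
    return (today - last_trained).days >= RECALIBRATION_DAYS

def training_window(today):
    """Date range of hire records to retrain on. A rolling window means
    stale roles and labor conditions eventually drop out of the model."""
    return today - timedelta(days=TRAINING_WINDOW_DAYS), today

today = date(2017, 10, 1)
stale = needs_recalibration(last_trained=date(2017, 5, 1), today=today)
start, end = training_window(today)
```

A model last trained five months ago is flagged as stale under this schedule; one trained last month is not.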

The new validation question: criterion validation. HR has been taught to ask whether an assessment is validated. The first level of validation checks whether the assessment’s measures are self-consistent. Continue to ask this question.

But ultimately you care about whether the assessment feeds predictions that accurately correspond to improved business outcomes. That is, are the predictions actually working? This level is called “criterion validation” and is a high bar that is not commonly reached by vendors.

A top-tier predictive talent assessment vendor will perform criterion validation on its solutions several times a year. Criterion validation is the highest level of validation possible, and the one most preferred by regulatory agencies.
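At its simplest, a criterion-validation check asks how strongly pre-hire predictions correlate with the outcomes those same people later achieve on the job. Here is a minimal illustration with invented numbers (the correlation function is standard Pearson r; the data are hypothetical, not from any real study):

```python
import math

def pearson_r(xs, ys):
    """Correlation between model predictions and realized outcomes --
    the core quantity in a criterion-validation check."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: pre-hire predicted performance for five hires vs.
# the KPI each actually achieved after a year on the job.
predicted = [0.2, 0.4, 0.5, 0.7, 0.9]
actual_kpi = [48, 55, 61, 70, 83]  # e.g., units sold per week
r = pearson_r(predicted, actual_kpi)
```

A real criterion study involves far larger samples, significance testing and adverse-impact analysis, but the question it answers is this one: do higher predictions actually go with better outcomes?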

Can you easily access/download your company’s talent assessment data? Talent assessment data is a critical data set for your company. If your talent assessment vendor makes it difficult or impossible to access your talent assessment data, this is a good indication they are using pre-predictive technology and that they don’t appreciate that this data is your asset.

True predictive solutions know that your workforce data scientists will want to use your talent assessment data to find correlations and predictions in many areas of your business. You need to insist on easy and direct access to the raw assessment scores.

How easy is it to deploy the solution into the talent acquisition process and use the predictions? How much training is required? Do your talent acquisition professionals need to read long text reports or get out a calculator to use the predictions? The complexity of a prediction should be kept out of the way of daily operations. If your team still needs to “think” about what the answer is, it is probably not a predictive solution.

Is there a different assessment for every role? Or one assessment with multiple predictive models? Multiple assessments make it impossible to predict one candidate’s performance against multiple roles. This may also be a signal that you are working with an older, legacy (less predictive) talent assessment supplier.

Is there an answer key for their solution on the web? For many assessments, there are answer keys and guides on how to fool or pass the test, which means two things: 1) the test is easily fooled, lacking internal controls to prevent spoofing, and 2) you are looking at an “industry benchmark” with one clear set of answers.

A data science-driven model is custom to your role in your own company and continuously evolving, making it very difficult to defeat with a published answer key or spoofing.

Ask to see the company policy on employee predictive modeling, discrimination, disparate impact and fairness. It is important that a predictive solution has thought through the specific outcomes of its models and how they fit into creating fair opportunity for all applicants. In particular, it is vital for the solution to satisfy or exceed any government requirements for hiring and selection.

Do your own (internal) data scientists approve of the predictive solution? Ask one of your own data scientists (from HR, marketing or another area inside your own company) to accompany you in your evaluation. They should know what is a rigorous approach and what is marketing fluff.

How does the predictive solution regularly prove to you that the models are working? Ideally the company you select will be able to show you two to four times a year how your predictions are working (i.e., turnover is going down, sales are going up, calls are going up, errors are going down, etc.).



