Paper 2


Positive & negative prediction: a criterion-related validity study with a global IT services company

Throughout this symposium, we highlighted the importance of good validity data whilst acknowledging that it can be challenging to collect. In our second paper, we shared insights and reflections from a major validity study we undertook with a global IT services and technology company.

The study examined the relationship between behavioral competencies measured by our Wave assessments and the performance of a group of 200+ salespeople across EMEIA and North America.

We shared how the validity study showed that self-reported talent and motivation for many of the organization’s competencies, as measured by Wave, were significantly correlated with manager effectiveness ratings on those same competencies and, more importantly, with overall performance. In particular, competencies relating to creativity, innovation and empowering people tended to have a strong relationship with good sales performance.

However, there were some surprises. There was a strong negative correlation between the behavioral competency of following procedures and overall performance.

As practitioners in the field of psychological assessment at work, we are naturally pleased with the positive correlations between our assessment and overall performance, but the negative correlations often tell you just as much.

Through sharing this case study, we outlined our approach to ensuring our assessments work, summarized the results, and explained what they meant to the client. We discussed our duty of care as practitioners in helping clients find appropriate meaning in the results.

Most importantly, we left attendees with clear guidance on how to run a best-practice criterion-related validity study to ensure organizations are not using assessments 'blind' to their effectiveness. Our top tips were:



  • Collect good performance data – end-of-year ratings are often insufficient and objective data rarely exists, meaning a well-designed effectiveness survey for managers is often the best method.
  • Link into existing projects – validity studies can be resource-intensive, so look for opportunities where some of the work is already being carried out (for example, development programs where many participants are completing an assessment).
  • Generate hypotheses to test – this prevents the temptation to pick through a significant chunk of data and retrospectively decide which bits tell the story you want (a simple sketch of this kind of pre-specified analysis follows this list).
  • Be prepared to have complex, and sometimes difficult, conversations with your stakeholders to help them make the right interpretations from the data.
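
To illustrate the kind of hypothesis-driven analysis described above, here is a minimal sketch in Python of how pre-specified competency scales might be correlated with an overall performance criterion. The file and column names are illustrative assumptions only; they are not the variables or data from this study.

```python
# Minimal, illustrative sketch of a hypothesis-driven criterion-related
# validity check. File and column names are hypothetical placeholders,
# not the actual study data.
import pandas as pd
from scipy.stats import pearsonr

# Scales hypothesized, up front, to relate to the criterion
hypothesized_scales = [
    "wave_creativity",
    "wave_innovation",
    "wave_empowering_people",
    "wave_following_procedures",
]
criterion = "manager_overall_performance_rating"

# One row per salesperson: assessment scale scores plus manager ratings
df = pd.read_csv("validation_sample.csv")

results = []
for scale in hypothesized_scales:
    pair = df[[scale, criterion]].dropna()
    r, p = pearsonr(pair[scale], pair[criterion])
    results.append({"scale": scale, "n": len(pair),
                    "r": round(r, 2), "p": round(p, 3)})

print(pd.DataFrame(results))
```

Testing only the scales specified in advance, rather than scanning every scale for significant results, is what keeps the analysis from becoming a retrospective fishing exercise.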

Find out about the other papers we presented at the DOP Annual Conference 2020:

PAPER 1

The practice of science in candidate assessment: are our choices informed by science?

PAPER 3

Introducing a new candidate selection method in an evidence-based way: an example from a UK retailer

