Psychometrically brilliant, psychologically flawed

March 6, 2014

Doctor, doctor

First, imagine this scenario. You’re lying in a hospital bed. The doctor comes to see you. You are given a physical examination and asked a series of probing questions. A stethoscope is applied to your chest and back, and your temperature and blood pressure charts are scrutinized. Some blood tests are ordered and you’re sent for a scan. Later that day the doctor returns and provides a diagnosis.

The doctor has used the information in a particular way. A number of hypotheses about what may be wrong with you have been considered. Along the way, individual pieces of information may or may not have been useful; some may have been irrelevant, or even completely misleading. However, the final diagnosis is based on a skilled integration of all the available information, filtered through the doctor’s general understanding of how the body works, plus expert knowledge of a range of conditions and illnesses.

Now, a second scenario: You are an HR manager wanting to recruit a senior manager. You have an up-to-date job description and have compiled a thorough specification of the person you think is required to do the job well. You decide to use a variety of methods to gather information about the candidates. These include asking about their track record, and any particular knowledge and skills that are relevant to the job. You also decide to use psychometric tests to measure specific abilities, and a personality questionnaire to explore how candidates are likely to behave; plus an interview to gather evidence on key competencies, and on things like commitment and motivation.

The right blend

Blending assessment information requires the same skills as those used by the doctor. The information from an interview depends on the candidate telling the truth, as does anything said about previous work history, although this can be verified, to a certain extent, by previous employers. Psychometric tests are most useful when they measure abilities that are directly relevant to the job, although a measure of a candidate’s general mental ability has been shown to be one of the best predictors of work performance in its own right.

Personality questionnaires fall into a different category because, despite sophisticated designs, they are open to candidates putting a positive spin on their answers. That said, trait-based questionnaires can accurately predict behaviour, and they are genuinely useful in providing prompts for interview questions.

Of course no process is one hundred percent predictive, and there will be occasions when the wrong selection decision is made. This is inevitable, but a well-designed process, using a balanced range of exercises, will significantly increase the odds of picking a winner. Indeed the odds are maximized when the person dealing with the information is appropriately trained and experienced, such as a business psychologist. This is where the true value is added, especially when it comes to deftly combining information that ranges from the robust and objective to sources that are entirely self-reported.

It doesn’t work (or does it?)

Criticisms when things go wrong are often binary, especially in the press. For example, if tests or questionnaires have been used (journalists unfailingly lump them together) and the ‘wrong’ candidate is perceived to have been recruited, then it is all too easy to blame the tools. The argument is often that the person sailed through the psychometrics but turned out to be a psychological liability; thus the answer to the question ‘do tests work?’ is obviously a resounding ‘no’. The problem with this logic is that it ignores everything else that has happened, including the due diligence that should have occurred before candidates completed any tests or questionnaires. But let’s face it, poking fun at psychometrics makes a better story, and provides a golden opportunity to use a picture of an inkblot.

In reality psychometrics are the most predictive tools available, certainly many times better than relying on an interview alone. So it’s something of a paradox that the more senior the position, the less likely the candidates are to be assessed fully. It’s a bit like going to hospital, announcing that you are the CEO of a bank, and the doctor sitting on the end of the bed, asking a couple of questions, and making an instant diagnosis: “No need for all those irritating tests, I can tell what’s wrong with you by enquiring about your golf handicap.”

To be serious again, perhaps this is where attention should be focused: on getting the recruitment process for the ‘top jobs’ properly sorted out. The case is actually for more assessment, not less.

The research: Do tests work?

What are candidate assessment measures measuring?

September 8, 2011

Do candidates who are better at predicting what employers are looking for – who have a well-developed ‘ability to identify criteria’ – actually do better at assessment events? Yes… in fact research seems to show that this ability correlates more strongly with job performance than assessment centre scores themselves!

Read this BPS Research Digest article for the full story: What exactly are candidate selection measures measuring?

Psychometrics & return on investment

September 6, 2011

It’s self-evident that selecting people who are better at doing a job, whatever that job is, will have an effect on your bottom line. There’s also plenty of research supporting the idea that psychometric tests and questionnaires, amongst other assessment techniques, can give a significant boost to your ability to spot winners. But how can you show the benefits in terms of hard cash and/or reduced staff turnover?

A number of psychometric test publishers and occupational psychology firms have started to produce Return on Investment (ROI) calculators. These can be used to estimate the financial gains achieved by increasing the quality of hires, and also provide a guide to the likely reduction in staff turnover. If you want to know how they work, or would like to try putting your own figures through, try one of these:

You’ll also find a very readable and insightful article from Talent Q here: Introduction to Measuring the ROI of Assessment.
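Most of these calculators are built on standard utility-analysis arithmetic: the financial gain from better selection is estimated from the number of hires, their tenure, the validity of the selection method, and the monetary spread of job performance. Here is a minimal sketch of that arithmetic; the function name and all of the figures are illustrative assumptions, not taken from any particular vendor’s calculator.

```python
def selection_utility(n_hired, tenure_years, validity, sd_y,
                      mean_z_hired, n_applicants, cost_per_applicant):
    """Rough utility estimate for a selection method (illustrative).

    validity      -- predictive validity of the method (the correlation r)
    sd_y          -- standard deviation of job performance in money terms
    mean_z_hired  -- average predictor score of those hired, in z-units
    """
    gain = n_hired * tenure_years * validity * sd_y * mean_z_hired
    cost = n_applicants * cost_per_applicant
    return gain - cost

# Illustrative figures only: 10 hires from 100 applicants, 3-year tenure,
# validity 0.5, a performance SD of 20,000, and a selected group
# averaging +1 z on the predictor, at 150 per applicant assessed.
print(selection_utility(10, 3, 0.5, 20_000, 1.0, 100, 150))
```

The point of putting your own figures through such a model is that even modest validity, multiplied across many hires and years of tenure, can dwarf the cost of the assessment itself.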

Free psychometric tests & questionnaires

August 22, 2011

In my mission to try to steer people towards useful psychometric practice material, I now add interesting thoughts and snippets to this Facebook page:

This provides links to properly stimulating stuff (e.g. tests & questionnaires that can be used to prepare for selection events), amusing free surveys (e.g. do you have a male or female brain?), interesting articles (e.g. do entrepreneurs have lucky personalities?), and even test publishers who will pay you to try their latest products!

Now and again I also add links to the growing number of psychometric apps for smart phones, Facebook and the like. For instance, did you know that more than three million people have used the free (Big Five) myPersonality app at:

Facebook really is changing the ‘face’ of psychological research…

Do people cheat on psychometric tests?

August 15, 2011

Not as much as you might think… Read the latest research from Psylutions.


Fairy tales and predicting good leaders

August 8, 2011

“A common phenomenon and problem in leadership practice concerns undue reliance on popular fads without sufficient consideration given to the validity of those ideas…” Click on the link below to read a very good review from Amazure Consulting of the evidence on cognitive tests, personality, situational judgement, emotional intelligence and interviews as predictors of leadership ability.

Do situational judgement tests work?

May 25, 2011

Situational judgement tests (SJTs), or tests that assess a candidate’s ‘preferred’ responses to a range of work-based scenarios, are growing in popularity and are now commonplace in many graduate and management recruitment processes. But do they work? Well, it seems there’s pretty good evidence that SJTs do predict job-related criteria such as sales performance or ratings by managers. The first really thorough analysis, conducted by McDaniel et al. (2001) across 95 different studies, concluded that the correlation between SJTs and job performance is in the region of 0.34. Incidentally, McDaniel also found that when SJTs were closely matched to the job in question, via a properly conducted job analysis, the figure rose to 0.38.

The same figure was reported earlier this year by SHL Group, with a composite of 0.38 being achieved for a ‘relating & networking’ criterion and one of their SJTs which is being used by a global retailing organisation.

In addition, various studies have looked at whether SJTs significantly add to the prediction of job performance over and above that achieved by using measures of cognitive ability (psychometric reasoning tests), job experience and personality. Again, McDaniel et al. (2007) found that SJTs provide incremental validity over cognitive ability of between 3 and 5 per cent, i.e. they add something extra to an understanding of ‘thinking’ competencies; and of 6 to 7 per cent over personality questionnaires, i.e. they add even more to an understanding of how someone deploys their personality at work.

P.S. In the great scheme of things, 0.3, which is a ‘moderate’ correlation, is around the point at which things start to become particularly useful, especially if the assessment method in question is being used for volume recruitment.
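Incremental validity, as used above, is simply the extra slice of criterion variance explained when a second predictor is added to the first, i.e. the difference between the two R-squared values. A minimal sketch, with purely illustrative numbers (the 0.50 and 0.54 below are assumptions chosen to land in the 3–5 per cent range McDaniel reported, not figures from the studies):

```python
def delta_r_squared(r_base, r_combined):
    """Incremental validity: the extra share of criterion variance
    explained when a second predictor is added (difference in R-squared)."""
    return r_combined ** 2 - r_base ** 2

# Illustrative only: a cognitive test alone at R = 0.50, rising to
# R = 0.54 once an SJT is added, gives roughly 4% incremental validity.
print(f"{delta_r_squared(0.50, 0.54):.1%}")
```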

Want to know more?

McDaniel, M.A. and Nguyen, N.T. (2001). Situational Judgment Tests: A Review of Practice and Constructs Assessed. International Journal of Selection and Assessment, 9(1-2), 103-113.

McDaniel, M.A., Hartman, N.S., Whetzel, D.L. and Grubb, W.L. (2007). Situational Judgment Tests, Response Instructions and Validity: A Meta-analysis. Personnel Psychology, 60(1), 63-91.

Lievens, F., Peeters, H. and Schollaert, E. (2008). Situational Judgement Tests: A Review of Recent Research. Personnel Review, 37 (4), 426-441.

How many successful entrepreneurs would fail an IQ test?

April 2, 2011

If you would like to be part of some research to find out, follow this link to an online survey designed by Tomas Chamorro-Premuzic of Goldsmiths College, University of London.

Interesting footnote: Tomas was the resident psychologist on Big Brother!

Free stuff @

February 12, 2011

Graduates! Just another reminder that there’s a multitude of free psychometric tests and questionnaires on the ‘links’ page of my website at:

Also a growing number of other sites that offer free taster tests, for example you will find verbal, numerical and inductive reasoning tests at:

Do psychometric tests work?

February 3, 2011

Good question. What people usually mean when they ask if they work is: do tests predict anything useful about future work performance? The short answer is a resounding ‘yes’. As long as a test is used to measure an ability that is actually required of a particular job, then predictive validities are often in the 0.5-0.6 range. What this means is that at the top end of the scale, a test (the predictor) explains 36% ((0.6 x 0.6) x100) of the variance in the criterion – the criterion being something like a measure of productivity. By way of contrast other assessment methods such as the interview are often far less effective. A semi-structured interview would weigh in at 0.38 (14%) or thereabouts. And to get the whole thing in perspective, just in case you’re not impressed, in other fields such as the drug industry, predictive-type validities are often lower. For example, the association between Ibuprofen (the well-known anti-inflammatory) and pain reduction is in the region of 0.14 (2%) – see Robert Hogan’s article, details below.
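The variance-explained arithmetic in the paragraph above is just the square of the correlation. A quick sketch, using the figures already quoted:

```python
def variance_explained(r):
    """Share of criterion variance a predictor accounts for: r squared."""
    return r * r

# Figures from the text: a well-matched ability test, a semi-structured
# interview, and (for perspective) Ibuprofen vs pain reduction.
for method, r in [("Ability test (well matched)", 0.60),
                  ("Semi-structured interview", 0.38),
                  ("Ibuprofen vs pain reduction", 0.14)]:
    print(f"{method}: r = {r:.2f} -> {variance_explained(r):.0%} of variance")
```

Squaring is why the gap between a 0.6 predictor and a 0.38 predictor is bigger than it first looks: 36% of the variance versus about 14%.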

Want to know more? Here are some key references:

  • Bertua, C., Anderson, N., and Salgado, J.F. (2005). The Predictive Validity of Cognitive Ability Tests: A UK Meta-Analysis. Journal of Occupational and Organizational Psychology, 78(3), 387-410.
  • Hogan, R. (2005). In Defense of Personality Measurement: New Wine in Old Whiners. Human Performance, 18, 331-341.
  • Hunter, J.E, & Hunter, R.F. (1984). Validity and Utility of Alternative Predictors of Job Performance. Psychological Bulletin, 96, 72-98.
  • Schmidt, F.L, & Hunter, J.E. (1998). The Validity and Utility of Selection Methods in Personnel Psychology: Practical Implications of 85 Years of Research Findings. Psychological Bulletin, 124, 262-274.

