Diversity pays. Organisations that promote diversity and eliminate bias see their share prices significantly outperform the market, according to Credit Suisse's Gender 3000 report, which analyses the commercial impact of diversity.
Because of this diversity premium, investors are increasingly looking at how boards and executive teams prioritise diversity and eliminate bias. Recruitment, a process at the core of any company’s ability to attract and retain the best talent, is particularly prone to bias. It is also one of the easiest to debias.
Recruiters and hiring managers come with deeply entrenched cognitive biases: mental shortcuts used to make judgements, from which biases arise. Even the most fair-minded hiring manager cannot help but feel just a tiny bit more affection for somebody from the same hometown or who attended the same school. Or be swayed favourably, even if just slightly, by a candidate’s self-confidence. Or pay too much attention to a dress, jewellery or a haircut, and too little to the things that matter.
The list of biases is long and well-documented. To date, no researcher has found a functioning human devoid of any bias. Indeed, to be human is to be biased.
Can those biases be trained away? The evidence suggests at best only a little, and only for a short period. It is easier, and more effective, to debias the recruitment process itself: use structured interviews and a consistent approach, an algorithm if you will, to rate candidates on factors that predict job performance. Better still if the same interviewer assesses every candidate.
This careful configuration, however, creates a scalability problem. Human recruiters have only so much capacity, and when volumes increase, the process begins to fail. Bias is reintroduced.
To effectively debias recruitment, you have to focus on both process and scale.
This is where AI assistants, such as PHAI, come in. Just as spreadsheets revolutionised the work of finance professionals a few decades ago, AI assistants will dramatically increase the quality, speed and fairness of recruiters and hiring managers.
The evidence is compelling. Research suggests that the typical human decision-maker is approximately 11% to 36% more likely to recommend one gender over the other. So on a good day, within a good process, the best we can expect of a human recruiter is about 11% bias, and we should be prepared for anything up to 36%.
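Bias measured this way is simply the relative difference in recommendation rates between the two groups. A minimal sketch of the calculation (the function name and the example rates are illustrative, not from the research cited above):

```python
def gender_bias_pct(rate_a: float, rate_b: float) -> float:
    """Relative difference in recommendation rates between two groups.

    Returns how much more likely the favoured group is to be
    recommended, as a percentage (e.g. 11.0 means 11% more likely).
    """
    favoured, other = max(rate_a, rate_b), min(rate_a, rate_b)
    return (favoured / other - 1) * 100

# A recruiter who recommends 50% of one gender but only 45% of the other:
print(round(gender_bias_pct(0.50, 0.45), 1))  # 11.1
```

On this measure, a recruiter recommending 50% of one gender and 45% of the other already sits at the optimistic end of the 11%–36% range.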
A fair number of recruiters will therefore fail to stay within the bias limits advocated by regulatory bodies such as the EEOC, whose 4/5ths rule requires the selection rate of any group to be at least 80% of the highest group's.
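The EEOC's 4/5ths (80%) rule can be checked in a few lines. A minimal sketch, assuming selection rates per group are already known (the function name and example rates are illustrative):

```python
def passes_four_fifths(rate_a: float, rate_b: float) -> bool:
    """EEOC 4/5ths rule: the selection rate of the lower-rate group
    must be at least 80% of the higher-rate group's selection rate."""
    lower, higher = min(rate_a, rate_b), max(rate_a, rate_b)
    return lower / higher >= 0.8

# A 36%-biased recruiter: one group selected at 50%, the other at ~36.8%.
print(passes_four_fifths(0.50, 0.368))  # False (ratio 0.736 < 0.8)
```

Note that an 11%-biased recruiter (say 50% vs 45%, ratio 0.9) still passes, while one at the 36% end of the range clearly fails.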
Can an AI assistant such as PHAI do better? By measuring gender bias over 12 months across 821k applications, a compelling picture emerges: the average gender bias detected is 1.3%, making PHAI about eight times less biased than even a reasonably unbiased human.
The chart below shows the gender bias detection analysis by industry over the same 12-month period. In every industry, PHAI significantly outperforms what you can expect from even the least biased human. The AI is much better at giving every gender a fair go, and it effortlessly deals with vast volumes.
Sometimes bias does creep into AI-assisted judgement, but it is much easier to fix in an AI than in a human. For a large, multinational discount retailer, a minor gender bias was detected six weeks after deploying a new model, visible as the difference between the actual and expected values in the chart below. Debiasing the model was a simple retraining exercise of a few hours with a data scientist. After redeploying the retrained model, the AI assistant became virtually free of gender bias.
And that is the promise of AI in recruitment. Eliminating bias at scale, unlocking the Diversity Premium.
Get our insights newsletter to stay in the loop on how we are evolving PredictiveHire.