There are steps we can take to eliminate bias in recruitment, and it begins with not relying on CVs to evaluate candidates.
CVs are full of information that is irrelevant to assessing a person’s suitability to do a job.
Instead, they highlight things we often use to confirm our biases, and draw our attention away from other key attributes or aptitudes that might make someone especially suitable for a job.
For example, if a CV mentions a certain university it might pique our attention (a form of pedigree bias). This is problematic, as there may be socio-economic reasons why someone attended a certain university (or did not attend another), and CVs do little to reveal this. Situations like this confirm the biases that led to them in the first place, compounding long-term systemic inequities.
Additionally, CV data reduces a candidate pool in a way that does not optimise for better fits for the role, because it relies on the wrong input data and criteria to find a candidate. Amazon discovered this when it abandoned its machine-learning-based recruiting engine built on CV data, after it emerged that the engine discriminated against women.
Automation has been key to Amazon’s dominance, so the company created an experimental hiring tool that used artificial intelligence to give job candidates scores ranging from one to five stars.
The issue was not the use of AI, but rather its application. Amazon’s computer models were trained to vet applicants by observing patterns in resumes submitted to the company over a 10-year period. Most came from men,
a reflection of male dominance across the tech industry. As a result of being fed predominantly male resumes, Amazon’s system taught itself that male candidates were preferable. It penalised resumes that included the word ‘women’ as in “women’s chess club captain.” It also downgraded graduates of all-women’s colleges.
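The mechanism behind this failure can be sketched with a toy scoring model. The resumes, tokens, and hiring outcomes below are invented purely for illustration; the point is that when historical outcomes are skewed, any token correlated with an under-hired group picks up a penalty, even though gender never appears as an explicit input:

```python
from collections import Counter

# Hypothetical historical data: (resume tokens, hired?) pairs.
# Men dominate the pool, so tokens that mostly appear on women's
# resumes are rare among successful applications.
history = [
    (["chess", "club", "engineer"], True),
    (["robotics", "engineer"], True),
    (["chess", "engineer"], True),
    (["womens", "chess", "club", "engineer"], False),  # one of few women's resumes
]

def token_scores(history):
    """Score each token by the hire rate of resumes containing it."""
    seen, hired = Counter(), Counter()
    for tokens, was_hired in history:
        for token in set(tokens):
            seen[token] += 1
            if was_hired:
                hired[token] += 1
    return {token: hired[token] / seen[token] for token in seen}

scores = token_scores(history)
# "womens" appears only on a rejected resume, so a model trained on this
# history penalises it: the candidate's gender leaks in via a proxy token.
```

Here `scores["womens"]` comes out at zero while neutral tokens like `"robotics"` score highly, mirroring how Amazon's system downgraded phrases such as "women's chess club captain" without ever being told to consider gender.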
Studies have shown that unintended systemic bias occurs when reviewers assess resumes that are identical apart from a name signifying racial background or gender, or a signifier of LGBTQIA+ status.
A common response has been to remove names and other identifiable data from interviews or CV screening, but these blinded processes have still exhibited bias issues like those discussed earlier.
In order to be truly blind, any input data needs to be clean and objective. This means that it gives no insight into someone’s age, gender, ethnicity, socio-economic standing, education, or even past professional experience.
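As a minimal sketch of what field-level blinding might look like, the snippet below strips identifying fields from an application record before screening. The field names are hypothetical, and in practice identity leaks through far subtler proxies than explicit fields, which is why blinding alone has not been enough:

```python
# Hypothetical fields that leak identity. A real pipeline would need to
# handle many subtler proxies (hobbies, postcodes, phrasing) as well.
IDENTIFYING_FIELDS = {"name", "date_of_birth", "gender", "university", "address"}

def blind(application: dict) -> dict:
    """Drop fields that could reveal age, gender, ethnicity,
    socio-economic standing, or education."""
    return {k: v for k, v in application.items() if k not in IDENTIFYING_FIELDS}

application = {
    "name": "A. Candidate",
    "university": "Example University",
    "answers": "I handle conflict by ...",
}
# Only the substantive answer survives blinding.
blinded = blind(application)
```

Even this stricter blinding only removes explicit signals; free-text answers can still carry demographic cues, which is the deeper challenge clean, objective input data has to address.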
To truly disrupt bias, recruiters and hiring managers should utilise a new wave of HR tech tools such as PredictiveHire, stepping away from using CV data as a way to determine job suitability.
Get our insights newsletter to stay in the loop on how we are evolving PredictiveHire