Fair for candidates

When every applicant answers the same questions, the playing field is levelled. Everyone who completes the questions has a fair and equal chance of securing the role, and there are no right or wrong answers.

Fair for customers

Every customer wants to know they are drawing on the full potential of their talent pool. Using our technology to shortlist candidates means you never miss out on talent, and you preserve and enhance the diversity of your talent pool.

Fair data use

If your training data includes age, gender, ethnicity or other visible markers of bias, machine learning will amplify that bias. None of that data makes its way into our predictive models.
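As a minimal illustration of what excluding protected attributes can look like in practice, the sketch below drops such columns before anything reaches a model. The column names, data and helper function are invented for the example; this is not a description of PredictiveHire's actual pipeline.

```python
# Illustrative only: a hypothetical pre-processing step that removes
# protected attributes before any features reach a predictive model.
# Column names and data are invented for this example.
import pandas as pd

PROTECTED_ATTRIBUTES = ["age", "gender", "ethnicity"]

applicants = pd.DataFrame({
    "candidate_id": [1, 2, 3],
    "age": [24, 41, 35],
    "gender": ["F", "M", "F"],
    "ethnicity": ["A", "B", "C"],
    "response_text": ["...", "...", "..."],  # the interview answers that are actually modelled
})

def strip_protected(df: pd.DataFrame) -> pd.DataFrame:
    """Return a copy of the applicant data with protected attributes removed."""
    return df.drop(columns=[c for c in PROTECTED_ATTRIBUTES if c in df.columns])

features = strip_protected(applicants)
print(features.columns.tolist())  # ['candidate_id', 'response_text']
```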

When it comes to using training data to build models, we perform health checks on every model that we build – removing bias before our models are deployed.

The benefit of machine learning is that, unlike human decision-making, its outcomes are testable and its corrective measures stay consistent once applied. Because we can test both training data and outcome data continuously, we can detect and correct even slight bias as soon as it appears.
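One common, generic way to test outcome data for bias is the "four-fifths rule": compare shortlisting rates across groups and flag the result if the lowest rate falls below 80% of the highest. The sketch below is an assumption-laden illustration of that idea, not PredictiveHire's actual health check; the group labels, data and threshold handling are invented.

```python
# Illustrative only: a generic four-fifths (adverse impact ratio) check on
# outcome data. Not PredictiveHire's actual health check; data is invented.
from collections import defaultdict

def selection_rates(outcomes):
    """outcomes: iterable of (group, was_shortlisted) pairs -> shortlisting rate per group."""
    totals, selected = defaultdict(int), defaultdict(int)
    for group, shortlisted in outcomes:
        totals[group] += 1
        selected[group] += int(shortlisted)
    return {g: selected[g] / totals[g] for g in totals}

def adverse_impact_ratio(outcomes):
    """Ratio of the lowest group shortlisting rate to the highest."""
    rates = selection_rates(outcomes)
    return min(rates.values()) / max(rates.values()), rates

outcomes = [("group_a", True), ("group_a", False), ("group_a", True),
            ("group_b", True), ("group_b", False), ("group_b", False)]

ratio, rates = adverse_impact_ratio(outcomes)
print(rates, round(ratio, 2))
if ratio < 0.8:  # the conventional four-fifths threshold
    print("Potential adverse impact: investigate and correct before deployment.")
```

Run continuously over both training data and live outcome data, a check of this kind is what makes bias detectable and correctable rather than invisible.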

Get our insights newsletter to stay in the loop on how we are evolving PredictiveHire.