
Hiring with AI: fairer, faster and better

Known biases can be removed with the right data and testing

By Team PredictiveHire


As humans, we often don’t trust what we can’t see and we can’t trust what we don’t understand.

We believe that if an algorithm affects someone’s life, you need to be able to see how that algorithm works.

Transparency and explainability are fundamental ingredients of trust, and there is plenty of research to show that high trust relationships create the most productive relationships and cultures.

We are committed to building ethical and engaging assessments. This is why we have taken the path of a text chat with no time pressure: candidates can take their own time, reflect, and submit their answers in text form. Apart from the well-known problems with reading facial expressions, we believe that technologies such as voice-to-text can add an extra layer of errors. We do not scrape publicly available data such as LinkedIn profiles, nor do we use behavioural data like how fast a candidate completes the chat or how many corrections they make. We strictly use the candidates’ final submitted answers and nothing else.

Our approach has led to candidates loving the text experience, as measured by the feedback they leave and our Net Promoter Score (NPS).

FirstInterview is a true blind assessment

No demographic details are collected from candidates, nor used to influence their ranking. Only the candidates’ answers to relevant interview questions are analysed by our scientifically validated algorithm to assess their fit for the role.

Bias can be removed with the right data

Biases can occur in many different forms. Algorithms and AI learn according to the profile of the data we feed them. If the data they learn from is taken from CVs, they will only amplify our existing biases. Only clean data, like the answers to specific job-related questions, can give us a truly bias-free outcome. Potential biases in the training data can be tested for and measured, so we continuously test the data that trains the machine for known biases, such as differences between gender and race groups; if ever the slightest bias is found, it can be corrected. These tests can be extended to other groups of interest where those groupings are available, such as English As Second Language (EASL) users.

Here are a few examples (a sketch of each check follows the list):

  • Proportional Parity Test: “Is there an adverse impact on our recommendations?”
  • Score Distribution Test: “Are the assessment score distributions similar across groups of interest?”
  • Fairness Test: “Is the assessment making the same rate of errors across groups of interest?”
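
To make these checks concrete, here is a minimal sketch of how each one could be run, assuming candidate scores, recommendations, outcomes and group labels are available as NumPy arrays. The function names, the four-fifths threshold and the choice of statistics (a two-sample Kolmogorov-Smirnov test, a raw error-rate comparison) are illustrative assumptions, not PredictiveHire’s actual implementation.

import numpy as np
from scipy.stats import ks_2samp

# Hypothetical assessment results: a score, a shortlist recommendation,
# an eventual outcome, and a separately held group label per candidate.
rng = np.random.default_rng(0)
n = 1000
scores = rng.normal(50, 10, n)
recommended = scores > 55
actual_good_fit = rng.random(n) < 0.5
group = rng.choice(["A", "B"], n)

def proportional_parity(recommended, group):
    """Adverse-impact check: each group's recommendation rate as a ratio of
    the highest group's rate (the 'four-fifths rule' flags ratios below 0.8)."""
    rates = {g: recommended[group == g].mean() for g in np.unique(group)}
    top = max(rates.values())
    return {g: rate / top for g, rate in rates.items()}

def score_distribution_similarity(scores, group):
    """Two-sample Kolmogorov-Smirnov test for each pair of groups; a small
    p-value suggests the score distributions differ."""
    groups = list(np.unique(group))
    return {(a, b): ks_2samp(scores[group == a], scores[group == b]).pvalue
            for i, a in enumerate(groups) for b in groups[i + 1:]}

def error_rate_parity(recommended, actual, group):
    """Fairness check: does the assessment make errors at the same rate
    for every group of interest?"""
    return {g: (recommended[group == g] != actual[group == g]).mean()
            for g in np.unique(group)}

print(proportional_parity(recommended, group))
print(score_distribution_similarity(scores, group))
print(error_rate_parity(recommended, actual_good_fit, group))

In practice the group labels would be held apart from the assessment pipeline, since FirstInterview itself is blind: they are used only to audit the model, never to score candidates.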

PredictiveHire uses all of these tests and more.


Join the movement

To keep up to date on all things “Hiring with AI”, subscribe to our blog! 😀

You can try out PredictiveHire’s FirstInterview right now, or leave us your details here to get a personalised demo.
