When used properly, data amplifies inclusive hiring.

INCLUSIVITY CREATES FAIRNESS

Humans are prone to unconscious bias

A recruiter's first screen of a resume takes roughly six seconds. So what is it that they are seeking?

A job role can attract thousands of applicants. When recruiters initially screen applicants, they look for 'shorthand' clues to confirm a pre-existing judgement of what 'predicts' success, like a particular degree or score. Without knowing it, this approach undermines their ability to hire the best candidates.

There is a better way.
FirstInterview is a true blind assessment. No demographic details are collected from candidates, nor used to influence their ranking. Only the candidates' answers to relevant interview questions are analysed by our scientifically validated algorithm to assess their fit for the role.


BLIND SCREENING CREATES TRUST

Everyone’s story is bigger than their CV

People are so much more than their CV, yet favouring a name, gender or institute over the individual is a common practice.

We cannot build inclusive industries without taking steps to remove unconscious bias from our hiring decisions. Being aware of our bias is one thing; removing it is another entirely.

It starts with a conversation. And a fair go.
Using FirstInterview means everyone gets the chance to do an interview and an opportunity to tell their story.

HOW IT WORKS >

WHERE DIVERSITY MATTERS

Bias can be removed with the right data

Algorithms and AI learn according to the profile of the data we feed them.

If the data they learn from is taken from a CV, they will only amplify our existing biases. Only clean data, like the answers to specific job-related questions, can give us a truly bias-free outcome.

We continuously test the data that trains the machine so that if ever the slightest bias is found, it can be corrected.

Here's the science >

Bias testing - holding ourselves accountable

Biases in data can be tested for and measured, and each known bias can be added to a suite of automated tests. PredictiveHire runs all of these tests and more.

Proportional Parity Test

"Is there adverse impact in our recommendations?"


Score Distribution Test

"Are the assessment score distributions similar across groups of interest?"


Fairness Test

"Is the assessment making the same rate of errors across groups of interest?"

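As an illustration only, the Proportional Parity Test can be sketched with the well-known four-fifths (80%) rule: the selection rate for any group of interest should be at least 80% of the highest group's selection rate. The function names, group labels, and counts below are hypothetical, not PredictiveHire's actual implementation.

```python
# Hedged sketch of a proportional parity (adverse impact) check,
# based on the four-fifths rule. All names and numbers are illustrative.

def selection_rates(outcomes):
    """outcomes maps group -> (selected, total); returns group -> rate."""
    return {g: sel / tot for g, (sel, tot) in outcomes.items()}

def adverse_impact_ratio(outcomes):
    """Ratio of the lowest group selection rate to the highest."""
    rates = selection_rates(outcomes)
    return min(rates.values()) / max(rates.values())

def passes_four_fifths_rule(outcomes, threshold=0.8):
    """True if no group's selection rate falls below 80% of the highest."""
    return adverse_impact_ratio(outcomes) >= threshold

# Example: two groups of interest with similar selection rates.
outcomes = {"group_a": (45, 100), "group_b": (40, 100)}
print(adverse_impact_ratio(outcomes))     # 0.40 / 0.45 ≈ 0.889
print(passes_four_fifths_rule(outcomes))  # above the 0.8 threshold
```

A ratio below the threshold would flag possible adverse impact for review; the Score Distribution and Fairness Tests would then probe whether score distributions and error rates also differ across the same groups.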
WANT MORE INFO? LET'S CHAT

DETECTING ANOMALIES

Text-based AI can’t be outsmarted

It’s hard to detect gaming in multiple-choice personality assessments, and impossible to detect plagiarism.

AI is a super detector
Anomalies in answers are detected and flagged, including those that contain:

  • plagiarism
  • meaningless responses
  • semantic anomalies: answers significantly different from expected responses
  • profanity
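To make the idea concrete, here is a minimal, purely illustrative anomaly screen for free-text answers. Real semantic-anomaly detection would compare text embeddings against expected-answer embeddings; the word list, thresholds, and exact-match plagiarism check below are stand-in assumptions, not the production system.

```python
# Illustrative sketch only: a toy anomaly screen for free-text answers.
# PROFANITY is a placeholder word list, an assumption for this example.

PROFANITY = {"damn", "hell"}

def flag_answer(answer, known_answers, min_words=5):
    """Return a list of flags raised for a candidate's free-text answer."""
    flags = []
    words = answer.lower().split()
    if len(words) < min_words:
        flags.append("meaningless")  # too short to carry real content
    if any(w.strip(".,!?") in PROFANITY for w in words):
        flags.append("profanity")
    if answer.strip().lower() in (a.strip().lower() for a in known_answers):
        flags.append("plagiarism")  # exact duplicate of a known answer
    return flags

print(flag_answer("Too short.", []))                                   # flags as meaningless
print(flag_answer("I enjoy solving hard problems with my team.", []))  # no flags
```

Flagged answers would be surfaced for human review rather than auto-rejected.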

Zero-bias hiring is here now.
Join the movement.

Here are our latest articles on reducing bias, removing discrimination
and building an inclusive workforce through better hiring.

Deterring age discrimination … count those mature hires ‘in’!

Once upon a time … we were all happily employed and worked in our jobs until we reached the age of 65. Then we retired with a gold watch and lived happily ever after.  While […]

Read On

Diversity hiring and six top tips to get it working for you

What is workplace diversity? While workplace diversity might once have been considered a ‘nice to have’, today it’s a ‘must-have’ for employers who recognise the value it brings to their organisation. The idea of workplace […]

Read On

Mirror mirror on the wall …

In recent years, we have all wised up to the risk of using CVs to assess talent. A CV as a data source is well known to amplify the unconscious biases we have. A highly […]

Read On

Yes, reducing bias is firmly on our agenda.

Let's find a time to chat and make a difference.

Get our insights newsletter to stay in the loop on how we are evolving PredictiveHire