Can AI help employers overcome bias when hiring?

Bias is often unintentional, but it can severely damage the culture and environment of a workplace. Here's how AI can help avoid it.
6 April 2018

Avoiding bias in teams is critical, and AI can help. Source: Shutterstock.

Bias is prejudice in favour of or against one thing, person, or group compared with another.

In the workplace, bias or prejudice negatively impacts equality and diversity: two factors that are now well established as being the backbone to a successful and resilient company.

Most of the time, however, bias is completely unintentional and often learned implicitly.

Dr. Van Bavel and Dr. West, associate professors of psychology at New York University and senior scientists at the NeuroLeadership Institute, writing in The Wall Street Journal, explain this further: “Research has shown that men and women alike start to treat minorities differently within milliseconds of seeing them. Our brains automatically carve the world into in-group and out-group members and apply stereotypes within the blink of an eye.”

Personal development and challenging internal assumptions can help combat bias in recruitment, but an easier solution might be simply to use artificial intelligence to weed it out.

According to Indeed recruitment consultants, more than a third of people (35%) think AI will help remove bias from the recruitment process and make it fairer.

Almost half of those under 35 (48%) believe that automation will help make recruitment less biased, and more than a third (36%) think removing humans from the process entirely will solve the problem.

Taking this idea further, recruitment software specialist Workable has partnered with Human, an artificial emotional intelligence firm, to remove human bias and unconscious discrimination from the traditional job interview.

Via software that uses partial facial recognition, robust camera angles and pixelated raw data, the technology deciphers how a candidate is feeling during a live interview. In particular, it detects key characteristic and behavioural traits from subliminal facial expressions.

“Imagine you don’t know who the candidate is behind a video interview? Their age, gender or race isn’t up for question. You simply review candidates based on their personality traits and make an informed decision,” says Yi Xu, CEO and founder at Human.

“Our technology provides a level of intelligence that was previously unattainable and removes some of the ‘damaging’ human bias that so often clouds our judgement.”

Human’s AEI software will be integrated with Workable’s all-in-one platform, letting recruiters screen candidates three times faster, send out video interviews via their Applicant Tracking System, and link the results to each candidate’s timeline.

AI can also help reduce bias at the CV-screening stage. Research shows that employers spend just 8.8 seconds reading a CV. Something as simple as a name could sway someone, in a split second, to put a CV in the ‘reject’ pile. A BBC investigation, for example, found that a job seeker with an English-sounding name was offered a job three times more often than an applicant with a Muslim name.

To avoid this, AI-powered programmes can scan an applicant’s CV and pick out the truly important details: skills, writing accuracy, education and so on.
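The idea behind this kind of ‘blind’ screening can be sketched in a few lines. This is a minimal illustration, not any vendor’s actual product: real anonymisation tools use far richer entity recognition, and the function name and patterns here are invented for the example.

```python
import re

def redact_cv(cv_text, candidate_name):
    """Strip identifying details from a CV before it is reviewed.

    A toy sketch of blind screening: mask email addresses and every
    part of the candidate's name, leaving skills and experience intact.
    """
    # Mask email addresses first, since they often embed the name.
    redacted = re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", "[EMAIL]", cv_text)
    # Mask each part of the candidate's name wherever it appears.
    for part in candidate_name.split():
        redacted = re.sub(re.escape(part), "[REDACTED]", redacted,
                          flags=re.IGNORECASE)
    return redacted

cv = "Adam Smith\nadam.smith@example.com\nPython developer, 5 years' experience."
print(redact_cv(cv, "Adam Smith"))
# The reviewer now sees only the skills line, a masked name and a masked email.
```

The point of the design is that the human reviewer never sees the attribute (the name) that the BBC investigation found to sway decisions.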

MeVitae, for example, is working with Oxford University’s Physics Department to measure how employers’ brains react to Harvard University’s IAT test, different CVs and sentences, using eye-tracking devices, neuroimaging and an Emotiv electroencephalogram.

It uses this information to identify employers’ unconscious bias and then, using big data and artificial intelligence, builds a bias-correction algorithm that scores, ranks and shortlists candidates for jobs, irrespective of social class, race, ethnicity or gender.
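The principle of such a shortlisting system — score and rank candidates only on job-relevant fields, so protected attributes never enter the calculation — can be sketched as follows. The field names, weights and scoring formula here are invented for illustration and are not MeVitae’s actual algorithm.

```python
def score(candidate, required_skills):
    """Score a candidate purely on skills overlap and experience.

    Protected attributes (race, gender, class) are simply not inputs,
    so they cannot influence the ranking. Weights are illustrative.
    """
    skill_match = len(set(candidate["skills"]) & required_skills) / len(required_skills)
    experience = min(candidate["years_experience"], 10) / 10  # cap at 10 years
    return 0.7 * skill_match + 0.3 * experience

def shortlist(candidates, required_skills, top_n=2):
    """Rank all candidates by score and keep the top few."""
    ranked = sorted(candidates, key=lambda c: score(c, required_skills), reverse=True)
    return ranked[:top_n]

required = {"python", "sql"}
pool = [
    {"id": "A", "skills": ["python", "sql"], "years_experience": 4},
    {"id": "B", "skills": ["python"], "years_experience": 10},
    {"id": "C", "skills": ["excel"], "years_experience": 2},
]
print([c["id"] for c in shortlist(pool, required)])  # → ['A', 'B']
```

Candidate A wins on skills match despite less experience; nothing about who the candidates are, beyond what the job requires, is ever consulted.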

A similar platform called Textio uses algorithms to help recruiters write the most effective job post, avoiding language that might put certain people off applying. It analyses the hiring outcomes of more than 10 million job posts a month to predict their performance and give real-time feedback on how to improve it. According to the company, on average hiring teams using Textio Score recruit 23% more women.
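A toy version of this kind of language check — in the spirit of tools like Textio, though the word lists below are illustrative samples, not Textio’s actual model — just looks for words that research on job adverts has associated with gender-coded language:

```python
# Illustrative word lists; a real tool would use a trained model
# built from the outcomes of millions of job posts.
MASCULINE_CODED = {"ninja", "rockstar", "dominant", "competitive", "fearless"}
FEMININE_CODED = {"supportive", "collaborative", "nurturing", "empathetic"}

def flag_coded_words(job_post):
    """Return the gender-coded words found in a job advert."""
    words = {w.strip(".,!?:;").lower() for w in job_post.split()}
    return {
        "masculine": sorted(words & MASCULINE_CODED),
        "feminine": sorted(words & FEMININE_CODED),
    }

ad = "We want a competitive rockstar engineer to join our collaborative team."
print(flag_coded_words(ad))
# → {'masculine': ['competitive', 'rockstar'], 'feminine': ['collaborative']}
```

A recruiter seeing this output could then swap the flagged words for neutral alternatives before publishing the post.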

AI can, of course, be biased itself, if the programmer imparts their own prejudices onto the algorithm or if it is trained on data from which it draws unintended conclusions.

But these are largely kinks in a system that can be ironed out. An employer investing in AI to help root out bias is actively working to improve diversity in their workplace, and at the very least the technology might offer some surprising insights.