How AI is making recruitment fairer
Recruitment can be a challenging process, starting with writing the perfect job advert and ending with the tough decision of which interviewee is best suited to fill the position.
One issue prevalent throughout the recruitment process is bias.
Unfortunately, even with the very best intentions, unconscious biases – defined as “automatic, mental shortcuts used to process information and make decisions quickly” – can have a negative impact on our recruiting decisions. What makes this difficult is that they operate outside our control.
There is a wealth of research that shows the benefits of encouraging diversity in the workplace. It adds a variety of perspectives, opinions, knowledge, and skills to the table.
But with an inherent tendency to be biased when hiring, how are businesses able to create a diverse workplace?
Fortunately, we can enlist the help of technology to solve this very human problem. We are seeing an increase in the number of startups aiming to combat the problem of unconscious bias in hiring.
Here are just a few examples of companies changing the recruitment game with AI:
Textio is a web-based application used by big-name companies such as Twitter, Johnson & Johnson, and Expedia. It uses artificial intelligence to assess job advertisements and ensure they are not gender-biased.
Textio acts like a sort of spelling and grammar checker, flagging up words and phrases that are gender-biased, or otherwise off-putting.
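To make the idea concrete, here is a minimal sketch of how a checker in this spirit might flag gender-coded wording in a job advert. The word lists and function name are illustrative assumptions, not Textio's actual vocabulary, model, or API:

```python
# Illustrative sketch only: the word lists below are small hypothetical
# examples of "gender-coded" terms, not Textio's real data or method.
MASCULINE_CODED = {"ninja", "rockstar", "dominant", "competitive", "aggressive"}
FEMININE_CODED = {"nurturing", "supportive", "collaborative"}

def flag_gendered_terms(advert: str) -> dict:
    """Return gender-coded words found in the advert text."""
    # Normalize: lowercase each word and strip trailing punctuation.
    words = {w.strip(".,!?:;").lower() for w in advert.split()}
    return {
        "masculine": sorted(words & MASCULINE_CODED),
        "feminine": sorted(words & FEMININE_CODED),
    }

ad = "We want a competitive rockstar to join our collaborative team."
print(flag_gendered_terms(ad))
# → {'masculine': ['competitive', 'rockstar'], 'feminine': ['collaborative']}
```

A real tool would go well beyond keyword matching – scoring phrases in context and suggesting alternatives – but the flag-and-report loop is the same basic shape.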
Pymetrics is a gamified recruitment tool that aims to reduce bias through its hiring platform. Instead of submitting a resume full of personal information that may skew the recruiter's judgment, candidates play neuroscience-based games that present different tasks to solve.
These tasks assess cognitive and emotional traits, gathering much richer data than what could be found on a resume.
This blind-screening method of hiring has, so far, achieved favorable results for many of the company's big clients, such as Unilever, Tesla, and LinkedIn.
According to Pymetrics, its clients hired 18 percent more women and 16 percent more candidates from minority backgrounds.
Checkr is an AI tool that carries out background checks on prospective employees.
The “ban the box” movement urges companies to remove the “have you ever been convicted of a crime?” checkbox from job applications, in order to give people with past convictions a fairer chance at employment. Many candidates have their applications tossed in the trash, regardless of what the crime may have been.
Co-founder of Checkr, Daniel Yanisse, says that the software was designed to reduce bias in hiring and give qualified candidates a chance at employment.
According to ACLU research, almost 75 percent of formerly incarcerated individuals are still unemployed a year after release. Furthermore, according to the ACLU, employee retention is higher for people with criminal histories than for the general population.
Checkr works by allowing clients to customize the software to ignore data about criminal offenses that are not relevant to a specific job.
For example, a traffic violation may not have any impact on someone applying for a customer service role. But if the job was for an Uber driver, it probably would matter.
Companies can also customize the software to account for time frames: a crime that happened before a certain year, for example, may no longer be a real concern.
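The two filters described above – ignoring irrelevant offense categories and discounting old records – can be sketched in a few lines. The record fields, category names, and cutoff logic here are hypothetical illustrations, not Checkr's actual data model or API:

```python
# Hypothetical sketch of role-specific background-check filtering.
# Field names and categories are invented for illustration.
from dataclasses import dataclass

@dataclass
class Record:
    category: str  # e.g. "traffic", "theft"
    year: int

def relevant_records(records, ignored_categories, cutoff_year):
    """Keep only records a given role's policy treats as relevant."""
    return [
        r for r in records
        if r.category not in ignored_categories and r.year >= cutoff_year
    ]

history = [Record("traffic", 2012), Record("theft", 2005), Record("traffic", 2020)]

# Customer-service role: traffic violations ignored, nothing before 2010.
print(relevant_records(history, ignored_categories={"traffic"}, cutoff_year=2010))
# → []

# Driving role: traffic violations do matter.
print(relevant_records(history, ignored_categories=set(), cutoff_year=2010))
# → [Record(category='traffic', year=2012), Record(category='traffic', year=2020)]
```

The same candidate history yields different results per role, which is the point: relevance is defined by the job, not by the mere existence of a record.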
While humans are still very much needed in the hiring process, AI can help foster a more diverse working environment by minimizing bias and focusing on hiring the right person for the job.