AI recruitment tools: can computers tell if a technical candidate’s a good fit?

Can AI recruitment tools help your company hire technical candidates with the best fit while also supporting your diversity and inclusion strategies? Not yet, says Alex, as he explores the challenges of candidate bias in technical fields.

Artificial intelligence (AI) has been widely touted as the technology that’s going to revolutionise recruitment for HR departments, technical candidates and recruitment agencies.

According to Undercover Recruiter, AI is expected to replace 16% of HR jobs within the next 10 years. Great news for companies looking to drive efficiencies, either by reducing staffing or by refocusing their staff’s skills on retention, engagement, wellbeing and other HR functions. But is it really going to deliver better candidates for your technical roles?

One of the ways AI is expected to improve recruitment is by screening candidates and deciding whether or not they’re a good fit for the employer. Knowing how many applications our clients can receive for highly sought-after roles, this sounds like an excellent idea.

Candidate bias and AI recruitment tools

In theory it should also help reduce unconscious bias when shortlisting candidates – where a recruiter might favour one candidate over another because of cultural preferences, such as the school they attended or the sports they play. AI will assess candidates on their skills, qualifications, experience and other relevant criteria, not on whether you might have played rugby with them in the past.

However, machine learning tools can be biased. In fact, last year Amazon scrapped its AI recruiting tool because it showed a bias against female candidates. The issue with AI and machine learning is that the machine has to learn from training data. In recruitment, that means CVs and application data from successful past candidates, perhaps also factoring in how they performed once in the role.

This poses a problem for many employers in the sectors we recruit for because there are higher numbers of men in the industry – construction, engineering, technology, cyber security etc. Using historic data, AI recruitment tools will inevitably prioritise applications from men because that’s been the dominant trend for many years.
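To see why this happens, here is a toy sketch with made-up data. It assumes a naive screener that scores new candidates purely by the historical hire rate of people like them – a deliberately simplified stand-in for a real machine learning model, but the mechanism is the same: whatever pattern dominates the history dominates the scores.

```python
# Toy illustration (hypothetical data): a screener that "learns" from
# historical hiring decisions reproduces whatever bias those decisions contain.

# Historical records: (attended_boys_school, was_hired)
history = [
    (True, True), (True, True), (True, True), (True, False),
    (False, False), (False, False), (False, True), (False, False),
]

def hire_rate(records, feature_value):
    """Fraction of past candidates with this feature value who were hired."""
    matching = [hired for feat, hired in records if feat == feature_value]
    return sum(matching) / len(matching)

def naive_screener(candidate_feature):
    """Score a new candidate by the historical hire rate for their group."""
    return hire_rate(history, candidate_feature)

# The screener favours the historically dominant group, regardless of skill:
print(naive_screener(True))   # 0.75
print(naive_screener(False))  # 0.25
```

Note that the screener never sees gender directly – the biased feature here is a proxy (school attended), which is exactly why Amazon couldn’t simply delete one column and call the problem solved.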

Amazon attempted to change its AI programme to make it more gender-neutral, but eventually abandoned the project altogether because it couldn’t trust that the results were free of bias. Training data can contain other factors that produce gender-biased results or discriminate in other ways. If your historic recruitment has been affected by deliberate or unconscious bias, or by trends in the candidate market, other patterns in the data might also give biased results.

Take education as an example: if historically you’ve recruited more candidates from public schools, for whatever reason, AI recruitment tools will probably still deliver gender-biased results, because many of those successful candidates will have come from boys’ schools.

How can we get the benefits of AI without the bias?

While Amazon decided that the technology challenges were too much to overcome, I still think that AI recruitment tools will be an asset to any HR department or technical recruitment agency in the future.

By understanding the limitations of machine learning and how training data can deliver biased results, we can take steps to level the playing field for all qualified candidates and reduce bias. That could mean using positive discrimination to ensure that underrepresented candidates get more opportunities to be shortlisted for technical roles, although technology experts believe that a fairer algorithm is some way off.
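One simple way to give underrepresented candidates more shortlisting opportunities is to build the shortlist group by group on skill criteria alone, rather than ranking everyone with a model that may carry historical bias. The sketch below is a hypothetical illustration – the candidate names, groups and scores are invented, and a real process would need far more care over which groups and criteria to use.

```python
# Hypothetical sketch: draw the shortlist from each group separately,
# ranked only by skill score, so no group is crowded out by historic bias.

candidates = [
    # (name, group, skill_score) - illustrative data only
    ("A", "men", 82), ("B", "men", 78), ("C", "men", 75), ("D", "men", 71),
    ("E", "women", 80), ("F", "women", 74),
]

def stratified_shortlist(cands, per_group):
    """Take the top `per_group` candidates from each group by skill score."""
    by_group = {}
    for name, group, score in cands:
        by_group.setdefault(group, []).append((score, name))
    shortlist = []
    for group, scored in by_group.items():
        scored.sort(reverse=True)  # highest skill score first
        shortlist.extend(name for _, name in scored[:per_group])
    return shortlist

print(stratified_shortlist(candidates, per_group=2))  # ['A', 'B', 'E', 'F']
```

A global top-4 ranking here would shortlist three men and one woman; the stratified version guarantees both groups are represented while still picking each group’s strongest candidates.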

In the meantime, as recruiters, we should learn from these failed attempts to streamline the screening process with AI. Diversity and inclusion recruitment strategies need to acknowledge that bias isn’t only created when a white male candidate is deliberately favoured over a BAME or female candidate; it can also arise when someone unconsciously favours a candidate with particular interests or shared experiences. While these factors can often result in a good cultural fit with your existing team, they can perpetuate less desirable cultures and exclude excellent candidates who would be an asset to your company.

If you need support recruiting technical candidates in the construction, engineering and technology fields, or help improving your diversity and inclusion recruitment strategies, please get in touch. We’d be delighted to share our experience as recruiters in these fields.
