
By: Ber Leary on June 13th, 2023

Artificial Intelligence in HR Could Make Hiring More Biased

Artificial intelligence is the hottest technology in the world right now, with huge implications for every aspect of life, including the way we hire.

AI is beginning to appear in popular HR software like Applicant Tracking Systems (ATS), with the promise of faster, better, more accurate hiring results. An AI can screen thousands of resumes per second, while AI chatbots can answer candidate questions and even conduct screening interviews. 

But will automated recruitment help eliminate discrimination? Or will it make hiring even more biased?  

What the EEOC says about artificial intelligence

The Equal Employment Opportunity Commission (EEOC) has issued some guidelines to help employers think about the way they use AI in hiring. 

AI is a fast-moving technology, so the EEOC has not offered guidance on specific software or algorithms. Instead, it has laid out some core principles to bear in mind: 

  • Software tools are part of your selection process. Your candidate selection process must be equitable and free of bias. If AI tools are part of that process, you need to ensure the software is fit for purpose.
  • Vendor guarantees don't shift your responsibility. Software vendors might guarantee their platform is 100% unbiased, but that is a private agreement between you and the vendor. It does not change your responsibility to employees and job seekers. 
  • Employers can test different approaches. Because AI is evolving rapidly, employers may need to test and iterate before they find a working solution. You can try out different processes as long as you monitor hiring outcomes to identify signs of bias. 
  • You must choose the most inclusive algorithm. When making decisions about AI and HR software, you should always choose the option that is least likely to result in discrimination. Failure to do so could leave you open to complaints. 

Essentially, there's no real change here. Employers have always been responsible for ensuring that the hiring process is fair and equitable. If it isn't, you may be open to fines and legal action. 

The EEOC guidelines are just confirming that AI doesn't affect this responsibility. You're free to use AI hiring tools if you wish, but if those tools cause problems, then you're fully liable. 

How bias affects artificial intelligence in HR

To understand how a machine can discriminate, we have to look at how AI works.

Contemporary AI uses a technique called Machine Learning. ML is a software process that studies huge volumes of information (known as “training data”) and identifies deep-lying patterns. The system learns from these patterns, and then it uses this knowledge to make decisions.

So, for example, imagine you’re training an AI recruitment tool. You might start by giving it access to all of your hiring data and your file of resumes. The AI will study them and learn things about your previous hiring patterns like:

  • Which resumes led to an interview?
  • Which resumes were rejected?
  • Which candidates were hired?
  • How long did each candidate stay in their position?
  • Who went on to a leadership position later in their career?

Once the AI understands your historical hiring activity, it can make decisions about the future. The AI will flag up promising candidates and filter out anyone unsuitable.
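
To make that concrete, here is a deliberately simplified sketch of how a screening model learns from historical hiring decisions. It's written in Python with scikit-learn; the features and data are hypothetical, and real ATS models are far more sophisticated, but the core mechanic is the same: the model learns to reproduce whatever patterns your past decisions contain.

    from sklearn.linear_model import LogisticRegression

    # Each row is a past applicant: [years_of_experience, internal_referral]
    # (hypothetical features, invented for illustration)
    X_train = [
        [5, 1],
        [2, 0],
        [7, 1],
        [1, 0],
        [4, 1],
        [3, 0],
    ]
    # 1 = hired, 0 = rejected (your historical decisions)
    y_train = [1, 0, 1, 0, 1, 0]

    model = LogisticRegression()
    model.fit(X_train, y_train)

    # The model now scores new applicants by how closely they resemble
    # past hires, including any bias baked into those past decisions.
    new_applicant = [[6, 0]]
    print(model.predict_proba(new_applicant)[0][1])  # estimated "hire" probability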

If you can already see the problem here, you’re one step ahead of Amazon’s engineers.

AI learns from historical data, which contains all of our current biases. For example, if you asked an AI to look at Fortune 500 companies’ current leadership, it would notice a clear pattern: 92.6% of CEOs are men. Based on this data, an AI would determine that male candidates are better suited to leadership roles.

This seems to be what happened in systems such as Amazon’s failed human resources AI project. Historical data said that most previous successful hires were men; therefore, the AI prioritized applications from men. And even if the resume doesn’t explicitly mention gender, the system might discriminate in other ways. For example, it could ignore candidates who went to an all-women college, played on a women’s sports team, or worked in a field traditionally associated with female employees.
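
Here's a hedged illustration of that proxy effect, again in Python with scikit-learn and invented data. The model never sees a gender field, but a correlated feature lets it reconstruct one anyway:

    from sklearn.linear_model import LogisticRegression

    # Hypothetical features: [years_of_experience, attended_womens_college]
    # The labels reflect a biased history: equally experienced women were rejected.
    X = [[5, 0], [6, 0], [4, 0], [5, 1], [6, 1], [4, 1]]
    y = [1, 1, 1, 0, 0, 0]

    model = LogisticRegression().fit(X, y)

    # Two candidates with identical experience. Gender never appears as a
    # feature, yet the proxy column drives opposite predictions.
    print(model.predict([[5, 0], [5, 1]]))  # most likely [1 0]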

How to avoid AI bias in hiring

Should we use artificial intelligence in HR at all?

Of course we should. AI is already a major part of human capital management, with 70% of large companies reporting significant gains in productivity when they adopt automation.

When used correctly, AI can also help reduce discrimination in the recruitment process, resulting in a more diverse workforce. It’s all a matter of context, oversight, and data quality. A McKinsey report on AI puts the issue in a nutshell: “AI can help reduce bias, but it can also bake in and scale bias.”

If you’re working with AI tools such as automated applicant tracking systems, here are a few things you need to keep in mind:

1. Always investigate the training data

Every AI or machine learning tool was developed with a set of training data. Talk to your software vendor about where this data was sourced, how it was processed, and what measures were taken to remove bias from the data. Many software vendors will be proud to tell you about their anti-discrimination strategy.

2. Broaden your search for candidates

AI can only work with the data you give it. If you want your system to produce diverse candidates, you’ll have to supply it with diverse applicants. This means reviewing your process and addressing difficult questions like:

  • Are we advertising vacancies in the right places?
  • Do we use inclusive language in our advertising?
  • Does our employer branding demonstrate and promote diversity?
  • Are we using employee referral programs to reach out to minority communities?
  • Do we work with recruitment partners who are equally committed to inclusiveness?

An AI can’t help you answer any of those questions. But if you start attracting diverse candidates, an AI can help you find the brightest talent.

3. Diversify your HR team

Bias is usually unconscious. We don’t all have the same lived experience, so we can’t perceive the challenges others face. That’s why it’s essential to have a wide range of perspectives within the HR team, so that our colleagues can flag up the problems we might have missed.

Remember: representation alone isn’t enough. Everyone must have an equal voice so they can start a conversation about potential inclusiveness issues. Work with your DE&I team and Employee Resource Groups to identify and tackle discrimination issues.

4. Keep a close eye on outputs

One great thing about AI tools is that you can analyze every stage of the process. Reports will give you detailed data about which resumes were rejected, which ones progressed, and which were prioritized.

Pay close attention to the entire automated process and look for signs that bias may be creeping in. The EEOC recommends the "four-fifths rule" as a rough guide: if the selection rate for one demographic group is less than four-fifths (80%) of the rate for the most-selected group, you may have a biased hiring process, as the quick check below shows.  
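
Here's what that check looks like in Python, with hypothetical numbers:

    # Four-fifths rule check with hypothetical numbers.
    # Selection rate = candidates who advance / candidates considered.
    men_rate = 60 / 100      # 60% of male applicants pass the resume screen
    women_rate = 25 / 100    # 25% of female applicants pass

    impact_ratio = women_rate / men_rate
    print(f"Impact ratio: {impact_ratio:.2f}")  # 0.42

    # A ratio below 0.80 suggests adverse impact and warrants a closer
    # look at that stage of the pipeline.
    if impact_ratio < 0.8:
        print("Possible adverse impact: review this screening step.")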

5. Don’t forget the human touch

Hiring is ultimately about people. While AI can sort through applications at record speed, only a human can form a meaningful connection with a candidate. Humans can also talk to candidates about organizational culture and help people understand their future role in the company.

AI can take a lot of work out of recruitment, and it can go some way towards reducing unconscious bias. But to find and retain the best talent, you’ll need a hiring process that’s made by humans, for humans.
