
6 challenges of AI in recruitment

AI can save time during the recruitment process, but the tech also brings challenges. Learn what HR leaders should be aware of before incorporating AI into recruiting.

AI can save recruiters time, but using the technology to find new employees brings its own challenges. HR leaders should be aware of these problems before relying on AI to carry out important recruiting tasks.

Some of the recruiting tasks AI can help with are resume screening, talent pool development and candidate experience. But problems such as AI bias could negatively affect a company's recruiting efforts and even open up the organization to legal action.

Here are some of the challenges of using AI in recruitment.

1. Bias

Bias in AI recruiting has become a major point of discussion, with New York City's anti-bias law recently coming into effect. It requires companies to audit the AI recruitment technology they use and make the results public.

HR departments may use AI for resume screening, in which the technology scans resumes for keywords related to qualifications, work experience and education. These tools can also assist with talent pool development by scanning platforms such as LinkedIn for potential job candidates.

Using these tools comes with potential dangers. For example, biased tools may search LinkedIn only for people who have attended specific colleges and ignore candidates who don't meet those criteria but would otherwise be a good fit, said Will Howard, director of HR research and advisory services at McLean & Company, an HR consultancy located in London, Ontario.

More harmful examples have led to lawsuits, with job candidates alleging that AI systems rejected them based on race, age or disability.


In addition, HR leaders may not know the origin of the data that AI uses to carry out its processes, and that data could lead to bias.

Third-party vendors usually train a company's AI software on data points drawn from various organizations, Howard said. If that data is biased, the AI will make biased decisions.

When working with AI vendors, HR leaders should inquire about the technology's bias detection capabilities, Howard said. These questions may include the following:

  • What measures is the vendor taking to mitigate bias in the AI models?
  • How transparent are the AI's decisions?
  • Can the AI explain how and why it arrived at its decisions?

For example, if the AI presents a pool of potential candidates for an open position, a user should be able to understand why it selected a certain set of people and why it excluded others.

2. Data security and ownership

HR leaders may find that a third-party vendor has wider access to their company's data than they previously believed.

HR leaders should clarify data questions with their third-party vendor, Howard said. Questions may include the following:

  • Where is the data stored?
  • Who has access to that data?
  • Who owns that data?

For example, a company's organizational data may be fed back into the AI model, which then learns from it.

"How is your proprietary data being protected, not just from [hackers] but also from an ownership perspective?" Howard said.

3. Data governance

Because AI technology is relatively new, an organization may not yet have put proper data safeguards in place.

HR departments handle a large amount of sensitive data, so HR leaders must consider AI governance, said Sudeep Kumar, head of enterprise data and analytics at Ciena, an optical and routing systems provider located in Hanover, Md.

"Most organizations' data policies and procedures don't have AI governance embedded into them because it's a new thing," Kumar said. "We need to make sure that when we bring in new technologies, we are not breaking compliance."

The HR leader, the chief data officer, the chief information security officer and the company's legal counsel must collaborate on this topic so the organization avoids privacy issues, Kumar said.

4. Candidate experience

Chatbots can potentially save HR staff time by answering candidates' questions more quickly than a recruiter can.


However, AI can also negatively affect the candidate experience. For example, some candidates may prefer speaking with a human being.

"I would be much more likely to join an organization if I could speak to the hiring manager and ask questions about the culture," Howard said. "Even if that chatbot is amazing and I can't necessarily differentiate between it and a person, there is something to be said for still having the human touch."

In addition, HR and other leaders must make it a priority to inform job candidates about how they're using AI.

Failing to do so could create trust issues with job candidates, Howard said.

5. Pressure to use AI

AI is currently everywhere in the tech world, and HR leaders may start using AI, or face pressure from other leaders at the company to do so, without considering whether it's the best solution to a problem.


HR should start with the question "What problem are we trying to solve?" said Jennifer Selby Long, CEO of Selby Group, a Berkeley, Calif.-based consulting firm focused on leadership and transformation.

For example, using AI during video interviews may seem helpful, but establishing a personal connection with candidates is an essential part of the recruiting process, Long said. Using AI may hinder that if, for example, the technology makes a candidate uncomfortable during an interview.

AI currently comes with uncertainty and risk, Long said.

"[HR leaders should] come in with a little skepticism toward some of these tools," she said.
