Hiring vendor says gender-based AI bias is pervasive

Gender bias in AI algorithms for recruiting and hiring is a pervasive problem, according to the head of Job.com. The company is planning to change that with a new hire.

AI bias is widespread in recruiting and hiring platforms, argues Arran Stewart, head of Job.com. There's no disagreement from analysts about the pervasiveness of gender-based AI bias. What they will question is Stewart's claim that he can do something about it.

Job.com, a job matching platform, said it is focusing on eliminating gender bias in its AI platform. The company has set a 2021 deadline to complete the work.

Stewart, co-founder and chief visionary officer for the Austin, Texas, company, said AI bias is an industry-wide problem.

"I will openly admit, and maybe others won't," Stewart said, "that all artificial intelligence for AI recruitment matching will have bias in it. It is impossible for it not to have it."

Stewart blames gender bias, in part, on how machine learning is trained on data sets from occupations that are male dominated, such as tech.

"When the artificial intelligence is learning from a data set of resumes that is predominantly male, guess what? It will only look for male candidates," Stewart said.

Stewart has a plan for tackling the problem. But can he fix it?

Holger Mueller, analyst at Constellation Research Inc. in Cupertino, Calif., said gender-based AI bias is a problem that no one has solved, and, "I think it is a little arrogant to put a solution date out there."

Part of the difficulty is that language and behavior continually evolve, and what is true in 2019 may not be the case in 2021, Mueller said.

Mueller, nonetheless, believes that finding bias in language is an opportunity for vendors that can do it successfully on a continuous and automated basis.

Job.com's platform automates job matching, and an employer pays only if a match leads to a successful hire. The fee is 7% of the employee's base salary; the platform keeps 2% and turns over the balance to the candidate. The average payout to candidates is about $3,500, Stewart said.
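To make that split concrete, here is a back-of-the-envelope calculation. The $70,000 base salary is a hypothetical figure, chosen because it reproduces the roughly $3,500 average payout Stewart cited, and it assumes the 2% retained and the balance paid out are both shares of base salary.

```python
# Back-of-the-envelope fee split on a successful hire, assuming every
# percentage is a share of the new employee's base salary.
base_salary = 70_000  # hypothetical base salary

total_fee = base_salary * 7 // 100       # employer pays 7%             -> 4900
platform_keeps = base_salary * 2 // 100  # Job.com retains 2%           -> 1400
candidate_payout = total_fee - platform_keeps  # balance to candidate   -> 3500

print(total_fee, platform_keeps, candidate_payout)  # 4900 1400 3500
```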

The algorithmic systems don't distinguish between masculine and feminine styles of writing, Stewart said. Men may use words such as "focused" and "capable," while women might use words such as "understanding" and "appreciation," he said.

The gender-based AI bias may be subtle, affecting a candidate's ranking by only 5%. But that means someone who should have been at the top of a candidate pool might instead rank second, Stewart said.
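A rough sketch of that effect, using hypothetical match scores: a 5% penalty on one candidate's score is enough to drop the strongest candidate out of first place.

```python
# Hypothetical match scores for a small candidate pool.
pool = {"candidate_a": 0.92, "candidate_b": 0.89, "candidate_c": 0.81}

biased = dict(pool)
biased["candidate_a"] *= 0.95  # a 5% penalty, e.g. for feminine-coded wording

def rank(scores):
    """Order candidates from highest to lowest score."""
    return sorted(scores, key=scores.get, reverse=True)

print(rank(pool))    # ['candidate_a', 'candidate_b', 'candidate_c']
print(rank(biased))  # ['candidate_b', 'candidate_a', 'candidate_c']
```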

Job.com hired Laurie Boschini, a software engineer from Search Technologies, to lead the effort. She will head Job.com's 12-member research and development team, the firm said Thursday.

Stewart doesn't deny that AI bias is a difficult problem to solve. He points to Amazon's reported decision to abandon a recruiting software development effort because of gender bias issues.

The Amazon algorithm kept recommending men over women because the algorithm, trained using a biased data set, discounted female resumes, said Ben Eubanks, an analyst at Lighthouse Research & Advisory.

Eubanks pointed to firms such as Textio and SAP that help employers write job ads free of bias. They help "employers use more neutral terms to balance the applicant pool for their openings," he said. 
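The underlying idea can be sketched in a few lines of Python. The word lists below are illustrative stand-ins, not Textio's or SAP's actual lexicons, and the job ad is made up.

```python
# Screen a job ad for gender-coded language; the word lists are illustrative
# stand-ins, not any vendor's actual lexicon.
MASCULINE_CODED = {"aggressive", "competitive", "dominant", "rockstar", "ninja"}
FEMININE_CODED = {"nurturing", "supportive", "collaborative", "empathetic"}

def flag_coded_terms(ad_text: str) -> dict:
    """Return gender-coded terms found in an ad so a writer can swap in neutral wording."""
    words = {w.strip(".,;:!?").lower() for w in ad_text.split()}
    return {
        "masculine_coded": sorted(words & MASCULINE_CODED),
        "feminine_coded": sorted(words & FEMININE_CODED),
    }

ad = "We want an aggressive, competitive rockstar who thrives in a dominant sales culture."
print(flag_coded_terms(ad))
# {'masculine_coded': ['aggressive', 'competitive', 'dominant', 'rockstar'], 'feminine_coded': []}
```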

Eubanks said he is unaware of another job board or resume database that is trying to do what Job.com is doing. But interest in job ads free of gender bias makes clear "that this is a proven market that employers want to solve," he said.

