Your competitors are amassing armies of recruiting chatbots
AI is the new digital divide. Bots screen résumés in a fraction of the time, and emotion analytics weeds out corporate killjoys. So why rely on humans to make hiring decisions?
Ever since a girl in high school told me that I should smile more often, I've wondered if my normal facial expression gives a mistaken impression of anger or annoyance that I'm really not feeling inside.
But I was reassured when the AffdexMe emotion analytics smartphone app saw no emotion on my face until I affected a wide-eyed expression and my fear score topped 50%. Forcing a smile earned me a 100% joy rating, because to the app, a smile always means pure joy. I never figured out how to fake anger, but I did learn that curling one side of my upper lip raises the contempt score.
AffdexMe, from Affectiva, is one of a growing number of applications, such as emotion analytics and recruiting chatbots, that use artificial intelligence to automate the recruitment process.
It's fun to be cynical about how chatbots and AI, especially machine learning, could turn an already dehumanizing and weird process into a Blade Runner-esque experience that's dehumanizing in a lot weirder ways. (Olivia, the recruiting chatbot from Paradox.ai and "the most knowledgeable person in your company," might disagree in her know-it-all way.)
Minimizing the grunt work
But the early stages of the recruitment process that these tools target -- basic information processing, filtering and decision-making -- are undoubtedly ripe for automation. Going through hundreds of résumés to screen candidates is time-consuming and error-prone, which is why vendors of recruiting chatbots can credibly claim to do the job in one-tenth the time it takes humans to perform the same task.
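To make that concrete, here is a minimal sketch of the kind of rule-based filtering step these tools automate. The Resume fields, required skills and experience threshold are hypothetical, invented for illustration; commercial screeners layer NLP and ranking models on top of logic like this.

```python
# A toy résumé screen: keep only candidates who clear basic, configurable
# criteria. Field names and thresholds are illustrative assumptions, not any
# vendor's actual implementation.
from dataclasses import dataclass, field


@dataclass
class Resume:
    name: str
    years_experience: float
    skills: set = field(default_factory=set)


def screen(resumes, required_skills, min_years):
    """Return the candidates who meet the minimum experience and skill criteria."""
    return [
        r for r in resumes
        if r.years_experience >= min_years and required_skills <= r.skills
    ]


candidates = [
    Resume("A. Applicant", 4, {"python", "sql"}),
    Resume("B. Applicant", 1, {"excel"}),
]

shortlist = screen(candidates, required_skills={"sql"}, min_years=2)
print([r.name for r in shortlist])  # ['A. Applicant']
```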
Next comes the comparably tedious back-and-forth communication with candidates to request more information or schedule interviews. Recruiting chatbots are perfect for this job, too, in the same way as the chatbots consumers encounter on e-commerce sites.
Mya, "your team's AI recruiter," according to Mya Systems, can handle these interactions with endless enthusiasm, using natural language processing to make sense of an applicant's typed questions and respond with accurate, realistic answers. Mya also knows how to work with your rickety applicant tracking system better than your people probably do.
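The question-and-answer chores follow a pattern that is simple to sketch, even though products like Mya rely on trained natural language models rather than keyword matching. Everything below -- the intents, keywords and canned answers -- is invented for illustration.

```python
# Toy intent matching: map an applicant's typed question to a canned answer.
# Real recruiting chatbots use trained NLP models; this keyword overlap only
# shows the basic request-to-response flow.
INTENTS = {
    "scheduling": ({"when", "schedule", "time", "interview"},
                   "I can set up your interview -- here are three open slots."),
    "status": ({"status", "application", "heard", "decision"},
               "Your application is with the hiring team; I'll update you as soon as I know more."),
}

FALLBACK = "Good question. Let me check with a human recruiter and get back to you."


def reply(question: str) -> str:
    words = set(question.lower().replace("?", "").split())
    # Pick the intent whose keyword set overlaps most with the question.
    keywords, answer = max(INTENTS.values(), key=lambda kw_ans: len(kw_ans[0] & words))
    return answer if keywords & words else FALLBACK


print(reply("When can I schedule my interview?"))    # scheduling answer
print(reply("Do you offer relocation assistance?"))  # fallback to a human
```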
Having mechanized these repetitive, manual early steps, recruiting chatbots hand off the candidates who meet basic criteria so human recruiters can decide which ones to call in for an interview or a behavioral test. Entelo, maker of the Envoy recruiting automation platform that launched in January 2018, claimed that recruiters at its beta-test sites advanced 80% of the candidates the platform identified.
That efficiency explains why the chatbot market is taking off and why major companies such as Hilton use AI in cloud-based human capital management systems to speed résumé sifting and other HR processes. Other chatbot entrants include Rai from HiringSolved, TextRecruit's IBM Watson-based Ari and Karen from Karen.ai.
What's in the black box?
The other early applications of AI in recruiting seem to be on shakier ground technically and ethically, which is not to diminish their potential or capabilities. Like recruiting chatbots, tools from vendors such as HireVue, maker of a prominent video interviewing platform and an early user of AI-based emotion analytics, also promise to minimize bias in hiring.
The idea is that algorithms are more likely to focus on people's qualifications and will not be influenced by irrelevancies that distort the judgment of human interviewers, such as having similar hobbies or a diploma from a prestigious university. HireVue claimed it can improve the success prediction rate of pre-hire assessments -- which are often done in an electronic questionnaire -- by applying AI analytics to short video interviews.
While making faces at AffdexMe, I wondered about its potential to help applicants game these recruitment systems. Couldn't I train myself to hide my fear and desperation when I'm asked to divulge my biggest weakness and my instinct to be honest wrestles with my common sense, which tells me to lie and wrap a weakness inside a strength, such as "I work too hard" or "I care too much"? At least one site already offers tips on using AffdexMe to ace a behavioral interview through expressive suppression.
Removing bias from the equation is a noble and practical goal, but one wonders about the means, not the end. As the McKinsey Global Institute asked about AI in general, how do you keep sexism, racism and other biases out of the data that's fed to algorithms? What rights do people have to see inside the AI black box and learn how it reached its conclusions? And who is responsible for decisions informed by AI?
Recruiting chatbots raise other ethical and technological issues, such as whether to tell applicants they're not interacting with a real person. Configuring the chatbots requires someone to choose the criteria that matter to the hiring organization. Who's to say, for example, that those criteria won't quietly filter out older workers or people with disabilities?
What's more, job candidates will be hard-pressed to find a chatbot that can exchange pleasantries while doing the document-swapping and scheduling grunt work on their behalf. Maybe they need (ahem) a bot named "Proletariat.ai."
I had high hopes for Wade, "your personal career guide," and co-worker of Wendy, another recruiting chatbot. When Wade asked for my LinkedIn profile, I thought I was in. Alas, Wade is still in the beta stage. He says I'm on the waitlist, Group 21, whatever that is; I'm picturing a space station with a tramway that whisks small groups of people off to some unknown place. Wade looks forward to meeting me and promises to join me on my career adventure very soon. Come on, Wade, bro to bro. Take another look at my AffdexMe screen. I don't have forever.