{"id":16051,"date":"2023-07-27T05:11:36","date_gmt":"2023-07-27T10:11:36","guid":{"rendered":"https:\/\/ustower.net\/?p=16051"},"modified":"2023-07-27T06:59:35","modified_gmt":"2023-07-27T11:59:35","slug":"ai-chatbots-are-the-new-job-interviewers","status":"publish","type":"post","link":"https:\/\/ustower.net\/?p=16051","title":{"rendered":"AI Chatbots Are The New Job Interviewers"},"content":{"rendered":"\n<h2 class=\"wp-block-heading\" style=\"font-size:28px\"><strong>Chatbots are increasingly being used by companies to interview and screen job applicants, often for blue-collar jobs. But like other algorithmic hiring tools before them, experts and job applicants worry these tools could be biased.<\/strong><\/h2>\n\n\n\n<p style=\"font-size:28px\">In early June, Amanda Claypool was looking for a job at a fast-food restaurant in Asheville, North Carolina. But she faced an unexpected and annoying hurdle: glitchy chatbot recruiters.<\/p>\n\n\n\n<p style=\"font-size:28px\">A few examples: McDonald\u2019s chatbot recruiter \u201cOlivia\u201d cleared Claypool for an in-person interview, but then failed to schedule it because of technical issues. A Wendy\u2019s bot managed to schedule her for an in-person interview, but it was for a job she couldn&#8217;t do. Then a Hardee\u2019s chatbot sent her to interview with a store manager who was on leave \u2014 hardly a seamless recruiting strategy.<\/p>\n\n\n\n<p style=\"font-size:28px\">\u201cI showed up at Hardee\u2019s and they were kind of surprised. The crew operating the restaurant had no idea what to do with me or how to help me,\u201d Claypool, who ultimately took a job elsewhere, told&nbsp;<em>Forbes<\/em>. \u201cIt seemed like a more complicated thing than it had to be,\u201d she said. (McDonald\u2019s and Hardee\u2019s didn\u2019t respond to a request for comment. 
A Wendy\u2019s spokesperson told&nbsp;<em>Forbes<\/em>&nbsp;the bot creates \u201chiring efficiencies,\u201d adding \u201cinnovation is our DNA.\u201d)<\/p>\n\n\n\n<p style=\"font-size:28px\">HR chatbots like the ones Claypool encountered are increasingly being used in industries like healthcare, retail and restaurants to filter out unqualified applicants and schedule interviews with the ones who might be right for the job. McDonald\u2019s, Wendy\u2019s, CVS Health and Lowe\u2019s use&nbsp;<a href=\"https:\/\/www.paradox.ai\/solutions\/restaurant\">Olivia<\/a>, a chatbot developed by Paradox, a $1.5 billion AI startup based in Arizona. Other companies like L\u2019Oreal rely on&nbsp;<a href=\"https:\/\/www.cnn.com\/2019\/04\/29\/tech\/ai-recruitment-loreal\/index.html\">Mya<\/a>, an AI chatbot developed in San Francisco by a startup of the same name. (Paradox didn\u2019t respond to a request for comment about Claypool\u2019s experience.)<\/p>\n\n\n\n<p style=\"font-size:28px\">Most hiring chatbots are not as advanced or elaborate as contemporary conversational chatbots like ChatGPT. They\u2019ve primarily been used to screen for jobs that have a high volume of applicants \u2014 cashiers, warehouse associates and customer service assistants. They are rudimentary and ask fairly straightforward questions: \u201cDo you know how to use a forklift?\u201d or \u201cAre you able to work weekends?\u201d But as Claypool found, these bots can be buggy \u2014 and there isn\u2019t always a human to turn to when something goes wrong. And the clear-cut answers many of the bots require could mean automatic rejection for some qualified candidates who might not answer questions the way the software wants them to.<\/p>\n\n\n\n<p style=\"font-size:28px\">That could be a problem for people with disabilities, people who are not proficient in English and older job applicants, experts say. Aaron Konopasky, senior attorney advisor at the U.S. 
Equal Employment Opportunity Commission (EEOC), fears chatbots like Olivia and Mya may not provide people with disabilities or medical conditions with alternative options for availability or job roles. \u201cIf it&#8217;s a human being that you&#8217;re talking to, there&#8217;s a natural opportunity to talk about reasonable accommodations,\u201d he told&nbsp;<em>Forbes<\/em>. \u201cIf the chatbot is too rigid, and the person needs to be able to request some kind of exemption, then the chatbot might not give them the opportunity to do that.\u201d<\/p>\n\n\n\n<p style=\"font-size:28px\"><strong>\u201cIt&#8217;s sort of like how Netflix recommends movies based on other movies you like.\u201d<\/strong><\/p>\n\n\n\n<p style=\"font-size:28px\">Jeremy Schiff, CEO and founder of RecruitBot<\/p>\n\n\n\n<p style=\"font-size:28px\">Discrimination is another concern. Underlying prejudice in the data used to train AI can bake bias and discrimination into the tools in which it&#8217;s deployed. \u201cIf the chatbot is looking at things like how long it takes you to respond, or whether you\u2019re using correct grammar and complex sentences, that&#8217;s where you start worrying about bias coming in,\u201d said Pauline Kim, an employment and labor law professor at Washington University, whose research focuses on the use of AI in hiring tools. But such bias can be tough to detect when companies aren&#8217;t transparent about why a potential candidate was rejected.<\/p>\n\n\n\n<p style=\"font-size:28px\">Recently, lawmakers have introduced legislation to monitor and regulate the use of automation in hiring tools. In early July,&nbsp;<a href=\"https:\/\/www.wsj.com\/articles\/new-york-city-starts-to-regulate-ai-used-in-hiring-tools-79a2260f?mod=djemalertNEWS\">New York City<\/a>&nbsp;began enforcing a new law requiring employers who use automated tools like resume scanners and chatbot interviews to audit their tools for gender and racial bias. 
In 2020,&nbsp;<a href=\"https:\/\/www.bakersterchi.com\/new-illinois-statute-among-the-first-to-address-aiaided-job-recruiting\">Illinois<\/a>&nbsp;passed a law requiring employers who apply AI to analyze video interviews to notify applicants and obtain consent.<\/p>\n\n\n\n<p style=\"font-size:28px\">Still, for companies looking to trim recruiting costs, AI screening agents seem an obvious option. HR departments are often one of the first places to see staff reductions, said Matthew Scherer, a senior policy counsel for workers\u2019 rights and technology at the Center for Democracy and Technology. \u201cHuman resources has always been a cost center for a company, it&#8217;s never been a revenue generating thing,\u201d he explained. \u201cChatbots are a very logical first step to try and take some of the load off of recruiters.\u201d<\/p>\n\n\n\n<p style=\"font-size:28px\">That\u2019s part of the rationale behind Sense HQ, which provides companies like Sears, Dell and Sony with text messaging-based AI chatbots that help their recruiters wade through thousands of applicants. The company claims it\u2019s already been used by some 10 million job applicants, and co-founder Alex Rosen told&nbsp;<em>Forbes<\/em>&nbsp;such numbers mean a much bigger pool of viable candidates.<\/p>\n\n\n\n<p style=\"font-size:28px\">\u201cThe reason that we built a chatbot in the first place was to help recruiters talk to a wider swath of candidates than they might be able to do on their own,\u201d he said, adding the obligatory caveat: \u201cWe don&#8217;t think that the AI should be making the hiring decision on its own. That\u2019s where it gets dangerous. 
We just don&#8217;t think that it&#8217;s there yet.\u201d<\/p>\n\n\n\n<p style=\"font-size:28px\">RecruitBot is bringing AI to bear on hiring by using machine learning to sift through a database of 600 million job applicants scraped from LinkedIn and other job marketplaces \u2014 all with the goal of helping companies find job candidates similar to their current employees. &#8220;It&#8217;s sort of like how Netflix recommends movies based on other movies you like,\u201d CEO and founder Jeremy Schiff told&nbsp;<em>Forbes<\/em>. But here, too, bias is an obvious concern; hiring more of the same has its pitfalls. In 2018,&nbsp;<a href=\"https:\/\/www.reuters.com\/article\/us-amazon-com-jobs-automation-insight-idUSKCN1MK08G\">Amazon<\/a>&nbsp;scrapped its machine learning-based resume screening tool that discriminated against women because its training data was composed mostly of men\u2019s resumes.<\/p>\n\n\n\n<p style=\"font-size:28px\"><strong>\u201cChatbots are a very logical first step to try and take some of the load off of recruiters.\u201d<\/strong><\/p>\n\n\n\n<p style=\"font-size:28px\">Matthew Scherer, senior policy counsel at the Center for Democracy and Technology<\/p>\n\n\n\n<p style=\"font-size:28px\">Urmila Janardan, a policy analyst at Upturn, a nonprofit that researches how technologies affect opportunities for people, noted that some companies have also turned to personality tests to weed out candidates \u2014 and the questions used to screen candidates may not be related to the job at all. \u201cYou might even be potentially rejected from the job because of questions about gratitude and personality,\u201d she said.<\/p>\n\n\n\n<p style=\"font-size:28px\">For Rick Gned, a part-time painter and writer, a personality quiz was part of a chatbot interview he did for an hourly-wage shelf-stacking job at the Australian supermarket Woolworths. 
The chatbot, made by the AI recruitment firm Sapia AI (formerly known as PredictiveHire), asked him to provide 50- to 150-word answers to five questions and then analyzed his responses, looking for&nbsp;<a href=\"https:\/\/sapia.ai\/products\/interview\/\">traits<\/a>&nbsp;and skills that match the recruiters\u2019 preferences. Concluding that Gned \u201cdeals well with change\u201d and is \u201cmore focused on the big picture that causes him to look over details,\u201d it advanced him to the next round of interviews. While Sapia AI does not require applicants to respond to questions under a time limit, the system measures the sentence structure, readability and complexity of the words used in text responses, Sapia AI CEO and cofounder Barb Hyman said in an email.<\/p>\n\n\n\n<p style=\"font-size:28px\">Gned found the whole thing dehumanizing and worrisome, he told&nbsp;<em>Forbes<\/em>. \u201cI\u2019m in a demographic where it doesn\u2019t affect me, but I\u2019m worried for people who are minorities, who predominantly make up the lower-income labor market.\u201d<\/p>\n\n\n\n<p style=\"font-size:28px\">For one job applicant, who requested anonymity to speak freely, chatting with a bot had at least one upside. When filling out hundreds of job applications, he often never hears back, but the bot at least assured him that his application was received. \u201cIt did feel like a morale boost in so many ways,\u201d he said. \u201cBut if I had to do this [text with a chatbot] with every job I applied to, it would be a pain in the ass.\u201d<\/p>\n\n\n\n<p style=\"font-size:28px\"><a href=\"https:\/\/www.forbes.com\/sites\/rashishrivastava\/2023\/07\/26\/ai-chatbots-are-the-new-job-interviewers\/?sh=715b1752e3a4\">Forbes<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Chatbots are increasingly being used by companies to interview and screen job applicants, often for blue-collar jobs. 
But like other algorithmic hiring tools before them, experts and job applicants worry these tools could be biased. In early June, Amanda Claypool was looking for a job at a fast-food restaurant in Asheville, North Carolina. But [&hellip;]<\/p>\n","protected":false},"author":7,"featured_media":16052,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[5783],"tags":[10217,10221,10218,10219,10220],"class_list":["post-16051","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-sci-tech","tag-barriers","tag-bots","tag-burden-reduction","tag-on-site-interviews","tag-recruiting-strategies"],"_links":{"self":[{"href":"https:\/\/ustower.net\/index.php?rest_route=\/wp\/v2\/posts\/16051","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/ustower.net\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/ustower.net\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/ustower.net\/index.php?rest_route=\/wp\/v2\/users\/7"}],"replies":[{"embeddable":true,"href":"https:\/\/ustower.net\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=16051"}],"version-history":[{"count":2,"href":"https:\/\/ustower.net\/index.php?rest_route=\/wp\/v2\/posts\/16051\/revisions"}],"predecessor-version":[{"id":16072,"href":"https:\/\/ustower.net\/index.php?rest_route=\/wp\/v2\/posts\/16051\/revisions\/16072"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/ustower.net\/index.php?rest_route=\/wp\/v2\/media\/16052"}],"wp:attachment":[{"href":"https:\/\/ustower.net\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=16051"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/ustower.net\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=16051"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/ustower.net\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=16051"}],"curies":[{"name":"wp","
href":"https:\/\/api.w.org\/{rel}","templated":true}]}}