{"id":27887,"date":"2024-05-31T04:15:37","date_gmt":"2024-05-31T09:15:37","guid":{"rendered":"https:\/\/ustower.net\/?p=27887"},"modified":"2024-05-31T04:15:44","modified_gmt":"2024-05-31T09:15:44","slug":"ai-programs-can-easily-impersonate-biden-others-to-manipulate-elections-study","status":"publish","type":"post","link":"https:\/\/ustower.net\/?p=27887","title":{"rendered":"AI programs can easily impersonate Biden, others to manipulate elections: study"},"content":{"rendered":"\n<p class=\"has-medium-font-size\">It\u2019s easy for artificial intelligence programs to create mimic voices of politicians like&nbsp;<a href=\"https:\/\/thehill.com\/people\/joe-biden\/\"><u>President Biden\u2002<\/u><\/a>and former&nbsp;<a href=\"https:\/\/thehill.com\/people\/donald-trump\/\"><u>President Trump,<\/u><\/a>&nbsp;posing the risk of a rise in voter misinformation, according to a study from the Center for Countering Digital Hate (CCDH)&nbsp;<a href=\"https:\/\/www.scribd.com\/document\/737752199\/Attack-of-the-Voice-Clones-REPORT-EMBARGOED-1\"><u>released Friday<\/u><\/a>.<\/p>\n\n\n\n<p class=\"has-medium-font-size\">AI-enabled tools created convincing false statements using the mimic voices about 80 percent of the time, CCDH tests found.<\/p>\n\n\n\n<p class=\"has-medium-font-size\">\u201cGuardrails for these tools are so severely lacking \u2014 and the level of skill needed to use them is now so low \u2014 that these platforms can be easily manipulated by virtually anyone to produce dangerous political misinformation,\u201d CCDH CEO Imran Ahmed said in a statement.<\/p>\n\n\n\n<p class=\"has-medium-font-size\">Mimic voices have already been used to influence voters in the 2024 election. 
During the New Hampshire Democratic primary in February, robocalls using a fake Biden voice&nbsp;<a href=\"https:\/\/thehill.com\/policy\/technology\/4424803-fake-biden-robocall-tip-of-the-iceberg-for-ai-election-misinformation\/\"><u>told voters to stay home<\/u><\/a>&nbsp;in an attempt to decrease voter turnout.<\/p>\n\n\n\n<p class=\"has-medium-font-size\">Steve Kramer, who ran the scheme, said he was inspired by a need to warn the public about the dangers of AI. Last week, he was&nbsp;<a href=\"https:\/\/thehill.com\/policy\/technology\/4681403-joe-biden-fake-robocall-new-hampshire-political-consultant-indicted\/\"><u>charged<\/u><\/a>&nbsp;with 13 counts each of felony voter suppression and misdemeanor impersonating a candidate. He was also fined $6 million by the Federal Communications Commission (FCC).<\/p>\n\n\n\n<p class=\"has-medium-font-size\">The FCC&nbsp;<a href=\"https:\/\/thehill.com\/policy\/technology\/4442156-fcc-targets-ai-generated-robocalls-after-biden-primary-deepfake\/\"><u>banned the use<\/u><\/a>&nbsp;of AI voices in phone calls after the New Hampshire primary incident, and the commission\u2019s chair moved last week to require television ads to disclose the use of AI.<\/p>\n\n\n\n<p class=\"has-medium-font-size\">\u201cAs artificial intelligence tools become more accessible, the commission wants to make sure consumers are fully informed when the technology is used,\u201d FCC Chair&nbsp;<a href=\"https:\/\/thehill.com\/people\/jessica-rosenworcel\/\"><u>Jessica Rosenworcel\u2002<\/u><\/a>said in a statement last week. 
\u201cToday, I\u2019ve shared with my colleagues a proposal that makes clear consumers have a right to know when AI tools are being used in the political ads they see, and I hope they swiftly act on this issue.\u201d<\/p>\n\n\n\n<p class=\"has-medium-font-size\">The CCDH study found that few of the six AI tools it tested \u2014 ElevenLabs, Speechify, PlayHT, Descript, Invideo AI, and Veed \u2014 have any built-in safeguards to protect against generating political disinformation.<\/p>\n\n\n\n<p class=\"has-medium-font-size\">The group tested the tools on a range of politicians\u2019 voices, including Biden and Trump, as well as foreign leaders such as UK Prime Minister Rishi Sunak and French President Emmanuel Macron.<\/p>\n\n\n\n<p class=\"has-medium-font-size\">Examples of the generated messages included Trump warning people not to vote because of a bomb threat, Biden claiming to have manipulated election results, and Macron \u2018confessing\u2019 to the misuse of campaign funds, CCDH said.<\/p>\n\n\n\n<p class=\"has-medium-font-size\">Only one of the tools, ElevenLabs, blocked the production of mimic statements using U.S. and UK politicians\u2019 voices, CCDH found.<\/p>\n\n\n\n<p class=\"has-medium-font-size\">\u201cAI tools radically reduce the skill, money and time needed to produce disinformation in the voices of the world\u2019s most recognizable and influential political leaders,\u201d Ahmed said. \u201cThis could prove devastating to our democracy and elections.\u201d<\/p>\n\n\n\n<p class=\"has-medium-font-size\">\u201cThis voice-cloning technology can and inevitably will be weaponized by bad actors to mislead voters and subvert the democratic process,\u201d he continued. 
\u201cIt is simply a matter of time before Russian, Chinese, Iranian and domestic anti-democratic forces sow chaos in our elections.\u201d<\/p>\n\n\n\n<p class=\"has-medium-font-size\">AI is \u201csupercharging\u201d threats to the election system, technology policy strategist Nicole Schneidman&nbsp;<a href=\"https:\/\/thehill.com\/homenews\/campaign\/4557961-deepfakes-raise-alarm-about-ai-in-elections\/\"><u>told The Hill in March<\/u><\/a>. \u201cDisinformation, voter suppression \u2014 what generative AI is really doing is making it more efficient to be able to execute such threats.\u201d<\/p>\n\n\n\n<p class=\"has-medium-font-size\">AI-generated political ads have already appeared in the 2024 election cycle. Last year, the Republican National Committee released an&nbsp;<a href=\"https:\/\/thehill.com\/homenews\/campaign\/3971120-rncs-ai-generated-biden-attack-ad-puzzles-pundits-democrats\/\"><u>entirely AI-generated ad<\/u><\/a>&nbsp;meant to show a dystopian future under a second Biden administration. It employed fake but realistic photos showing boarded-up storefronts, armored military patrols in the streets and waves of immigrants creating panic.<\/p>\n\n\n\n<p class=\"has-medium-font-size\">In India\u2019s elections, recent AI-generated videos misrepresenting Bollywood stars as criticizing the prime minister exemplify a trend tech experts say is cropping up in democratic elections around the world. CCDH noted similar attempts at election influence in the UK, Slovakia and Nigeria.<\/p>\n\n\n\n<p class=\"has-medium-font-size\">The issue has moved some in Congress to act as well. Sens.&nbsp;<a href=\"https:\/\/thehill.com\/people\/amy-klobuchar\/\"><u>Amy Klobuchar\u2002<\/u><\/a>(D-Minn.) 
and&nbsp;<a href=\"https:\/\/thehill.com\/people\/lisa-murkowski\/\"><u>Lisa Murkowski\u2002<\/u><\/a>(R-Alaska) introduced a bill earlier this year that would require disclosures similar to those in the FCC proposal when AI is used in political advertisements.<\/p>\n\n\n\n<p class=\"has-medium-font-size\"><strong><a href=\"https:\/\/thehill.com\/policy\/technology\/4694369-ai-programs-easily-impersonate-biden-others-manipulate-elections-study\/\">The Hill<\/a><\/strong><\/p>\n","protected":false},"excerpt":{"rendered":"<p>It\u2019s easy for artificial intelligence programs to create mimic voices of politicians like&nbsp;President Biden\u2002and former&nbsp;President Trump,&nbsp;posing the risk of a rise in voter misinformation, according to a study from the Center for Countering Digital Hate (CCDH)&nbsp;released Friday. AI-enabled tools created convincing false statements using the mimic voices about 80 percent of the time, CCDH tests [&hellip;]<\/p>\n","protected":false},"author":5,"featured_media":27888,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[5],"tags":[28752,1613,1430,2771,1202,2872],"class_list":["post-27887","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-politics","tag-artificial-intelligence-programs","tag-election","tag-misinformation","tag-president-biden","tag-research","tag-voters"],"_links":{"self":[{"href":"https:\/\/ustower.net\/index.php?rest_route=\/wp\/v2\/posts\/27887","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/ustower.net\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/ustower.net\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/ustower.net\/index.php?rest_route=\/wp\/v2\/users\/5"}],"replies":[{"embeddable":true,"href":"https:\/\/ustower.net\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=27887"}],"version-history":[{"count":
1,"href":"https:\/\/ustower.net\/index.php?rest_route=\/wp\/v2\/posts\/27887\/revisions"}],"predecessor-version":[{"id":27889,"href":"https:\/\/ustower.net\/index.php?rest_route=\/wp\/v2\/posts\/27887\/revisions\/27889"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/ustower.net\/index.php?rest_route=\/wp\/v2\/media\/27888"}],"wp:attachment":[{"href":"https:\/\/ustower.net\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=27887"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/ustower.net\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=27887"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/ustower.net\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=27887"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}