On Musk’s Twitter, users looking to sell and trade child sex abuse material are still easily found

Twitter accounts that offer to trade or sell child sexual abuse material under thinly veiled terms and hashtags have remained online for months, even after CEO Elon Musk said he would combat child exploitation on the platform.
“Priority #1,” Musk called it in a Nov. 20 tweet. He has also criticized Twitter’s former leadership, claiming that it did little to address child sexual exploitation, and said he intended to change things.
But since that declaration, dozens of accounts have continued to post tweets, numbering in the hundreds in aggregate, that use terms, abbreviations and hashtags indicating the sale of what Twitter calls child sexual exploitation material, according to a count of just a single day’s tweets. The signs and signals are well known among experts and law enforcement agencies that work to stop the spread of such material.
The tweets reviewed by NBC News offer to sell or trade content that is commonly known as child pornography or child sexual abuse material (CSAM). The tweets do not show CSAM, and NBC News did not view any CSAM in the course of reporting this article.
Some tweets and accounts have been up for months and predate Musk’s takeover. They remained live on the platform as of Friday morning.
Many more tweets reviewed by NBC News over a period of weeks were published during Musk’s tenure. Some users tweeting CSAM offers appeared to delete the tweets shortly after posting them, seemingly to avoid detection, and later posted similar offers from the same accounts. Some accounts offering CSAM said that their older accounts had been shut down by Twitter, but that they were able to create new ones.
According to Twitter’s rules published in October 2020, “Twitter has zero tolerance towards any material that features or promotes child sexual exploitation, one of the most serious violations of the Twitter Rules. This may include media, text, illustrated, or computer-generated images.”
In an email to NBC News after this article was published, Ella Irwin, Twitter’s vice president of product overseeing trust and safety, said “We definitely know we still have work to do in the space, and certainly believe we have been improving rapidly and detecting far more than Twitter has detected in a long time but we are deploying a number of things to continue to improve.” Irwin asked that NBC News provide the findings of its investigation to the company so that it could “follow up and get the content down.”
It’s unclear just how many people remain at Twitter to address CSAM after Musk enacted several rounds of layoffs and issued an ultimatum that led to a wave of resignations. Musk has engaged some outside help, and the company said in December that its suspensions of accounts for child sexual exploitation had risen sharply. A representative for the National Center for Missing and Exploited Children, the U.S. child exploitation watchdog, said that the number of reports of CSAM detected and flagged by the company has remained unchanged since Musk’s takeover.
