“I have to tell you, our son is gone.”
These are words no parent should ever have to hear or speak. But this is what one father had to tell his wife over the phone one evening.
Their teenage son had died by suicide after being sexually exploited on Instagram. The exploiters had used surreptitiously obtained nude images to blackmail him, demanding money or else they would post those images online for all to see. They also threatened to hurt or kill his parents.
“You might as well end it now,” they had told him at one point.
And that’s what the boy did.
This devastating true story, from which the names have been omitted for the privacy of the family, is unfortunately all too common. It illustrates the horrific reality of the dangers children face on social media and other tech platforms.
Tragically, a rapidly growing number of children have died or been deeply traumatized by social media apps that are dangerous by their very design. Online, children face sexual extortion threats, unwanted sexual advances, and harmful content aggressively fed to them by powerful algorithms, along with features that make it easier for predators to find and connect with minors. It is hard to imagine another time in history when children could so easily be groomed for sexual exploitation.
The Internet Watch Foundation has recently lamented that “there has never been a worse time to be a child online.” Its hotline has been finding record amounts of child sex abuse material (the more apt term for child pornography) and “shocking” numbers “like never before” of exploited children.
The National Center for Missing & Exploited Children received more than 32 million reports of child sexual abuse content online in 2022. And the FBI has been issuing repeated warnings about the growing threat of sexual extortion.
Big Tech knows the dangers kids face on its platforms, but it has been slow to fix them, or ignores them outright. This puts children's well-being at grave risk, all to pad its bottom line.
Meta documents about child safety were unsealed as part of a lawsuit brought by New Mexico’s Attorney General. They reveal that Meta estimated back in 2021 that 100,000 children per day received sexually explicit content, such as photos of adult genitalia.
During a congressional hearing last November, whistleblower Arturo Bejar, a former Meta security consultant, shared that Meta only acts on 2 percent of user reports, which can include reports of sexual harassment or sexual advances from strangers on Instagram. The Wall Street Journal reported that one in eight users under age 16 “said they had experienced unwanted sexual advances on [Instagram] over the previous seven days.”
Other platforms are no better. In October 2023, Discord announced that it would proactively blur sexually explicit content for teen users as the default, becoming the first social media company to say it would do so. But months later, these changes have yet to be implemented. Discord also deceptively claims that it proactively detects and removes child sexual abuse material, but according to the eSafety Commissioner’s transparency report, Discord does not monitor or provide an in-app reporting mechanism across all areas of its platform.
Snapchat’s signature disappearing-message technology has made it a favorite platform for sexual predators to exploit young users. It has been cited as the number one platform used for sexual extortion. And while Snap has made some platform changes, none of them addresses the very feature predators most exploit.
X (formerly Twitter) was just flagged in parent control app Bark’s annual report as second only to Kik for “severe sexual content.”
And there are concerns about TikTok’s effectiveness in moderating and reporting child sexual abuse material. According to Thorn, TikTok was tied with X as one of the top five platforms where the greatest number of minors reported having had an online sexual experience (along with Snapchat, Facebook, Instagram, and Messenger).
CEOs from Meta, Snap, X, Discord, and TikTok will testify before Congress on Jan. 31 about how their platforms facilitate child sexual abuse material. These executives will testify that they are doing all they can to stop child sexual abuse. We will most assuredly hear more lies and excuses.
The truth is that, despite all the tools and so-called safety changes they will parade and promote, more and more children are being harmed on their platforms. Big Tech has proven either unwilling or incapable of keeping kids safe online. And whichever one of these it is, we should all be terrified.
Big Tech cannot be relied upon to do the right thing voluntarily. It is long past time for Congress to hold these companies accountable, and to pass legislation that will force Big Tech to protect our children in earnest.
Congress has before it several child protection bills. Despite the growing evidence of harm to children online, Congress has yet to pass any legislation to prevent these harms.
We have all heard enough excuses. Children are being threatened, harassed, and exposed to extreme harms online, to the point where an alarming number believe that ending their lives is the only option. If this fact alone doesn’t compel Congress to impose even the most basic protections, then I fear nothing will.
Lina Nealon is vice president and director of corporate advocacy for the National Center on Sexual Exploitation.