Musk’s Twitter is Failing to Remove Child Porn, Even Some of the ‘Easiest to Detect and Eliminate’: NY Times Analysis

 
[Photo: Elon Musk. AP Photo/Benjamin Fanjoy, File]

Elon Musk vowed to improve how Twitter combated child porn when he bought the company last year, but according to a new analysis by The New York Times, the platform has not improved and in several troubling ways has actually gotten worse.

“Removing child exploitation is priority #1,” Musk tweeted in November, one of many comments he has made declaring an intention to devote resources to keeping child sexual abuse material (CSAM) from being disseminated and amplified on Twitter.

But CSAM has continued to proliferate on Twitter, according to the Times’ analysis, “including widely circulated material that the authorities consider the easiest to detect and eliminate.”

The Times created a new individual Twitter account and an “automated computer program that could scour the platform for [CSAM] without displaying the actual images, which are illegal to view,” and conducted a review of what content was available on Twitter, how it was presented, and what kind of actions were taken against it.

“The material wasn’t difficult to find,” the Times reported, and in many cases Twitter was actually promoting it through the platform’s “recommendation algorithm — a feature that suggests accounts to follow based on user activity.”

Twitter’s challenges in combating this material stemmed from multiple causes, including the loss of vast swaths of employees who had experience with the problem (either because they were fired in Musk’s mass layoffs or because they quit after his takeover), and the company’s decision to stop paying the anti-child-abuse organization Thorn for detection software that had previously helped automate responses and takedown actions.

Anti-child abuse groups around the world have been monitoring the chatter on dark web forums, according to the Times, and users there have discussed how they can “easily find” CSAM “while avoiding detection”:

On Jan. 12, one user described following hundreds of “legit” Twitter accounts that sold videos of young boys who were tricked into sending explicit recordings of themselves. Another user characterized Twitter as an easy venue for watching sexual abuse videos of all types. “People share so much,” the user wrote.

“If you let sewer rats in,” said Australia’s online safety commissioner Julie Inman Grant, “you know that pestilence is going to come.”

Twitter hasn’t been a complete free-for-all for child porn, the Times acknowledged, citing reports from the company that it had “suspended nearly 300,000 accounts for violating ‘child sexual exploitation’ policies, 57 percent more than usual,” during the first month after Musk’s takeover, followed by another 404,000 accounts suspended in January.

But the troubling content is still readily available, and the Times report highlighted the recommendation algorithm as an especially nefarious element: it promoted images of children who were known abuse victims, images that appear in databases of previously identified CSAM used by anti-abuse groups and platforms’ monitoring systems and staff to identify and take action against such content.

Arguably worse than the mere availability of CSAM on Twitter, the Times’ program also found accounts offering to sell additional content, including accounts that “advertised child rape videos and included links to encrypted platforms” and one that “offered a discounted ‘Christmas pack’ of photos and videos” of “a child who had been abused from about age 8 through adolescence.”

The Canadian Centre for Child Protection did a “broader scan” to compare content available on Twitter with the known CSAM in its database, and found “more than 260 hits, with more than 174,000 likes and 63,000 retweets.”

“The volume we’re able to find with a minimal amount of effort is quite significant,” said Lloyd Richardson, the centre’s technology director. “It shouldn’t be the job of external people to find this sort of content sitting on their system.”

Representatives for the National Center for Missing and Exploited Children also told the Times that the organization’s relationship with Twitter has “suffered” in the wake of Musk’s takeover, citing the “high level of turnover,” delayed responses, and a drop in the company’s reports of abuse material.


Sarah Rumpf joined Mediaite in 2020 and is a Contributing Editor focusing on politics, law, and the media. A native Floridian, Sarah attended the University of Florida, graduating with a double major in Political Science and German, and earned her Juris Doctor, cum laude, from the UF College of Law. Sarah's writing has been featured at National Review, The Daily Beast, Reason, Law&Crime, Independent Journal Review, Texas Monthly, The Capitolist, Breitbart Texas, Townhall, RedState, The Orlando Sentinel, and the Austin American-Statesman, and her political commentary has led to appearances on television, radio, and podcast programs across the globe.