NY Times Report Finds YouTube’s Algorithm May Have Driven Users to Videos That Sexualize Children
YouTube could be encouraging pedophilia through its automated recommendation system, according to a New York Times report.
The recommendation algorithm, which drives billions of views on the platform by suggesting what users should watch next, has been found to recommend videos of children to users right after they watch sexually themed content, according to the Times.
Users don’t even need to search for videos of children to come across them. Users who watch erotic videos may be recommended videos of women who look noticeably younger, then women who appear to be wearing children’s clothes, until they eventually receive recommendations for innocent home videos of girls as young as 5 or 6 playing by a backyard pool in their bathing suits.
In February, after news outlets reported that predators had been using the comment sections of YouTube videos featuring children to guide other pedophiles, the video platform disabled comments on many videos with children in them.
But YouTube has not implemented the one change that would stop its system from steering pedophiles toward innocuous home videos of scantily clad minors: disabling recommendations on videos of children. Recommendations are the platform’s biggest traffic driver, responsible for 70% of views. Turning them off, the company says, would hurt content creators who rely on those clicks, even though YouTube could in theory automatically identify videos with children and disable recommendations on them.
Some studies have found that YouTube’s recommendation system has a “rabbit hole effect”: it leads viewers to progressively more extreme content to keep them hooked. Researchers at Harvard’s Berkman Klein Center for Internet and Society found that when they followed recommendations on videos of women discussing sex, they were led to videos of very young women in underwear or breastfeeding, and eventually YouTube began recommending a stream of partially clothed children in Latin America and Eastern Europe.
Some of the videos include links to the minors’ social media accounts, which opens up a whole slew of predatory dangers. YouTube, however, does not allow anyone under the age of 13 to have a channel, and says it strictly enforces that policy.
Max Fisher, a Times reporter who co-wrote the piece, followed up with this:
YouTube’s algorithm also changed immediately after we notified the company, no longer linking the kiddie videos together.
Strangely, however, YouTube insisted that the timing was a coincidence. When I pushed, YT said the timing might have been related, but wouldn’t say it was.
— Max Fisher (@Max_Fisher) June 3, 2019
Have a tip we should know? [email protected]