You Need to Know How Pedophiles Are Exploiting Children on YouTube

Parents, take warning: if your kids are uploading videos of themselves to YouTube, they could be unwittingly attracting pedophiles. YouTube has found itself in hot water this week after YouTuber Matt Watson posted a video on February 17th detailing how pedophiles are exploiting the platform with little consequence [Note: this video contains language and content some may find offensive and/or disturbing].

Many of the videos exploiting children were monetized, and multiple major brands have pulled their advertising from YouTube for the time being. A spokesperson for Grammarly told WIRED,

We’re absolutely horrified and have reached out to YouTube to rectify this immediately. We have a strict policy against advertising alongside harmful or offensive content. We would never knowingly associate ourselves with channels like this.

What Are Pedophiles Doing?

In his video, which currently has over 2 million views, Watson explains that pedophiles on YouTube seek out innocuous videos of children, typically little girls. The girls might be sitting and talking, doing gymnastics, or playing outside. The predators then either comment on the original videos or clip parts of the originals into videos of their own.

In the comments beneath these videos, the pedophiles post timestamps directing viewers to “moments of interest” in each video, that is, moments where the children are in positions that could be construed as sexual. Users leave suggestive comments under the videos, saying that a girl is a “goddess,” “beautiful,” and so on. Those are by far the mildest comments, according to what WIRED found.

Watson created his video out of outrage, not merely because YouTube is tolerating this activity, but because the company is actually facilitating it.

It’s no secret that YouTube is a consumer-driven platform. If people watch a video on one topic, when the video is finished, the platform will show related videos users might be interested in watching next. If users don’t choose one or hit “cancel,” YouTube will autoplay one it thinks the viewer will find interesting. The purpose of this, of course, is to get people to stay on the platform longer, thus making YouTube more money.

YouTube also shows related videos in the sidebar and in its main page under “Recommended” videos. The result is, Watson emphasizes, that once people find videos with activity from pedophiles, those are the only kinds of videos YouTube will show them. He says, “Once you enter into this wormhole, now there is no other content available…for whatever reason, YouTube’s algorithm is glitching out to a point that nothing but these videos exist.” This setup enables pedophiles to find more, similar videos, network with each other online, and find child pornography.

YouTube went through a similar controversy with “Elsagate” in 2017. After that, the company was supposedly cracking down on inappropriate content that targeted children.

YouTube published a blog post in November 2017 titled “5 ways we’re toughening our approach to protect families on YouTube and YouTube Kids.” In it, the authors write, “Starting this week we will begin taking an even more aggressive stance by turning off all comments” on videos featuring minors that attract “inappropriate sexual or predatory comments.”

But Watson has found videos of children with the comments disabled. This means YouTube was aware of abusive content targeting minors on those videos (otherwise the comments wouldn’t have been disabled) but did nothing more about it. The company didn’t remove the videos or block the accounts of the people who posted them. Instead, the platform continued to allow its algorithm to recommend related videos of little girls in the sidebar.

“How has YouTube not seen this?” Watson asks.

Watson says that when he pointed out actual child pornography to YouTube, YouTube deleted it. But the company still allowed the accounts associated with those links to exist.

It’s important to be aware that viewers can run across videos promoted by pedophiles in a variety of ways. It could happen while looking for sexually suggestive videos of women, or while searching for something more innocuous, such as terms related to “gymnastics,” “leotards,” or “yoga.”

Watson also points out that predators are flaunting the fact that they are getting away with their behavior, whether by putting the word “pedophile” in their usernames or by using sexually explicit emojis. Watson says he has reported channels with clear evidence of pedophilic activity and YouTube has not deleted them.

Toward the end of his video, Watson says he’s writing up a report and sending it to every news outlet he can think of in the hopes that someone will do something about what he has found. His strategy seems to be working so far. Various major news outlets, such as the New York Times, CNBC, and Bloomberg, are reporting on this story.

Watson says he is done supporting YouTube and that he never wants to post another video on the platform again. His hope is that something will finally be done and the pedophiles will be held accountable:

“It’s been in the public consciousness for over two years, and yet nothing’s being done. I’m disgusted.”