By Aaron Kesel
YouTube has been caught in the crosshairs of another scandal; this time the platform is being accused by YouTuber Matt Watson of enabling a softcore pedophile ring in plain sight, Tech Crunch reported.
According to Watson’s Reddit post entitled “Youtube is facilitating sexual exploitation of minors”:
Over the past 48 hours, I have discovered a wormhole into a soft-core pedophilia ring on Youtube. Youtube’s recommendation algorithm is facilitating pedophiles’ ability to connect with each other, trade contact info, and link to actual child pornography in the comments. I can consistently get access to it from vanilla, never-before-used YouTube accounts via innocuous videos in less than ten minutes, in sometimes less than five clicks. I have made a twenty-minute Youtube video showing the process, with video evidence that these videos are being monetized by big brands like McDonald’s and Disney.
This is significant because Youtube’s recommendation system is the main factor in determining what kind of content shows up in a user’s feed. There is no direct information about how exactly the algorithm works, but in 2017 Youtube got caught in a controversy over something called “Elsagate,” where they committed to implementing algorithms and policies to help battle child abuse on the platform. There was some awareness of these soft core pedophile rings as well at the time, with Youtubers making videos about the problem.
I also have video evidence that some of the videos are being monetized. This is significant because Youtube got into very deep water two years ago over exploitative videos being monetized. This event was dubbed the “Ad-pocalypse.” In my video I show several examples of adverts from big name brands like Lysol and Glad being played before videos where people are time-stamping in the comment section. I have the raw footage of these adverts being played on inappropriate videos, as well as a separate evidence video I’m sending to news outlets.
It’s clear nothing has changed. If anything, it appears Youtube’s new algorithm is working in the pedophiles’ favour. Once you enter into the “wormhole,” the only content available in the recommended sidebar is more softcore sexually-implicit material. Again, this is all covered in my video.
One of the consistent behaviours in the comments of these videos is people time-stamping sections of the video when the kids are in compromising positions. These comments are often the most upvoted posts on the video. Knowing this, we can deduce that Youtube is aware these videos exist and that pedophiles are watching them. I say this because one of their implemented policies, as reported in a blog post in 2017 by Youtube’s vice president of product management Johanna Wright, is that “comments of this nature are abhorrent and we work … to report illegal behaviour to law enforcement. Starting this week we will begin taking an even more aggressive stance by turning off all comments on videos of minors where we see these types of comments.” However, in the wormhole I still see countless users time-stamping and sharing social media info. A fair number of the videos in the wormhole have their comments disabled, which means Youtube’s algorithm is detecting unusual behaviour. But that raises the question as to why Youtube, if it is detecting exploitative behaviour on a particular video, isn’t having the video manually reviewed by a human and deleting the video outright. A significant number of the girls in the videos are pre-pubescent, which is a clear violation of Youtube’s minimum age policy of thirteen (and older in Europe and South America). I found one example of a video with a prepubescent girl who ends up topless midway through the video. The thumbnail is her without a shirt on. This is a video on Youtube, not unlisted, and is openly available for anyone to see. I won’t provide screenshots or a link, because I don’t want to be implicated in some kind of wrongdoing.
I want this issue to be brought to the surface. I want Youtube to be held accountable for this. It makes me sick that this is happening, that Youtube isn’t being proactive in dealing with reports (I reported a channel and a user for child abuse, 60 hours later both are still online) and proactive with this issue in general. Youtube absolutely has the technology and the resources to be doing something about this. Instead of wasting resources auto-flagging videos where content creators “use inappropriate language” and cover “controversial issues and sensitive events” they should be detecting exploitative videos, deleting the content, and enforcing their established age restrictions. The fact that Youtubers were aware this was happening two years ago and it is still online leaves me speechless. I’m not interested in clout or views here, I just want it to be reported.
Watson also posted an in-depth video explaining how pedophiles are able to manipulate YouTube’s video recommendation algorithm to redirect a search for “bikini haul” videos, featuring adult women, to exploitative clips of children participating in sexually suggestive behavior — such as posing in front of mirrors and doing gymnastics and “yoga stretching.”
Then there are the innocuous videos with inappropriate comments from pedophiles, including some with timestamps capturing children in compromising positions. It is absolutely horrific and sickening to know this type of behavior goes on behind the scenes at YouTube while the company does nothing, yet it will delete and demonetize Activist Post or other news outlets on its platform.
Since then, several companies have responded to the now viral video and Reddit post by suspending advertising on the platform, including Nestlé, Epic, and reportedly Disney and McDonald’s.
Nestlé told CNBC that all of its companies in the U.S. have paused advertising on YouTube, while a spokesperson for Epic, maker of the massively popular game Fortnite, said it has suspended all pre-roll advertising. Other companies that confirmed publicly they are pausing YouTube advertising include Purina, GNC, Fairlife, Canada Goose, and Vitacost. Bloomberg and The Wall Street Journal report that Walt Disney Co. and McDonald’s have also pulled advertising on the video hosting website as well.
“Any content – including comments – that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling comments on tens of millions of videos that include minors. There’s more to be done, and we continue to work to improve and catch abuse more quickly,” YouTube said in a statement to Tech Crunch.
YouTube also stated that it reported the comments to the National Center for Missing and Exploited Children and will be taking further actions against child exploitation, including hiring more experts on its platform.
This comes after a series of reports two years ago that videos espousing hateful or extremist views on YouTube were being monetized. This, of course, led to the “adpocalypse,” which also caught alternative grassroots independent media in its grasp, including Activist Post, as the outlet reported.
As a result of the adpocalypse scandal, YouTube attempted to appease advertisers, giving them more control over which videos their ads would appear before, and also allegedly enacted more stringent policies for creators. This latest scandal, however, suggests otherwise, or that YouTube’s enforcement has grown lax since then.
YouTube previously said that it would hire at least 10,000 people in 2018 for its surveillance team and was moving faster to shut down inappropriate content.
“We are taking these actions because it’s the right thing to do,” YouTube CEO Susan Wojcicki wrote in a blog post. However, it’s unclear how many people are already part of YouTube’s review team.
Ironically, the issue was first brought to light in 2016 by YouTube user “reallygraceful,” in a now-unavailable video titled “There’s a Pedophile Ring on YouTube.” She received massive pushback, including a hit piece by the BBC‘s Mike Wendling, who trashed her video, Zero Hedge reported.
It’s worth noting that child exploitation on YouTube is absolutely nothing new, and it has even had its own hashtag for quite some time: #Elsagate, named after a character in the 2013 Disney animated film Frozen. Further, one of the most notorious sources of this type of “softcore pedo porn” was a channel known as “Seven Super Girls,” which is still active at the time of this report.
The sexually suggestive videos of Seven Super Girls were first brought to light by comedian Daniel Tosh in a 2017 segment on Tosh.0. As The Free Thought Project wrote:
Tosh staged a “To Catch a Predator” style spoof on the type of viewers who were watching the seven girls’ videos. You guessed it … pedophiles. Unfortunately, their videos are extremely popular and serve as an indictment of the world’s pedophilic attractions.
The channel has nearly 3,000 videos uploaded. But this is no ordinary children’s show, and without a doubt, 1980s-era censors would have driven themselves mad in an attempt to shut down the perversion.
At first glance, the Seven Super Girls YouTube homepage, arguably, looks like one’s favorite porn site. Each under-18 girl has her own subchannel. To the unwitting, however, the site may look like girls dressed like girls, engaging in activities which girls enjoy — going to camp, hanging with friends by the pool, and playing dress up.
But to a pedophile, the site is a smorgasbord of smut, carefully crafted to serve as eye candy for adults and teenagers to indulge in their child-sex fantasies. After we clicked on the entire list of videos and selected to sort by most popular, it became clear to us at The Free Thought Project, the videos are in no way innocent.
This comes as Europe is trying to impose a draconian law on the Internet dubbed the Upload Filter or ACTA 2 under Article 13, which would be a perfect solution to stop these types of videos from being uploaded. Although Europe’s filter is focused on copyright, maybe YouTube should consider implementing a pedophile filter and taking down videos of children in bathing suits.
Further back, in 2013, an article by The Daily Dot exposed the dark underbelly of YouTube’s pedophilia problem. The publication told the horror story of a 12-year-old girl named Emily who was contacted by a fake talent agency known as Ikon Modeling.
Emily was approached by a man who identified himself as William, a recruiter with the modeling agency. The Daily Dot‘s in-depth investigation later found that the man had contacted other young girls with the same promises, and that his account had been active since at least 2012.
Reddit also shut down a forum devoted to sexualized images of minors, the subreddit r/jailbait, in October 2011; however, it is survived by r/legalteens, which posts pictures of girls who are 18, though a few photos of underage teens purportedly fall through the cracks, according to The Daily Dot.
YouTube has said that it has disabled comments on “tens of millions of videos” and removed at least a thousand channels in the days since the scandal broke. Matt Watson’s video, entitled “Youtube is Facilitating the Sexual Exploitation of Children, and it’s Being Monetized (2019),” has a total of 2,014,261 views at the time of this writing. Meanwhile, YouTube has released a blog update describing changes to its policies and how it will handle content that “crosses the line.”
In a comment on a video published by wildly popular YouTube star Philip DeFranco, YouTube’s creator relations team said the company’s staff are working “incredibly hard to root out horrible behavior,” and have “reported illegal comments to law enforcement.”
TLDR: Disabled comments on tens of millions of videos. Terminated over 400 channels. Reported illegal comments to law enforcement. pic.twitter.com/zFHFfkX9FD
— Philip DeFranco (@PhillyD) February 21, 2019
If YouTube digs deeper, it will find what Tumblr recently found — that pedophiles and predators were sharing usernames for encrypted messengers like Wickr, Telegram, WhatsApp, and other encrypted apps on open Web platforms (not even the darknet), meeting up and then exchanging explicit content. This led to the Tumblr app being briefly removed from the Apple iOS and Android app stores before being added back, but the blowback caused Tumblr to go the “safe for work” route.
If you no longer wish to support YouTube, there are several alternatives, though their user numbers just don’t come close. Two alternatives you can use are Bitchute.com and DTube (a front end built on the widely popular blogging platform Steemit.com).
Aaron Kesel writes for Activist Post. Support us at Patreon. Follow us on Minds, Steemit, SoMee, BitChute, Facebook and Twitter. Ready for solutions? Subscribe to our premium newsletter Counter Markets.