YouTube #Adpocalypse Reemerges Following Pedophile Ring Scandal

If you’re wondering which adblocker poses the greatest threat to YouTube’s economy, it’s actually YouTube itself. This week, the online video-sharing monopoly faced another round of high-profile advertisement purges after a new series of videos from Matt Watson showed how anonymous users were abusing the platform’s tools to foster their own underbelly of child exploitation, making a few bucks from the illegal content in the process.

The story caught mainstream media attention after a report from Bloomberg revealed that some of YouTube’s biggest advertisers, such as Disney, McDonald’s, Nestlé and Epic Games (the developers behind the highly popular children’s title Fortnite), had each decided to “withhold its spending” on the platform following Watson’s breakthrough exposé. The scandal centers on how users routinely troll the comments of videos in which young girls are shown performing gymnastics, dancing, trying on new clothes and other material treated as suggestive, creating a “wormhole for pedophiles” in which to satisfy their monstrous sexual urges.

“How does this exist?” Watson railed — visibly distraught by the platform’s continued algorithmic failures — soon advising his audience to rally under the new online movement #YouTubeWakeUp. “How is it that there are people who are genuine good… where every algorithm under the sun detects when they swear more than two times or make videos about panic attacks and depression… yet this shit is going on? For me, I want nothing to do with this platform that’s supporting this [shit]. It’s been in the public consciousness for over two years and nothing is being done. It’s disgusting.”

Initially, YouTube tried to ride out the scandal without comment. Only a few days later did a spokeswoman for the company release a statement claiming it had terminated the accounts and channels shown in the video, disabled “violative comments” on the specific content and reported any illegal activity to law enforcement. According to The Verge, these videos generated only a collective $8,000 in ad spending, which the platform has promised to refund in due time.

This gesture, however, amounts to simple content oversight of a select handful of videos; it doesn’t address the systemic problems behind the wormhole’s funding that Watson’s video actually focuses on. The buck doesn’t stop at one perverted consumer; it’s the whole black market the site enables. “Any content, including comments, that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube,” the spokeswoman continued. “There’s more to be done, and we continue to work to improve and catch abuse more quickly.”

The fallout echoes the disaster that followed the first #Adpocalypse in 2017, when The Wall Street Journal reported that mainstream advertisements were appearing on videos from hateful white supremacists, violent extremists across the political spectrum and the child-exploitative content known as #ElsaGate. This led to an advertiser boycott demanding that YouTube reform its content oversight and policy enforcement, resulting in more freedom of (dis)association for advertisers and a bureaucratic, blacklisting nightmare for many creators.

Often this dynamic manifested as further restrictions on speech, with economic sanctions placed on everything from coverage of controversial topics to literal dirty words, most of which the company admitted is handled through the use of A.I. moderators. Human judgement is usually required only once users manually dispute a claim with the platform. We’re also supposed to ignore the inherent conflicts of interest in these market-style kangaroo courts, where disputes are handled by the company’s own administrators, whom audiences are supposed to trust as good-faith actors merely enforcing policy fairly rather than employees who have to maintain YouTube’s profits.

“We’ve always used a mix of human reviewers and technology to address violative content on our platform,” the company said in a statement we reported on last year. “In 2017, we started applying more advanced machine learning technology to flag content for review by our teams. This combination of smart detection technology and highly-trained human reviewers has enabled us to consistently enforce our policies with increasing speed.” If we’re to believe YouTube’s own statistics, the process is so efficient that over 75% of the content flagged over a four-month period (amounting to some 50 million videos) was deleted before receiving a single audience view, all of it handled by A.I. systems, and yet the wormholes persist until grassroots exposure forces a response.

So where does YouTube go from here? Our articles have suggested that YouTube further evaluate its administrative codes of conduct, which should be drafted and enforced by an outsourced, impartial body the site funds but does not control, showcasing a serious commitment to tackling the economy of child exploitation. YouTube’s current methods under the “trusted flagger” program have a unique way of flagging criticism of governments such as Israel, Venezuela and the United States while a hub of pedophiles lies dormant. Shouldn’t these issues of genuine illegality command more resources than continued promises to curate the ever-changing meme of “fake news”? Doesn’t ignoring the story for optics’ sake keep the narrative controlled by censorship tyrants?

It’s easy for critics to demand that YouTube simply seize the means of content production with an iron fist of oversight, or that advertisers abandon the sinking ship altogether, as Watson’s impassioned videos seem to advocate. It’s equally easy to demand the media simply ignore these underbelly issues to avoid another marketing fallout, as some YouTube creators have argued; Keemstar, the host of DramaAlert, admitted he was avoiding coverage of the scandal so as not to “hurt the community.”

What these radical views sidestep is the difficult task of moderation, which leaves YouTube the overbearing job of deciding how it will curb the rise of this abusive dark net gone public while maintaining the rights and liberties of its contributors. When it comes to such obvious child abuse, YouTube should have no reservations about serving justice against pedophiles, even if its lack of checks and balances elsewhere remains a touchy subject for debate.

Thanks for reading! This article was originally published for, a bipartisan media platform for political and social commentary, truly diverse viewpoints and facts that don’t kowtow to political correctness.

Bailey Steen is a journalist, graphic designer and film critic residing in the heart of Australia. You can also find his work right here on Medium and publications such as Janks Reviews.

For updates, feel free to follow @atheist_cvnt on his various social media pages on Facebook, Twitter, Instagram or Gab. You can also contact through for personal or business reasons.

Stay honest and radical. Cheers, darlings. 💋
