
ElsaGate and the Very Real Dangers of YouTube Kids

By Dustin Rowles | Streaming | January 26, 2018 |


YouTube and its relationship with kids is familiar terrain for us here at Pajiba. I wrote about the bizarre unboxing phenomenon a couple of years ago, and Courtney hilariously wrote about how granting her daughter access to YouTube was essentially like handing her a porn machine. The videos available to young children on YouTube Kids — while often innocuous in the eyes of children — are downright disturbing to parents, as Courtney wrote about in her experiences:

There is a sizable community of Anna/Elsa shippers, most notably on Reddit’s Elsanna community. The content therein can be rather graphic, or it can be fairly innocent. As innocent as sisters fucking can be.

This video, her beloved “flip-flops,” wasn’t terribly explicit. It was just Anna and Elsa sweetly “practicing kissing” in their tiny thong bikinis. You know, like sisters do. But the sudden rush of realization, the impetus for flip-flop time, naked Anna/Elsa hugs and the idea that my child was gleefully watching something created for the express purpose of jerk-off fuel, it was a lot. I took the phone away. And she sobbed hysterically.

“Jules, I don’t think this is a very good video.”
“IT IS! *sob* IT IS A GOOD VIDEO!”
“That’s not how sisters act, babe.”
“IT IS! *sob* IT IS HOW SISTERS ACT!”

I took away the softcore computer animation porn she’d come to love so much. I was a monster.

As Petr brought to my attention this morning, there’s actually a name for these controversial videos: Elsagate, described by Wikipedia thusly: “Elsagate is a neologism referring to the controversy surrounding supposedly child-friendly videos on YouTube and YouTube Kids which contain themes inappropriate for children.”

The problem is widespread enough that there are Reddit communities dedicated to ridding YouTube of these videos. Last November, as we wrote, YouTube finally took action in an effort to eradicate these videos from its service, but it's clear that it has only made a small dent. There are still videos where, for instance, Peppa Pig can be seen drinking bleach or, well, videos like this:

[Screenshot of one such video]

Elsagate, however, is about more than just the existence of these videos; it's about the reasons they exist. Obviously, money is a factor for many, and having a toddler with access to YouTube is tantamount to having your own little automated bot that indiscriminately pecks and clicks videos into the millions of views.

In fact, before I was aware of what was being published on YouTube, I let my twin daughters have access to YouTube Kids for a brief time, reasoning that anything on a service marketed toward "Kids" would be safe for them. Even before I discovered the hazards inherent in the service, however, I had taken away access because my daughters kept coming across Finger Family videos — obnoxious videos where fingers are used as puppets, accompanied by a grating, infectious nursery rhyme that is still stuck in my damn head — and there's a reason they kept running across them: there are 17 million Finger Family videos. 17 million. And my daughters had probably clicked on 16 million of them.

Warning: DO NOT CLICK PLAY.

But those were child's play compared to the millions of videos that aren't exactly porn but ultimately normalize porny behavior — peeing, pooping, getting undressed, kissing your sister, kissing your undressed sister, kissing your undressed sister while she's pooping, etc.

Elsagate conspiracists contend that a lot of these videos are specifically designed by adults to make children more comfortable with the actions of pedophiles, as though grooming them for their inevitable molestations. But the rate at which these videos are being generated suggests that they are most likely being created by bots using kid-friendly algorithms, and that's a problem just as insidious for children.

Blindly using algorithms to create content for kids is a dangerous path. It's mixing and matching popular ideas from videos across the spectrum. A potty-training video set to a nursery rhyme may seem innocuous, and there's nothing particularly nefarious about Anna and Hans sharing a discreet kiss, but the algorithmic combination of those two ideas results in a man kissing a naked toddler on the potty, set to a nursery rhyme. It's how you end up with strange combinations like "Disney Pregnant Selfie Slumber Party," which sounds less like a kids' video and more like a fetish.

Indeed, these algorithmic combinations result in any number of bizarre situations involving dismemberment, murder, urine drinking, or intentional or inadvertent pornographic acts perpetrated by beloved Disney or PBS characters. And listen: seeing Aladdin commit hara-kiri on the john is gonna fuck with your kids' psyches real bad.

All of which is just another reminder to parents to remove YouTube from your devices. Don't sit your kid down with a pirated episode of "Daniel Tiger" on YouTube so you can go make dinner, because the second you look away, she's going to have clicked her way to a video of a belly-inflated Troll doll cannibalizing Barbie while Ken removes his own appendages on the other side of the room. Until YouTube figures its shit out, stick with streaming services that have been curated for your kids: Netflix Kids, Hulu, or PBS Kids.

Dustin is the founder and co-owner of Pajiba. You may email him here, follow him on Twitter, or listen to his weekly TV podcast, Podjiba.