Maybe you put on "Peppa Pig" or "Sofia the First" on the iPad while you try to get dinner ready or finish a coffee. But for some reason, halfway through your prep, you start to hear threats from a "gangster" version of your kids' favorite pigs. Or, even worse, your kid screams in horror as their beloved characters get assassinated.
That's what happened to Indiana mom Staci Burns, who heard her 3-year-old son cry out, "Mommy, the monsters scare me!" as she was cooking. Turns out, despite her using the YouTube Kids app on the iPad, he ended up watching a 10-minute video called "Paw Patrol Babies Pretend to Die Suicide by Anabelle Hypnotized," where crude versions of "Paw Patrol" characters die in various ways.
Yeah, parents are furious.
The thing is, Burns' experience, which was reported by the New York Times last week, isn't isolated. The article, as well as a deep dive by James Bridle on Medium, got parents talking about the disturbing videos their children have also been seeing on a platform meant to be family-friendly.
There are thousands of these off-brand videos that are violent, sexual or just plain sick. (We won't even link to them because ugh, why give them more views.) Think Spider-Man squeezing large water balloons until they explode in slow motion while he sits in an empty bathtub (seen by an Australian dad and his 3-year-old) or Peppa Pig parodies that have her getting tortured at the dentist or drinking bleach.
"The architecture (that Google and YouTube) have built to extract the maximum revenue from online video is being hacked by persons unknown to abuse children, perhaps not even deliberately, but at a massive scale," wrote Bridle. "These videos, wherever they are made, however they come to be made, and whatever their conscious intention (i.e., to accumulate ad revenue) are feeding upon a system which was consciously intended to show videos to children for profit. The unconsciously generated, emergent outcomes of that are all over the place."
YouTube has responded to reports, which the Verge wrote about as early as February, through a few policy changes. In August, the company announced it would no longer allow creators to monetize videos that used family-friendly characters inappropriately.
“We’re in the process of implementing a new policy that age-restricts this content in the YouTube main app when flagged," Juniper Downs, YouTube’s director of policy, told the Verge. "Age-restricted content is automatically not allowed in YouTube Kids."
The company said it has thousands of people working around the clock looking for inappropriate content and reviewing flagged videos. The new policy, which YouTube said has been in the works for a while and is not a direct response to recent reports, should be live within a few weeks.
In the meantime, rethink your screen rules. Maybe watch YouTube with your kids or have what they're watching in plain sight (say through a TV in the living room). If you see anything disturbing, you can block the video or channel and report it.