Google, along with fellow tech giants like Amazon, Facebook, and Twitter, has drawn increasing scrutiny this year over concerns that its “concentrated authority resembles the divine right of kings,” as the New York Times put it. In recent months, it has stumbled by surfacing misinformation and conspiracy theories during crises like mass shootings, and it has become embroiled in the ever-expanding Russian electoral interference scandal. But one particularly disturbing note concerned Google subsidiary YouTube and its YouTube Kids section, which everyone seems to have recently realised was promoting weird-ass, creepy content to children via algorithmically suggested videos and a seeming lack of moderation.
You know, casual stuff like videos of Peppa Pig being tortured by her dentist and drinking bleach, or videos of Spider-Man hitting on bikini babes. While parenting styles vary, most parents would not want their four-year-olds watching this stuff. Other concerns have included terrifying videos featuring actual children in potentially hazardous situations. It’s not a new trend, but it has begun to attract major attention.
Now Google appears to be taking action, TechCrunch reported. In a blog post on Wednesday, YouTube explained that it would be taking a series of actions to crack down on a “growing trend around content on YouTube that attempts to pass as family-friendly, but is clearly not.”
According to YouTube, it has already closed down some 50 channels with content “featuring minors that may be endangering a child, even if that was not the uploader’s intent,” and deleted thousands of videos. It’s also implementing automatic age restriction of videos featuring “family entertainment characters but containing mature themes or adult humour.” The video giant also said it is investing in machine learning and other tools to escalate violations to human moderators.
YouTube also said it would be demonetising all videos including said family entertainment characters depicted in violent or offensive situations—removing at least some of the incentive to make them in the first place—as well as disabling comment sections when its combination of machine and human moderators detects inappropriate or illegal behaviour. Finally, it added that it would work to clarify standards and guidelines for content creators as well as include more “experts” in its policy formation and moderation processes.
This is a lot of talk, but with millions of videos already on the site, eliminating the vast majority of the offending content is probably impossible, and even keeping it away from children is the kind of labour-intensive project that is easier said than done. In the meantime, there’s a simple solution suggested by Gizmodo colleague Adam Clark Estes: maybe don’t let your little kids on YouTube unsupervised. [TechCrunch]