Google is Doing Something about the latest scandal it finds itself in (YouTube/comments/paedophiles), revealing a plan to boost its video content moderation team to as many as 10,000 people. That's 10,000 people who will be watching awful YouTube content, day in, day out, just like the millions of children around the world their paymaster profits from.
We have to ask what this means for the company's AI ambitions, because on the Good News Days Google is happy to tell the world that its super-smart computers can solve the world's problems automatically, all by themselves. Embarrassingly, this is obviously rubbish: the company also needs 10,000 human beings to sit in a factory and make accurate decisions about uploaded video content, because its AI literally can't tell its arse from its elbow.
And with advertisers questioning the tech giant's approach to moderation and threatening to blow down the house of cards Google has built out of blank cheques from multinationals, YouTube CEO Susan Wojcicki has had to step in to insist everything's fine, saying: "Human reviewers remain essential to both removing content and training machine learning systems because human judgement is critical to making contextualised decisions on content."