Today, YouTube clarified how it plans to handle videos that don’t violate any of its policies but still contain offensive religious and supremacist content: hide them and make sure they can’t make any money.
The news comes as a status report on the promises made by Google general counsel Kent Walker in a June Financial Times op-ed, which announced YouTube was taking several steps to inhibit extremist videos. These steps included investing in machine-learning technology to help identify videos associated with terrorism, increasing the number of “Trusted Flaggers” to identify content that can be used to radicalise terrorists, and redirecting potential extremist recruits to watch counterterrorism videos instead. Walker also wrote that YouTube would take a “tougher stance” on controversial videos that don’t actually violate any YouTube policies.
In a blog post today, YouTube provided a better sense of what that stance entails. Now, when YouTube decides that a flagged video doesn’t break policy but still contains “controversial religious or supremacist content,” the video will be put in a “limited state.” Here, the video will exist in a sort of limbo where it won’t be recommended or monetised. It also won’t include suggested videos or allow comments or likes.
This new approach will apply to desktop versions of YouTube within the next few weeks and on mobile soon after that.
Of course, when YouTube removes a video, it can easily be re-uploaded, or a mirrored copy can spread across other channels. Often, a removal only draws more attention to a video and encourages people to re-upload it, helping it reach a wider audience. So this decision seems like a calculated effort by YouTube to curb the spread of offensive content without fully censoring it.
Right-wing alternative media figures and conspiracy theorists have been complaining for weeks that YouTube is already manipulating its algorithms to limit their reach and revenue. This announcement will likely only add to their resentment of a platform they rely upon to reach their audiences.
YouTube also announced today that it has expanded the list of NGOs it is working with to help determine what content should be hidden. These organisations include the No Hate Speech Movement, the Institute for Strategic Dialogue, and the Anti-Defamation League, which recently drew the ire of far-right outlets and pundits by publishing a list of alt-right and "alt-lite" personalities.
“These organisations bring expert knowledge of complex issues like hate speech, radicalisation, and terrorism that will help us better identify content that is being used to radicalise and recruit extremists,” YouTube stated today. “We will also regularly consult these experts as we update our policies to reflect new trends.”
YouTube did not immediately respond to a request for comment.
The update also touted the success of the machine-learning-driven removal of content, claiming that over the last month, YouTube algorithms have found 75 per cent of policy-violating extremist content before a human was able to flag the videos. [YouTube]