Making bullshit copyright claims to get content removed from YouTube remains a tried-and-true way to screw with creators, which is why the company’s plan to overhaul its “three-strikes” policy sounds like the much-needed improvement users have been asking for. But confusingly, YouTube’s new policy update has absolutely nothing to do with copyright.
Today, YouTube announced changes in a blog post to its strikes system as it applies to content flagged for violating the company’s Community Guidelines. Starting February 25, the first time a creator’s content is flagged, they will get a one-time warning and their flagged content will be removed. Prior to this change, there was no warning: a first strike resulted in a 90-day freeze on livestreaming, and a second strike resulted in a two-week freeze on video uploads.
While strikes expire after 90 days, the warning does not. Now, after the warning, YouTube users will receive the first strike, which will put a one-week freeze on their ability to upload new content or livestream. A second strike in a 90-day span will lead to a two-week freeze, and a third strike in that time period will lead to the termination of their channel.
The issue of bogus copyright claims has been around for years, but a recent report from The Verge illustrating how YouTube creators are being extorted with false copyright complaints brought it back to the forefront. YouTubers have also recently posted exasperated Reddit threads about the service’s broken copyright system. But YouTube’s updated policy doesn’t fix crucial flaws in its strikes system as it applies to copyright issues, which are handled by a separate system from the strikes applied to Community Guidelines violations. (To make matters more confusing, YouTube lists copyright violations under its Community Guidelines on its policy page.)
A YouTube spokesperson told Gizmodo that the strike policies for Community Guidelines and copyright are entirely separate, and that YouTube employees don’t get directly involved in copyright disputes. The spokesperson said that if YouTube receives a DMCA takedown notice, the company must legally comply with that request and remove the content. Community Guidelines violations—like harmful, hateful, or violent content, and spam—are reviewed and decided upon by members of the YouTube team, according to the spokesperson.
Which, again, means that YouTube’s updated policy doesn’t fix the thing users have been mad about—copyright abuse.
None of this is to say that Tuesday’s update isn’t an improvement: It cushions the blow with a warning and more clearly outlines the subsequent penalties. But because it doesn’t apply to creators who are being flagged for copyright infringement, the latest strikes system update fails to prevent trolls from making false copyright complaints, and it certainly doesn’t remedy this system’s potentially dangerous appeal process.
If a YouTube creator wants to fight a takedown request by citing fair use, they have to submit a counter-notice containing a lot of personal information: their name, email address, physical address, and phone number, all of which are then sent to the individual who filed the copyright complaint. In instances where someone is being trolled or extorted, that means handing over private information to their abuser.
“We’ll build on this and all the progress we’ve made over the last year by continuing to consult with you as we strengthen enforcement and update our policies,” YouTube wrote in its Tuesday blog post. “We want to make sure they’re easy to understand and address the needs of the global YouTube community.”
But there is still a glaring need that is not being addressed, one that the community has hardly been silent about.
Featured image: Getty