In recent months and years there's been a lot of talk about making internet companies deal with illegal content on their platforms, particularly where extremist material is concerned. Most of that talk has come from the UK and EU, and with Brexit on the way a new report says the UK should hold tech companies liable for illegal content once it has left the European Union.
As it stands, big companies are given a bit of leeway. With the amount of content that gets published on social media platforms, it's impossible for companies like Google and Facebook to actually police it all. Instead they're only obligated to remove something once it's been reported, after which they become liable under EU law. A new report claims that once the UK can make its own laws, those companies should be liable no matter what.
The report is titled Intimidation in Public Life, and has been published by the Committee on Standards in Public Life. It focuses on online threats and intimidation that parliamentary candidates (among other groups) can experience.
“Currently, social media companies do not have liability for the content on their sites, even where that content is illegal. This is largely due to the EU E-Commerce Directive (2000), which treats the social media companies as ‘hosts’ of online content. It is clear, however, that this legislation is out of date.
Facebook, Twitter and Google are not simply platforms for the content that others post; they play a role in shaping what users see. We understand that they do not consider themselves as publishers, responsible for reviewing and editing everything that others post on their sites. But with developments in technology, the time has come for the companies to take more responsibility for illegal material that appears on their platforms.”
The recommendation is that as soon as Brexit happens the government should enact new legislation that isn't hindered by any of the 'safe harbours' offered by the EU's e-commerce directive - provided, that is, the UK is not part of the single market, which would see it beholden to EU law and regulation. The idea is that platforms would take a more active approach to tackling illegal content - including copyright infringement, which is currently 'protected' by the EU's legislation.
“The government should seek to legislate to shift the balance of liability for illegal content to the social media companies away from them being passive ‘platforms’ for illegal content. Given the government’s stated intention to leave the EU Single Market, legislation can be introduced to this effect without being in breach of EU law.
We believe government should legislate to rebalance this liability for illegal content, and thereby drive change in the way social media companies operate in combating illegal behavior online in the UK.”
How this content is going to be policed isn't exactly clear. So much content gets posted on social media platforms that it's virtually impossible to keep up with it all. Algorithms exist that can ease the burden on human moderators, but as we've seen in the past those systems are far from perfect. The EU's legislation giving the companies responsible some breathing room might seem like poor judgement to some, but it feels like an essential concession.
Nobody wants illegal content hanging around online, and some could argue that tech companies should be willing to do more to get rid of it quickly - but there is so much of it that they can't possibly be aware of every last scrap of data at any given time. Then again, asking people in government to understand technology and how it works has proven to be a very big ask in the past. Why change that now? [Gov.uk via TorrentFreak]