Ofcom Set to Oversee Social Media Platforms as ‘Online Harms’ Regulator

UK communications regulator Ofcom is set to be given responsibility for regulating ‘online harms’, ensuring UK users are protected from harmful, offensive and illegal content online. Ofcom’s expanded role will give it oversight of tech companies like Facebook and Google for the first time. The news comes as the UK government lays out its response to the public consultation on the Online Harms White Paper, published last year, which asked what role the government and regulators should play in protecting citizens from harmful content online.

In the Online Harms White Paper the UK government pledged to impose a new “duty of care” on companies distributing user-generated content online, whether through comments, forums, or video sharing. The exact details of what this duty of care will require of companies still need to be worked out. But the government says it will be designed to ensure that all relevant companies have appropriate systems and processes in place to respond to complaints about harmful content, whether that content is illegal, or legal but with the potential to cause harm (for example cyberbullying or hateful comments). Businesses affected by the law will also have to build transparency into their decision-making processes for when they do and don’t take action.

Platforms will be expected to remove illegal content quickly and minimise the risk of it appearing, especially for terrorist content and online child sexual abuse. For legal but potentially harmful content, companies won’t be required to remove or block specific pieces or types of content, but will be expected to explicitly state what content and behaviour they allow on their sites, and to enforce these rules transparently and consistently.

The large social platforms already have systems in place to detect and remove harmful content, but whether those systems are sufficient will depend on the exact law the government ends up passing.

Ofcom will be charged with ensuring companies are following these new rules, and handing out punishments when the rules are broken. The government said in its official statement it is opting to hand new powers to Ofcom rather than setting up a new regulator because of Ofcom’s “organisational experience, robustness, and experience of delivering challenging, high-profile remits across a range of sectors”. It added that “Ofcom’s focus on the communications sector means it already has relationships with many of the major players in the online arena”.

The government estimates the new laws will affect five percent of UK businesses. But it says Ofcom will be charged with taking a “proportionate and risk-based approach”, meaning that the burdens placed on specific companies will depend on their size, and the risk of harm occurring. Social media platforms, where huge volumes of content are distributed to millions of people, will therefore face higher expectations than smaller businesses where the risk of harm is smaller.

The government has not yet decided the types or severity of punishments Ofcom will be able to hand out to companies that aren’t sufficiently protecting their users. The government’s statement said that while Ofcom will be expected to enforce the laws in a “fair, proportionate and transparent way”, it is also important that potential punishments are serious enough to incentivise company executives to take online safety seriously.

And Home Secretary Priti Patel hinted that strong punishments might be necessary to ensure big tech companies comply with the new laws. “It is incumbent on tech firms to balance issues of privacy and technological advances with child protection,” she said. “That’s why it is right that we have a strong regulator to ensure social media firms fulfil their vital responsibility to vulnerable users.”

