Ofcom to Start Regulating Social Video Platforms from November 1st


UK communications regulator Ofcom has today outlined new rules around harmful content and advertising for UK-based video service providers (VSPs), which it defines as platforms that allow users to upload and share videos online. Ofcom will have jurisdiction over any VSP that primarily bases its European operations in the UK, a category it has previously indicated includes Twitch, TikTok, LiveLeak, Imgur, Vimeo and Snapchat.

The new requirements, which come into effect on November 1st, state that video service providers must take appropriate measures to “protect children (under 18s) from content which might impair their physical, mental or moral development”, according to Ofcom. VSPs must also take measures “to protect the general public from content inciting violence or hatred, and content constituting criminal offences relating to terrorism; child sexual exploitation and abuse; and racism and xenophobia”. The new rules also cover ads which run on VSPs’ content, ensuring that inappropriate ads are not shown to children, and that commercial messages are clearly marked.

The tighter regulations come as a result of the European Union’s updated ‘Audiovisual Media Services Directive’, a piece of legislation designed to harmonise regulation of broadcasters across Europe, and to bring video service providers under broadcaster-like regulation. The UK government has appointed Ofcom to enforce the directive’s principles in the UK, holding video service providers to account and deciding what constitutes ‘appropriate measures’ for dealing with harmful content.

Ofcom, in the guidance published today, has outlined specific steps that VSPs might take to protect against harmful content. Many of these are basic, such as giving users an option to flag whether content is appropriate for children when they upload videos, and giving viewers mechanisms to flag inappropriate content.

But Ofcom says it will make judgement calls on whether VSPs are taking appropriate measures by looking at factors like the size and nature of the VSP in question, the harm that content might cause on the platform, and the nature of the content being restricted.

VSPs Given Time to Comply

Under the new law, Ofcom will have the ability to impose substantial fines on companies that fall short of the new standards. Fines can reach up to five percent of the provider’s annual revenues or £250,000, whichever is greater. And the regulator will be able to start handing out these fines from November 1st.

But Ofcom says it will give VSPs time to get their businesses in order. “Our focus in the early regulatory period will be on working with providers to help them understand the new obligations and discuss any steps that are needed for them to come into compliance,” says Ofcom. “While Ofcom will have the power to take formal enforcement action from 1 November 2020, we expect to prioritise only the most serious potential breaches for formal enforcement action until our full guidance is published next year.”

And Ofcom emphasised that the new rules don’t mean that if harmful content slips through a VSP’s prevention mechanisms, they’ll automatically be fined. “Individual occurrences of illegal or harmful content will not necessarily result in us initiating formal enforcement action,” said the regulator. “However, we will not be precluded from doing so if they highlight a systemic failing in the measures taken by a platform.”
