Ofcom could be given powers to fine harmful sites and apps

13 August 2019   By Dr Lucy Brown, Editor

The Government is considering handing powers to the media regulator to tackle harmful content on popular websites.

Ofcom would be able to fine tech companies up to 5% of their revenue if they failed to comply with its rulings on harmful content on their platforms.

It would be an interim measure until a separate regulator covering online harms is established.

However, as the move is designed to fulfil an EU directive, it may be cancelled if the UK leaves the EU without a deal in October.


What are the proposals?

Under the Government's proposals, Ofcom would be authorised to issue video-sharing apps and websites with multi-million pound fines if they were judged to have failed to prevent young people from viewing harmful content on their platforms.

This would include pornographic and violent content, but also other harmful material that Ofcom judged to be damaging to younger audiences.

Ofcom would be able to fine platforms up to 5% of their revenue, and would also have the authority to block UK access to a platform that failed to comply.

If the proposals go ahead, Ofcom would assume these powers from September 2020, with a view to a separate regulator being appointed later.

Complying with EU regulations

The requirement to regulate tech companies in this way stems from the EU's Audiovisual Media Services Directive (AVMSD), which the UK is currently bound to implement.

If the UK leaves the EU with a deal in place, the AVMSD would likely remain in force at least for the duration of the transition period, so it would be implemented as planned.

However, if no deal is agreed, the UK would not be bound by the directive, and the Department for Digital, Culture, Media and Sport would be unlikely to implement it.

That said, dealing with harmful online content has long been an issue for the Government in the UK, so the essence of the directive may yet be transposed into UK law.

The provisions of the Digital Economy Act 2017 included a requirement for internet service providers (ISPs) to block pornographic websites with no effective age verification measures in place.

However, because the Government failed to notify the European Commission of the policy's details as required, implementation has been delayed until at least the end of 2019. This has sparked frustration amongst campaigners.

On the other side of the argument, opponents of blocking point to the unintended consequences of widespread content filtering, which they say damages legitimate businesses.

Time for a regulator?

Whether these latest plans are implemented or not, there has long been a question mark over whether the UK needs a specialist regulator to police the internet in the same way that Ofcom polices broadcast media.

The Online Harms White Paper consultation ran from April to July 2019 and the results are in the process of being analysed by the Government.

The White Paper set out various options for regulating platforms to tackle online harms, including the creation of a new independent regulatory body, and proposed the services and powers that should fall within its scope.

Further details on the outcome of this consultation are expected soon, although it is unclear whether the result will fit neatly alongside the proposed AVMSD regime or duplicate some of its functions.
