Digital Secretary Oliver Dowden and Home Secretary Priti Patel have announced the government's final decisions on new online safety laws.
- New rules to be introduced for nearly all tech firms that allow users to post their own content or interact with others
- Firms failing to protect people face fines of up to ten per cent of turnover or the blocking of their sites, and the government will reserve the power for senior managers to be held liable
- Popular platforms to be held responsible for tackling both legal and illegal harms
- All platforms will have a duty of care to protect children using their services
- Laws will not affect articles and comments sections on news websites, and there will be additional measures to protect free speech
The full government response to the Online Harms White Paper
consultation sets out how the proposed legal duty of care on online companies will work in practice and gives them new responsibilities towards their users. The safety of children is at the heart of the measures.
Social media
sites, websites, apps and other services which host user-generated content or allow people to talk to others online will need to remove and limit the spread of illegal content such as child sexual abuse, terrorist material and suicide content. The
government is also progressing work with the Law Commission on whether the promotion of self-harm should be made illegal.
Tech platforms will need to do far more to protect children from being exposed to harmful content or
activity such as grooming, bullying and pornography. This will help make sure future generations enjoy the full benefits of the internet with better protections in place to reduce the risk of harm.
The most popular social media
sites, with the largest audiences and high-risk features, will need to go further by setting and enforcing clear terms and conditions which explicitly state how they will handle content which is legal but could cause significant physical or psychological
harm to adults. This includes dangerous disinformation and misinformation about coronavirus vaccines, and will help bridge the gap between what companies say they do and what happens in practice.
Ofcom is now confirmed as the
regulator with the power to fine companies failing in their duty of care up to £18 million or ten per cent of annual global turnover, whichever is higher. It will have the power to block non-compliant services from being accessed in the UK.
The legislation includes provisions to impose criminal sanctions on senior managers. The government will not hesitate to bring these powers into force should companies fail to take the new rules seriously - for example, if they do not
respond fully, accurately and in a timely manner to information requests from Ofcom. This power would be introduced by Parliament via secondary legislation, and reserving the power to compel compliance follows similar approaches in other sectors such as
financial services regulation.
The government plans to bring the laws forward in an Online Safety Bill next year and set the global standard for proportionate yet effective regulation. This will safeguard people's rights online
and empower adult users to keep themselves safe while preventing companies from arbitrarily removing content. It will defend freedom of expression and the invaluable role of a free press, while driving a new wave of digital growth by building trust in
technology businesses.
Scope
The new regulations will apply to any company in the world hosting user-generated content online accessible by people in the UK or enabling them to privately or publicly
interact with others online.
It includes social media, video sharing and instant messaging platforms, online forums, dating apps, commercial pornography websites, as well as online marketplaces, peer-to-peer services, consumer
cloud storage sites and video games which allow online interaction. Search engines will also be subject to the new regulations.
The legislation will include safeguards for freedom of expression and pluralism online - protecting
people's rights to participate in society and engage in robust debate.
Online journalism from news publishers' websites will be exempt, as will reader comments on such sites. Specific measures will be included in the legislation
to make sure journalistic content is still protected when it is reshared on social media platforms.
Categorised approach
Companies will have different responsibilities for different categories of
content and activity, under an approach focused on the sites, apps and platforms where the risk of harm is greatest.
All companies will need to take appropriate steps to address illegal content and activity such as terrorism and
child sexual abuse. They will also be required to assess the likelihood of children accessing their services and, if so, provide additional protections for them. This could mean, for example, using age assurance tools to prevent children from accessing platforms which are not suitable for them.
The government will make clear in the legislation the harmful content and activity that the regulations will cover, and Ofcom will set out in codes of practice how companies can fulfil their duty of care.
A small group of companies with the largest online presences and high-risk features, likely to include Facebook, TikTok, Instagram and Twitter, will be in Category 1.
These
companies will need to assess the risk of legal content or activity on their services with "a reasonably foreseeable risk of causing significant physical or psychological harm to adults". They will then need to make clear in their terms and conditions what types of "legal but harmful" content are acceptable on their platforms, and enforce this transparently and consistently.
All companies will need mechanisms so people can easily report harmful content or activity and appeal the takedown of content. Category 1 companies will be required to publish transparency reports about the steps they are taking to tackle online harms.
Examples of Category 2 services are platforms which host dating services or pornography, and private messaging apps. Fewer than three per cent of UK businesses will fall within the scope of the legislation, and the vast majority of in-scope companies will be Category 2 services.
Exemptions
Financial harms, including fraud and the sale of unsafe goods, will be excluded from this framework. This will mean the regulations are clear and manageable for businesses, focus action where there will be most impact, and avoid duplicating existing regulation.
Where appropriate, lower-risk services will be exempt from the duty of care to avoid putting disproportionate demands on businesses. This includes exemptions
for retailers who only offer product and service reviews and software used internally by businesses. Email services will also be exempt.
Some types of advertising, including organic and influencer adverts that appear on social
media platforms, will be in scope. Adverts placed on an in-scope service through a direct contract between an advertiser and an advertising service, such as Facebook or Google Ads, will be exempt because this is covered by existing regulation.
Private communications
The response will set out how the regulations will apply to communication channels and services where users expect a greater degree of privacy - for example, online instant messaging services and closed social media groups, which are still in scope.
Companies will need to consider the impact on user privacy and make sure they understand how their own systems and processes affect people's privacy. Firms could, for example, be required to make services safer by design, such as by limiting the ability of anonymous adults to contact children.
Given the severity of the threat on these services, the legislation will enable Ofcom to require
companies to use technology to monitor, identify and remove tightly defined categories of illegal material relating to child sexual exploitation and abuse. Recognising the potential impact on user privacy, the government will ensure this is only used as
a last resort where alternative measures are not working. It will be subject to stringent legal safeguards to protect user rights.