UK adult sites not doing enough to protect children
Smaller adult video-sharing sites based in the UK do not have sufficiently robust access control measures in place to stop children accessing pornography, Ofcom has found in a
new report.
Ahead of our future duties under the Online Safety Bill, Ofcom already has some powers to regulate video-sharing platforms (VSPs) established in the UK, which are required by law to take measures to protect people using
their sites and apps from harmful videos.
Nineteen companies have notified us that they fall within our jurisdiction. They include TikTok, Snapchat, Twitch, Vimeo, OnlyFans and BitChute, as well as several smaller platforms,
including adult sites.
Ofcom is concerned that smaller UK-based adult sites do not have robust measures in place to prevent children accessing pornography. These sites all have age verification measures in place for users who sign up to post content. However, viewers can generally access adult content simply by self-declaring that they are over 18.
One smaller adult platform told us that it had considered implementing age verification, but decided against it because it
would reduce the profitability of the business.
However, the largest UK-based site with adult content, OnlyFans, has responded to regulation by adopting age verification for all new UK subscribers, using third-party tools provided
by Yoti and Ondato.
According to new research we have published today, most people (81%) do not mind proving their age online in general, with a majority (78%) expecting to have to do so for certain online activities. A similar
proportion (80%) feel internet users should be required to verify their age when accessing pornography online, especially on dedicated adult sites.
Over the next year, adult sites that we already regulate must have in place a
clear roadmap to implementing robust age verification measures. If they don't, they could face enforcement action. Under future online safety laws, Ofcom will have broader powers to ensure that many more services are protecting children from adult content.
Some progress protecting users, but more to be done
We have seen some companies make positive changes more broadly to protect users from harmful content online, including as a direct result of being regulated under the
existing laws. For example:
TikTok now categorises content that may be unsuitable for younger users, to prevent them from viewing it. It has also established an Online Safety Oversight Committee, which provides
executive oversight of content and safety compliance specifically within the UK and EU.
Snapchat recently launched a parental control feature, Family Center, which allows parents and guardians to view a list of their child's
conversations without seeing the content of those messages.
Vimeo now allows only material rated 'all audiences' to be visible to users without an account. Content rated 'mature' or left unrated is automatically placed behind the login screen.
BitChute has updated its terms and conditions and increased the number of people overseeing and, if necessary, removing content.
However, it is clear that many platforms are not
sufficiently equipped, prepared and resourced for regulation. We have recently opened a formal investigation into one firm, Tapnet Ltd, which operates the adult site RevealMe, in relation to its response to our information request.
We also found that companies are not prioritising risk assessments of their platforms, which we consider fundamental to proactively identifying and mitigating risks to users. This will be a requirement on all regulated services under
future online safety laws.
Over the next twelve months, we expect companies to set and enforce effective terms and conditions for their users, and to quickly remove or restrict harmful content when they become aware of it. We will review the tools that platforms provide for users to control their experience, and we expect platforms to set out clear plans for protecting children from the most harmful online content, including pornography.