With a theatrical flourish playing to the 'won't somebody think of the children' mob, Ofcom has proposed a set of censorship rules that demand strict age/ID verification for practically every single website that allows users to post content. On top of that, they are proposing the most onerous mountain of expensive red tape seen in the western world. There are a few clever sleights of hand that drag most of the internet into the realm of strict age/ID verification. Ofcom argues that nearly all websites will have child users because 16- and 17-year-old 'children' have more or less the same interests as adults, and so there is no content that is not of interest to 'children'.
And so all websites will have to offer content that is appropriate to children of all ages, or else put in place strict age/ID verification to ensure that content is matched to the user's age.
And at every stage of deciding website policy, Ofcom is demanding extensive justification of the decisions made and proof of the data used in making them. The volume of risk assessments, documentation, research and evidence required makes the 'health and safety' regime look like child's play.
On occasions in the consultation documents Ofcom acknowledges that this will impose a massive administrative burden, but swats away criticism by noting that this is the fault of the Online Safety Act itself, not of Ofcom.
Comment: Online Safety proposals could cause new harms
See article from openrightsgroup.org
Ofcom's consultation on safeguarding children online exposes significant problems regarding the proposed implementation of age-gating measures. While aimed at protecting children from digital harms, the proposed measures introduce risks to cybersecurity,
privacy and freedom of expression.
Ofcom's proposals outline the implementation of age assurance systems, including photo-ID matching, facial age estimation and reusable digital identity services, to restrict access to popular platforms such as Twitter, Reddit, YouTube and Google that might contain content deemed harmful to children.
Open Rights Group warns that these measures could inadvertently curtail individuals' freedom of expression while
simultaneously exposing them to heightened cybersecurity risks.
Jim Killock, Executive Director of Open Rights Group, said:
Adults will be faced with a choice: either limit their freedom of
expression by not accessing content, or expose themselves to increased security risks that will arise from data breaches and phishing sites.
Some overseas providers may block access to their platforms from the UK rather than
comply with these stringent measures.
We are also concerned that educational and help material, especially where it relates to sexuality, gender identity, drugs and other sensitive topics, may be denied to young people by moderation systems.
Risks to children will continue with these measures. Regulators need to shift their approach to one that empowers children to understand the risks they may face, especially where young people may look for
content, whether it is meant to be available to them or not.
Open Rights Group underscores the necessity for privacy-friendly standards in the development and deployment of age-assurance systems mandated by the Online Safety Act. Killock notes: Current data protection laws lack the framework to pre-emptively address the specific and novel cybersecurity risks posed by these proposals.
Open Rights Group urges the government to prioritise comprehensive solutions that incorporate parental guidance and education rather than relying largely on technical measures.