Brief comments on two proposed new criminal offences relating to pornography: strangulation / suffocation, and sex (actually or purportedly) between relatives
by Neil Brown
A couple of people have asked me about some of the proposed amendments to the UK's Crime and Policing Bill, which is
currently going through Parliament.
Please note that these are proposed amendments and, as such, they are not (yet) law. They may never become law, or may be changed, materially or otherwise, before they become law.
This blogpost contains sexual themes
As the title suggests, this blogpost is about legislation which has sexual themes. In particular:
strangulation / suffocation
sex between relatives
These offences relate to images, not acts
The proposed new offences which I discuss below relate to images of acts, and not the acts themselves.
They do not, ostensibly, affect the legality (or otherwise) of doing the things depicted in the images.
Pornographic images of strangulation or suffocation
Background
media sources such as pornography have effectively established strangulation during sex as a sexual norm, and a belief that strangling a partner during sex is safe because it is believed to be
non-fatal, despite overwhelming evidence that there is no safe way to strangle a person.
It is an offence for a person to be in possession of an image if--
the image is pornographic, within the meaning of section 63 (i.e. that 'it is of such a nature that it must reasonably be assumed
to have been produced solely or principally for the purpose of sexual arousal'),
the image portrays, in an explicit and realistic way, a person strangling or suffocating another person, and
a reasonable person looking at the image would think that the persons were real.
Neither 'strangling' nor 'suffocating' is defined.
My working assumption is that 'strangulation' entails a depiction of putting something (hands or otherwise) around another person's neck which
applies pressure or compression to the throat.
Within its ordinary meaning, strangulation does not require a particular level of pressure or force; it does not require any injury, nor proof of a consequence such as impeded breathing or circulation.
My feeling is that 'suffocation' covers any means of adversely impacting someone's breathing, or depriving someone of air, making it wider than 'strangulation',
and encompassing what might be termed 'breath play'. It could entail putting something down someone's throat, for instance, or covering their nose and mouth. The CPS guidance suggests - again, in a somewhat different context - a broad interpretation.
Since the offence, as currently posited, requires 'a person strangling or suffocating another person', it would appear that an image of a person strangling / suffocating themselves is not covered. As such, I should be surprised if
this prohibited an image of someone wearing a tie or collar (for instance). This outcome would seem to be consistent with the government's focus on partnered sexual activity and violence against women.
'Image' means both a moving
or still image, and data which is capable of conversion into an image, but the portrayal must be 'realistic', and the people depicted must look 'real' to a reasonable person, for the image to be in scope.
This is an image-based
offence, and does not impact text-based pornography / erotica, although one would still need to be mindful of the law of obscenity.
Note that the existing legislation relating to 'extreme pornography' already covers the 'explicit and realistic' portrayal of 'an act
which threatens a person's life', which could include both strangulation and suffocation. This offence would remain in place.
Proposed defences
Of the proposed defences to the offence of possession,
one is:
that the person directly participated in the act portrayed and the act did not involve the infliction of any non-consensual harm on any person.
It would be a separate
offence to 'publish' such an image, which includes 'giving or making it available to another person by any means'.
One of the proposed defences to the 'publication' offence is:
that the person
directly participated in the act portrayed, the act did not involve the infliction of any non-consensual harm on any person, and the person only published the image to other persons who directly participated.
Non-consent for adults must be distinguished from consent to relinquish control. The presence of a 'gag' or other forms of bondage does not, without more, suffice to confirm that sexual activity was non-consensual.
As far as I know, 'harm' is not, in itself, defined.
While the defence would permit sharing an image with the other participants, it would preclude the private dissemination of such imagery outside the
(direct) participants in it, and would prohibit the sharing of the image online or with social media groups.
Possession or publication of pornographic images of sex between relatives, and images where one person is pretending
to be under 18
A separate amendment relates to the possession or publication of pornographic images of sex between relatives.
I understand that this is pretty common subject matter on some 'tube' sites.
The first of
these vital measures will ban anyone from possessing or publishing harmful pornography that shows incest between family members, and sex between step or foster relations where one person is pretending to be under 18.
A further
amendment will criminalise the publication and possession of pornography where an adult is roleplaying as a child.
Because of this 'further amendment', there has been a significant change in the amendment between the
House of Lords and the House of Commons.
House of Lords proposed offence
The House of Lords proposed a criminal offence of possession or publication of realistic images depicting sexual penetration
of one person by another (my paraphrasing) where:
In other words, while the image may be acted, if the context - the title, description, language used by participants, etc. - indicated that the participants were related, or were pretending to be, and there was sexual
penetration of one person by another, it would fall within the scope of this offence.
Given the presence of 'pretending to be', it is possible that someone could look to make a case that use of a term like 'daddy' was sufficient to make out the offence.
House of Commons proposed offence, including 'under 18'
The House of Commons has objected to this amendment, proposing its own, slightly tweaked, version:
The HoC proposal is for a criminal offence of possession or publication of realistic images depicting sexual penetration of one person by another (again, my paraphrasing) where:
a reasonable
person--
looking at the image, and
taking into account any sound or information associated with the image,
would think what is set out in subsection (1A) or (1B).
1A is:
That A and B were related, or pretending to be related, such that A was related to B as parent
[(including adoptive parent)], grandparent, child, grandchild, brother, sister, half-brother, half-sister, uncle, aunt, nephew or niece.
1B is entirely new, and covers separate subject matter:
That A and B were related or had been related, or were pretending to be related or to have been related, such that A was or had been related to B as step-parent, step-child, stepbrother, stepsister, foster parent or foster child,
and
at least one of A and B was, or was pretending to be, under 18.
As with the offence relating to images of strangulation / suffocation, this is an image-based offence, and does not impact text-based pornography / erotica.
Because of the requirement of multiple
participants ('another person'), images of one person, alone, would appear not to be covered, nor would images (of one or multiple people) which do not depict realistic and explicit penetrative sex.
The comments above about 'non-consensual harm' apply here.
It would appear that, as long as the participants were not actually related, a participant may possess an image in which they pretend
to be related.
In respect of publication, there is an additional proposed limb, that:
the person only published the image to person B or A (as the case may be).
Unlike the drafting in respect of the offences relating to images of strangulation / suffocation, which appear to cater for images depicting more than two participants, I am not sure how the defence proposed here works where there are
multiple simultaneous participants: distribution to all participants, as opposed to one particular participant, could be problematic.
In any case, this too would preclude the private dissemination of such imagery to
non-participants, and would prohibit the sharing of the image online or with social media groups.
In line with other stakeholder groups, academics and public policy institutions, Aylo's assessment is that the Online Safety Act (OSA) has not achieved its intended goal of
protecting minors. Effective February 2, 2026, Aylo will no longer participate in the failed system that has been created in the United Kingdom as a result of the OSA's introduction. Based on Aylo's data and experience, this law and regulatory framework
have made the internet more dangerous for minors and adults and jeopardize the privacy and personal data of UK citizens.
New users in the UK will no longer be able to access Aylo's content sharing platforms, including Pornhub,
YouPorn, and Redtube. UK users who have verified their age will retain access through their existing accounts.
Statement by Alex Kekesi, VP Brand and Community, on behalf of Aylo:
We've made the difficult
decision to restrict access to our sites (user-uploaded content platforms, including Pornhub, YouPorn, Redtube) in the United Kingdom.
As of February 2, 2026, our library of thoroughly moderated and consensual adult entertainment,
on one of the most trusted adult sites in the world, will be restricted. Our sites, which host legal and regulated porn, will no longer be available in the UK to new users, but thousands of irresponsible porn sites will still be easy to access.
Aylo initially participated in the Online Safety Act (OSA) because we wanted to believe that a determined and prepared regulator in Ofcom could take poor legislation and manage to enforce compliance in a meaningful way, while offering
more privacy preserving age assurance methods than we'd seen in other jurisdictions. Despite the clear intent of the law to restrict minors' access to adult content and commitment to enforcement, after 6 months of implementation, our experience strongly
suggests that the OSA has failed to achieve that objective. We cannot continue to operate within a system that, in our view, fails to deliver on its promise of child safety, and has had the opposite impact. We believe this framework in practice has
diverted traffic to darker, unregulated corners of the internet, and has also jeopardized the privacy and personal data of UK citizens.
In October we met with both the government department which authored the OSA, and the UK
regulator responsible for enforcing it, to reiterate our concerns about the law's vulnerabilities. We presented data and continued advocating for a device-based solution and are disappointed that, despite the evidence shared, so little progress has
been made, especially when an alternative and viable solution exists.
Aylo consulted with the regulator and was committed to giving the OSA every chance to succeed, but we believe Ofcom was given an impossible mandate. In our
view, it is clear this is too big a challenge for any regulator to execute within the parameters of the Act. Based on our data and experience, effective enforcement is not possible, circumvention is rampant, privacy is compromised, and new, unregulated
sites quickly fill any gaps left by responsible operators. In other jurisdictions, Aylo has often been one of the only major platforms to comply, only to see traffic diverted to even larger, non-compliant sites. Although larger operators are compliant,
we believe the OSA has created an ecosystem where the vast majority of sites with age-inappropriate content are left unchecked. Users are turning to sites that do not have uploader verification measures and do not moderate content, leading to an
increased risk of exposure to dangerous or illegal content. What is alarming about the top 10 Google and Bing search results for free porn in the UK (as of January 20, 2026) is not just that more than half of the sites lack any age verification, but that the specific sites keep
changing. Search results are constantly replenished with new, non-compliant sites, demonstrating how easily new players can enter the market. This revolving door means the internet remains wide open to unmoderated, unverified and
potentially unsafe content, regardless of how many times authorities attempt to crack down. The longer this goes on, the more minors and adults access these non-compliant sites. As outlined by the Lucy Faithfull Foundation, we know that when faced with
age verification, some adults are choosing riskier, irresponsible sites to avoid age checks. Alternatively, they seek out solutions to circumvent restrictions by using virtual private networks (VPNs) to connect to the same websites via a different
country.
We remain committed to working with the UK, European Commission and other international partners to ensure the lessons learned in the UK inform future policymaking. We continue to believe that to make the internet safer
for everyone, every phone, tablet or computer should start as a kid-safe device. We've seen progress in this space with Apple's recent iOS 26.1 update, which enables built-in content filters to limit adult websites by default on existing minors' accounts
and requires parental consent to disable them. This method blocks access to known adult content websites, cannot be circumvented with VPNs and does not introduce any data privacy risks. This is a big step for online safety that can go even further. We
encourage all device manufacturers to make this the default setting on all devices, not just known minor accounts, to better protect everyone. Laws should mandate that only adults be allowed to unlock access to age-inappropriate content. We are
determined to be part of this solution and want to collaborate with government, civil society and tech partners to arrive at an effective device-based age verification solution.
The BBC makes an
interesting comment with reference to an upcoming censorship law that will ban choking and strangulation content on porn websites. How on earth are foreign porn websites expected to implement such a ban on material that is so commonplace, just for the
UK? Perhaps the answer is simply to self-block in the UK, in the knowledge that keen UK users with a VPN can still access it.
The BBC article notes:
Anti-porn campaigner Prof Clare McGlynn believes Pornhub would prefer VPN usage
to having to regulate or moderate its content more, particularly as the UK looks to restrict more material. The UK government recently announced plans to make online porn showing strangulation or suffocation illegal.
On VPNs being used to get around
checks, social media expert Matt Navara says Pornhub's decision to restrict UK access may be more about creating a legal firewall around restrictions than a protest. He said:
I think blocking UK access lets Pornhub
dodge some of the regulations, skip the costs and still collect the traffic from users they can no longer see.
Ofcom has set out the next steps in its investigation into X, and the limitations of the UK's Online Safety Act in relation to AI chatbots.
Ofcom was one of the first regulators in the world to act
on concerning reports of the Grok AI chatbot account on X being used to create and share demeaning sexual deepfakes of real people, including children, which may amount to criminal offences.
After contacting X on 5 January,
giving it a chance to explain how these images had been shared at such scale, we moved quickly to launch a formal investigation on 12 January into whether the company had done enough to assess and mitigate the risk of this imagery spreading on its social
media platform, and to take it down quickly where it was identified.
Since then, X has said it has implemented measures to try to address the issue. We have been in close contact with the Information Commissioner's Office, which
is launching its own investigation. Other jurisdictions have also launched investigations in the weeks since we opened ours, including the European Commission on 26 January.
Our investigation remains ongoing and we continue to
work closely with the ICO and others to ensure tech firms keep users safe and protect their privacy.
Not all AI chatbots are regulated
Broadly, the Online Safety Act regulates user-to-user services,
search services and services that publish pornographic content.
Chatbots are not subject to regulation at all if they:
only allow people to interact with the chatbot itself and no other users (i.e. they are not user-to-user services);
do not search multiple websites or databases when giving responses to users (i.e. are
not search services); and
cannot generate pornographic content.
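The three exemption conditions above are conjunctive: a chatbot escapes regulation only if every one of them holds, and failing any single condition brings the service into scope. As a loose illustration only (the type and field names below are my own invention, not terms from the Act or from Ofcom's guidance), the test could be sketched as:

```python
from dataclasses import dataclass

# Hypothetical model of the three exemption criteria described in the text.
@dataclass
class ChatbotService:
    users_interact_with_each_other: bool  # would make it a user-to-user service
    searches_multiple_sites: bool         # would make it a search service
    can_generate_porn: bool               # would make it a pornography publisher

def outside_osa_scope(service: ChatbotService) -> bool:
    """A chatbot is unregulated only if ALL three conditions hold;
    any one True attribute pulls the service into the Act's scope."""
    return (not service.users_interact_with_each_other
            and not service.searches_multiple_sites
            and not service.can_generate_porn)

# A closed, single-user chatbot with no pornographic output is out of scope.
print(outside_osa_scope(ChatbotService(False, False, False)))  # True
# One that can generate pornographic content is in scope.
print(outside_osa_scope(ChatbotService(False, False, True)))   # False
```

This also illustrates why, as the text goes on to explain, the standalone Grok service may sit outside Ofcom's investigatory reach on the illegal-images point while still facing questions on age checks.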
We are not investigating xAI at this time.
When we opened our investigation into X, we said we were assessing whether we should also investigate xAI, as the provider of the standalone Grok service. We
continue to demand answers from xAI about the risks it poses. We are examining whether to launch an investigation into its compliance with the rules requiring services that publish pornographic material to use highly effective age checks to prevent
children from accessing that content.
Because of the way the Act relates to chatbots, as explained above, we are currently unable to investigate the creation of illegal images by the standalone Grok service in this case.
Where we are in our X investigation
In our investigation into X, we are currently gathering and analysing evidence to determine whether X has broken the law, including using our formal
information-gathering powers. The week after we launched our investigation, we sent legally binding information requests to X, to make sure we have the information we need from the company, and further requests continue to be sent.
Firms are required, by law, to respond to all such requests from Ofcom in an accurate, complete and timely way, and they can expect to face fines if they fail to do so.
We must give any company we investigate a
full opportunity to make representations on our case. If, based on the evidence, we consider that the company has failed to comply with its legal duties, we will issue a provisional decision setting out our views and the evidence upon which we are
relying. The company will then have an opportunity to respond to our findings in full, as required by the Act, before we make our final decision.
We know there is significant public interest in our investigation into X. We are
progressing the investigation as a matter of urgency. We will provide updates and will be as open as possible during this process. It is important to note that enforcement investigations such as these take time -- typically months.
We must follow strict rules about how and when we can share information publicly, as is the case for any enforcement agency, and it would not be appropriate to provide a running commentary about the substantive details of a live
investigation. Running a fair process is essential to ensuring that any final decisions are robust, effective, and that they stick.
While in the most serious cases of ongoing non-compliance we can apply for a court order requiring
broadband providers to block access to a site in the UK, the law sets a high bar for such applications, and a specific process must be followed before we can do this. It would be a significant regulatory intervention and is not one we are likely to make
routinely, given the impact it could have on freedom of expression in the UK.
A new survey of 1,469 adults, conducted by the child-protection-focused Lucy Faithfull Foundation, has found that 45% of adults who don't want to verify their identities to access porn have turned to using sites without age checks. In addition,
29% have used Virtual Private Networks to bypass age checks on sites that have them.
Kerry Smith, CEO of the Foundation, said:
It's highly concerning that age verification measures are not being implemented on
certain platforms. Safeguards on pornography sites are essential to protect children from accessing pornography, which we know, if viewed at a young age, can normalise harmful sexual behaviours and leave children more vulnerable to grooming from
predators.
There needs to be strong enforcement of the Online Safety Act to ensure robust and meaningful safety measures are put in place on pornography platforms, including the use of deterrence messaging and signposting for
adults to appropriate support services.
We would also encourage the government to bring in even more robust legislation, so online pornography is treated just as it is in the offline world.
An Ofcom spokesperson said:
Change is happening, and the tide on online safety is beginning to turn for the better. Last year saw important changes for people, with new measures across many sites and apps now better protecting UK users from
harmful content, particularly children. But we need to see much more from tech companies this year, and we'll use our full powers if they fall short.
Ofcom does have the power to impose significant financial fines, although there
remains a question mark as to how much impact this will have on non-UK based sites. The regulator could also ask broadband ISPs and mobile operators to block the sites at network-level, although this would have little impact on VPN users.
Overall,
it's hardly surprising or controversial that many adults do not want to have to share their private personal or financial details with unknown and unregulated third-party age verification providers, particularly when those services are associated with
porn peddlers. The infamous Ashley Madison hack showed just how dangerous such information can be in the wrong hands (countless cases of blackmail, suicide, etc.).