So peer Floella Benjamin attempts to revive porn age verification censorship because porn viewing is just one step away from park murder
17th March 2021

14th March 2021. See article from bills.parliament.uk
The pro-censorship member of the House of Lords has tabled the following amendment to the Domestic Abuse Bill to reintroduce the internet porn censorship and age verification requirements previously dropped by the government in October 2019. Amendment 87A introduces a new clause:

Impact of online pornography on domestic abuse

(1) Within three months of the day on which this Act is passed, the Secretary of State must commission a person appointed by the Secretary of State to investigate the impact of access to online pornography by children on domestic abuse.

(2) Within three months of their appointment, the appointed person must publish a report on the investigation which may include recommendations for the Secretary of State.

(3) As part of the investigation, the appointed person must consider the extent to which the implementation of Part 3 of the Digital Economy Act 2017 (online pornography) would prevent domestic abuse, and may make recommendations to the Secretary of State accordingly.

(4) Within three months of receiving the report, the Secretary of State must publish a response to the recommendations of the appointed person.

(5) If the appointed person recommends that Part 3 of the Digital Economy Act 2017 should be commenced, the Secretary of State must appoint a day for the coming into force of that Part under section 118(6) of the Act within the timeframe recommended by the appointed person.

Member's explanatory statement: This amendment would require an investigation into any link between online pornography and domestic abuse with a view to implementing recommendations to bring into effect the age verification regime in the Digital Economy Act 2017 as a means of preventing domestic abuse.

Update: Defeated 17th March 2021. See article from votes.parliament.uk

The amendment designed to resurrect the age verification clauses of the Digital Economy Act 2017 was defeated by 242 votes to 125 in the House of Lords. The government minister concluding the debate noted that the new censorship measures included in the Online Harms Bill are more comprehensive than the measures under the Digital Economy Act 2017. He also noted that although the upcoming censorship measures would take significant time to implement, reviving the old censorship measures would also take time. In passing, the minister also explained that one of the main failings of the act was that site blocking would not prove effective, as porn viewers could easily evade ISP blocks by switching to encrypted DNS servers via DNS over HTTPS (DoH). Presumably government internet snooping agencies don't fancy losing the ability to snoop on the browsing habits of all those wanting to continue viewing a blocked porn site such as Pornhub.
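For readers curious why DoH defeats DNS-level blocking so easily: a DoH client wraps an ordinary DNS query in standard HTTPS traffic to a public resolver, so the ISP's own resolver (where the block list lives) is never consulted. The minimal Python sketch below only constructs an RFC 8484 query URL for Cloudflare's public resolver endpoint; it does not perform the lookup, and the choice of resolver and the A-record query are illustrative assumptions, not anything from the debate.

```python
import base64
import struct

def doh_query_url(hostname: str, resolver: str = "https://cloudflare-dns.com/dns-query") -> str:
    """Build an RFC 8484 DNS-over-HTTPS GET URL for an A-record lookup.

    The DNS query is assembled in ordinary wire format, then
    base64url-encoded (without padding) into the "dns" parameter.
    """
    # Header: ID=0 (RFC 8484 suggests 0 for HTTP cacheability), flags=RD,
    # QDCOUNT=1, all other counts 0.
    header = struct.pack("!HHHHHH", 0, 0x0100, 1, 0, 0, 0)
    # Question: length-prefixed labels, a terminating zero byte,
    # then QTYPE=1 (A record) and QCLASS=1 (IN).
    question = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in hostname.split(".")
    ) + b"\x00" + struct.pack("!HH", 1, 1)
    dns_param = base64.urlsafe_b64encode(header + question).rstrip(b"=").decode()
    return f"{resolver}?dns={dns_param}"

# The resulting URL is fetched over plain HTTPS on port 443, so an ISP that
# filters only at the DNS level cannot see or block the hostname being looked up.
print(doh_query_url("example.com"))
```

This is why the minister could argue that Digital Economy Act-style ISP blocking would be largely symbolic: evasion requires nothing more than switching the browser's resolver setting.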
25th February 2021

Rather than genuinely tackling the thornier issues, we're seeing calls for more regulations online as a quick fix. By Ruth Smeeth. See article from indexoncensorship.org
Floella Benjamin attempts to resuscitate internet porn age verification in a Domestic Abuse Bill
11th February 2021
See Government statement about age verification (11th January 2021) from questions-statements.parliament.uk

See attempt to resuscitate porn age verification in the Domestic Abuse Bill (10th February 2021) from hansard.parliament.uk
Campaigners have been trying to revive the deeply flawed and one-sided age verification for porn scheme ever since it was abandoned by the Government in October 2019. The Government was asked about the possibility
of restoring it in January 2021 in the House of Commons. Caroline Dinenage responded for the government: The Government announced in October 2019 that it will not commence the age verification provisions of Part 3 of
the Digital Economy Act 2017 and instead deliver these protections through our wider online harms regulatory proposals. Under our online harms proposals, we expect companies to use age assurance or age verification technologies to
prevent children from accessing services which pose the highest risk of harm to children, such as online pornography. The online harms regime will capture both the most visited pornography sites and pornography on social media, therefore covering the
vast majority of sites where children are most likely to be exposed to pornography. Taken together we expect this to bring into scope more online pornography currently accessible to children than would have been covered by the narrower scope of the
Digital Economy Act. We would encourage companies to take steps ahead of the legislation to protect children from harmful and age inappropriate content online, including online pornography. We are working closely with stakeholders
across industry to establish the right conditions for the market to deliver age assurance and age verification technical solutions ahead of the legislative requirements coming into force. In addition, Regulations transposing the
revised Audiovisual Media Services Directive came into force on 1 November 2020 which require UK-established video sharing platforms to take appropriate measures to protect minors from harmful content. The Regulations require that the most harmful
content is subject to the strongest protections, such as age assurance or more technical measures. Ofcom, as the regulatory authority, may take robust enforcement action against video sharing platforms which do not adopt appropriate measures.
Now during the passage of the Domestic Abuse Bill in the House of Lords, Floella Benjamin attempted to revive the age verification requirement by proposing the following amendment: Insert the following new
Clause -- Impact of online pornography on domestic abuse (1) Within three months of the day on which this Act is passed, the Secretary of State must commission a person appointed by the
Secretary of State to investigate the impact of access to online pornography by children on domestic abuse. (2) Within three months of their appointment, the appointed person must publish a report on the investigation which may
include recommendations for the Secretary of State. (3) As part of the investigation, the appointed person must consider the extent to which the implementation of Part 3 of the Digital Economy Act 2017 (online pornography) would
prevent domestic abuse, and may make recommendations to the Secretary of State accordingly. (4) Within three months of receiving the report, the Secretary of State must publish a response to the recommendations of the appointed
person. (5) If the appointed person recommends that Part 3 of the Digital Economy Act 2017 should be commenced, the Secretary of State must appoint a day for the coming into force of that Part under section 118(6) of the Act
within the timeframe recommended by the appointed person.
Member's explanatory statement This amendment would require an investigation into any link between online pornography and
domestic abuse with a view to implementing recommendations to bring into effect the age verification regime in the Digital Economy Act 2017 as a means of preventing domestic abuse.
Floella Benjamin made a long speech supporting the censorship measure and was backed by a number of peers. Of course they all argued only from the 'think of the children' side of the argument, and not one of them mentioned trashed adult businesses or the risk to porn viewers of being outed, scammed, blackmailed, etc.

See Floella Benjamin's speech from hansard.parliament.uk
17th January 2021

Response to the Lords Communications Committee enquiry into freedom of expression online. See article from openrightsgroup.org
Open Rights Group respond to Law Commission proposals to extend hate crime definitions to pander to the easily offended
14th January 2021

See consultation response from openrightsgroup.org

See Law Commission Proposals [pdf] from s3-eu-west-2.amazonaws.com
Open Rights Group writes: This is a short joint submission to the Law Commission's Harmful Online Offences consultation. This submission is by the Open Rights Group and Preiskel & Co LLP solicitors.

Opposing the new offence
We do not support the Law Commission's proposed offence. We are concerned with its breadth. We echo and adopt Article 19's submissions in this regard. The threshold of a "likelihood to harm"
appears to be very broad, and it could include many communications which could cause distress to readers, as the result of their strongly-held religious, political or cultural beliefs, but be legitimate discourse. The
"Intent to harm or awareness of the risk of harming a likely audience" compounds this. "Risk" as a threshold seems very low. It appears to open up prosecution to anyone whose postings can be related to someone who has experienced
mental distress as a result of reading those communications. "Likely audience" again is in our view vague and open to interpretation. Making communications "without reasonable excuse" reverses the normal
burden for speech: speech, protected as a fundamental right, is permissible unless it is unlawful. Speech should not be confined to that which courts feel is most socially useful, and therefore defensible under a "reasonable excuse" defence.
In short, by attempting to capture a wide range of behaviours within a single online offence, with a highly malleable concept of mental distress and wide potential audiences, the offence opens up the potential for a wide
range of legitimate communications to be deemed criminal. Additionally, the problems we identify with the new potential offence may be made worse by the government's proposed Online Harms framework, which will impose a legal
duty on Information Society Services to exercise a "duty of care" over their users. Given that "mental distress" is very personal and driven by context, this ambiguity could exacerbate the legal uncertainties inherent within the
"duty of care" expectations. If the legal test for the point where mental distress triggers criminal liability is difficult to understand, or to assess content against, this is likely to create an incentive for companies to remove legal content
that is found in the grey areas of "likely audiences" experiencing a "risk" of mental distress in order to successfully carry out their legal duties, and avoid direct risk of regulatory action.
...See full consultation response from openrightsgroup.org
MPs line up to call for identity checks for all internet users without giving so much as two seconds' thought to the consequences for businesses and internet users
14th January 2021
See transcript from hansard.parliament.uk

See also debate on Parliament TV from parliamentlive.tv

There was a dreadful debate in Westminster Hall giving the opportunity for a few MPs to call for an end to online anonymity (seemingly so that their online social media critics could be pursued). None of these MPs seems to have spent any time whatsoever considering the downsides to these policies.