
Online Safety Act


UK Government legislates to censor social media


 

Harming UK internet business with the Illegal Harms Code...

Ofcom publishes another mountain of expensive and suffocating censorship red tape


Link Here 16th December 2024
Full story: Online Safety Act...UK Government legislates to censor social media
Ofcom writes:

Today we are publishing our first major policy Statement for the Online Safety regime.

This decision on the Illegal Harms Codes and guidance marks a major milestone, with online providers now being legally required to protect their users from illegal harm.

Ofcom published proposals about the steps providers should take to address illegal harms on their services shortly after passage of the Online Safety Act in October 2023. Since then, we have been consulting carefully and widely, listening to industry, charities and campaigners, parents and children, as well as expert bodies and law enforcement agencies. With today's publication, online providers must take action to start to comply with these new rules. The result will be a safer life online for people in the UK, especially children.

Providers now have a duty to assess the risk of illegal harms on their services, with a deadline of 16 March 2025. Subject to the Codes completing the Parliamentary process, from 17 March 2025, providers will need to take the safety measures set out in the Codes or use other effective measures to protect users from illegal content and activity. We are ready to take enforcement action if providers do not act promptly to address the risks on their services.

Analysis to follow, but there are over 1,000 pages to get through first!

 

 

Categorised as a mountain of suffocating censorial red tape...

Ofcom proposes definitions for which websites will be subjected to the most onerous censorship rules defined in the Online Safety Act


Link Here 31st March 2024
Full story: Online Safety Act...UK Government legislates to censor social media
Ofcom writes:

Ofcom is seeking evidence to inform our codes of practice and guidance on the additional duties that will apply to some of the most widely used online sites and apps -- designated as categorised services -- under the Online Safety Act.

Under the new laws, all in-scope tech firms must put in place appropriate safety measures to protect users from online harms. In addition, some online services will have to comply with extra requirements if they fall into one of three categories, known as Category 1, 2A or 2B.

These extra duties include giving users more tools to control what content they see, ensuring protections for news publisher and journalistic content, preventing fraudulent advertising and producing transparency reports. Different duties apply, depending on which category a service falls into.

The Act requires us to produce codes of practice and guidance outlining the steps that companies can take to comply with these additional duties. We are inviting evidence from industry, expert groups and other organisations to help inform and shape our approach. A formal consultation on the draft codes and guidance will follow in 2025, taking account of responses to today's call for evidence.

Advice to Government on categorisation thresholds

Alongside this, we have also today published our advice to Government on the thresholds which would determine whether or not a service falls into Category 1, 2A or 2B. We advise that:

Category 1 (most onerous): should apply to services which meet either of the following conditions:

  • Condition 1 - uses a content recommender system; and has more than 34 million UK users on the user-to-user part of its service, representing around 50% of the UK population;

  • Condition 2 - allows users to forward or reshare user-generated content; and uses a content recommender system; and has more than 7 million UK users on the user-to-user part of its service, representing circa 10% of the UK population.

Category 2A: should apply to services which meet both of the following criteria:

  • is a search service, but not a vertical search service;

  • has more than 7 million UK users on the search engine part of its service, representing circa 10% of the UK population.

Category 2B: should apply to services which meet both of the following criteria:

  • allows users to send direct messages;

  • and has more than 3 million UK users on the user-to-user part of the service, representing circa 5% of the UK population.

Taking our advice into consideration, the Secretary of State must set the threshold conditions in secondary legislation. Once passed, we will then gather information, as needed, from regulated services and produce a published register of categorised services.
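To make the proposed thresholds easier to follow, here is a minimal sketch that encodes Ofcom's advised conditions as a short Python function. The Service fields and function name are hypothetical, invented purely for illustration; the binding thresholds will be whatever the Secretary of State eventually sets in secondary legislation.

    # Illustrative sketch only: encodes Ofcom's *advised* categorisation
    # thresholds (March 2024) as simple checks. The Service fields are
    # hypothetical names for this example; the real thresholds are set by
    # the Secretary of State in secondary legislation and may differ.
    from dataclasses import dataclass

    @dataclass
    class Service:
        u2u_uk_users: int = 0             # UK users of the user-to-user part
        search_uk_users: int = 0          # UK users of the search engine part
        has_recommender: bool = False     # uses a content recommender system
        allows_resharing: bool = False    # users can forward/reshare user content
        allows_direct_messages: bool = False
        is_search_service: bool = False
        is_vertical_search: bool = False  # searches only one sector or topic

    def advised_categories(s: Service) -> set:
        cats = set()
        # Category 1: condition 1 (recommender and >34m U2U users)
        # or condition 2 (resharing, recommender and >7m U2U users)
        if (s.has_recommender and s.u2u_uk_users > 34_000_000) or (
            s.allows_resharing and s.has_recommender and s.u2u_uk_users > 7_000_000
        ):
            cats.add("Category 1")
        # Category 2A: general (non-vertical) search with >7m search users
        if s.is_search_service and not s.is_vertical_search and s.search_uk_users > 7_000_000:
            cats.add("Category 2A")
        # Category 2B: direct messaging with >3m U2U users
        if s.allows_direct_messages and s.u2u_uk_users > 3_000_000:
            cats.add("Category 2B")
        return cats

    # Example: a large social network with a recommender feed and direct messages
    print(advised_categories(Service(u2u_uk_users=40_000_000,
                                     has_recommender=True,
                                     allows_direct_messages=True)))
    # -> {'Category 1', 'Category 2B'}

Note that on these advised numbers a single service can fall into more than one category at once: a large social network with a recommender feed and direct messaging, as in the example, would meet both the Category 1 and Category 2B conditions.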

 

 

Cyberflashing, epilepsy-trolling and 'fake news'...

Parts of the Online Censorship Act have come into force


Link Here 31st January 2024
Full story: Online Safety Act...UK Government legislates to censor social media

Abusers, trolls, and predators online now face a fleet of tough new jailable offences from Wednesday 31 January, as offences for cyberflashing, sending death threats, and epilepsy-trolling are written into the statute book after the Online Safety Act gained Royal Assent.

These new criminal offences will protect people from a wide range of abuse and harm online, including threatening messages, the non-consensual sharing of intimate images known as revenge porn, and sending fake news that aims to cause non-trivial physical or psychological harm.

Dubbed Zach's law, a new offence will also mean that online trolls who send or show flashing images electronically with the intention of causing harm to people with epilepsy will be held accountable for their actions and face prison.

Following the campaigning of Love Island star Georgia Harrison, bitter ex-partners and other abusers who share, or threaten to share, intimate images online or offline without the consent of those depicted will face jail time under new offences from today.

Those found guilty of the base offence of sharing an intimate image could face up to 6 months in prison, or up to 2 years if it is proven the perpetrator also intended to cause distress, alarm or humiliation, or shared the image to obtain sexual gratification.

Cyberflashing on dating apps, AirDrop and other platforms will also result in perpetrators facing up to two years behind bars where it is done to gain sexual gratification, or to cause alarm, distress or humiliation.

Sending death threats or threatening serious harm online will also carry a jail sentence of up to five years under a new threatening communications offence that will completely outlaw appalling threats made online that would be illegal if said in person.

A new false communications offence will bring internet trolls to justice by outlawing the intentional sending of false information that could cause non-trivial psychological or physical harm to users online. This new offence will bolster the government's strong commitment to clamping down on dangerous disinformation and election interference online.

In the wake of sickening content, often targeted at children, that encourages users to self-harm, a new offence will mean the individuals who post content encouraging or assisting serious self-harm could face up to 5 years behind bars.

While much of the Online Safety Act's protections are intended to hold tech companies and social media platforms to account for the content hosted on their sites, these new offences will apply directly to the individuals sending threatening or menacing messages and bring justice directly to them.

Some of the offences that commence from today will be further bolstered too, when the wide-ranging Criminal Justice Bill completes its passage through Parliament.

 

 

Offsite Article: Online Safety Act 2023...


Link Here 4th December 2023
Full story: Online Safety Act...UK Government legislates to censor social media
A summary of the current position of the UK's (anti-)pornographic internet censorship provisions

See article from decoded.legal

 

 

Offsite Article: "You Don't Belong Here!"...


Link Here 11th November 2023
Full story: Online Safety Act...UK Government legislates to censor social media
With 1500 pages outlining a mountain of suffocating red tape in the name of internet regulation, Ofcom delivers a message to small British internet companies

See article from webdevlaw.uk

 

 

Online Censorship Act...

The Online Unsafety Bill gets Royal Assent and so becomes law


Link Here 29th October 2023
Full story: Online Safety Act...UK Government legislates to censor social media
The Online Safety Bill received Royal Assent on 26th October 2023, heralding a new era of internet censorship.

The new UK internet censor, Ofcom, was quick off the mark to outline its timetable for implementing the new censorship regime.

Ofcom has set out its plans for putting the online safety laws into practice, and what it expects from tech firms, now that the Online Safety Act has passed. Ofcom writes:

The Act makes companies that operate a wide range of online services legally responsible for keeping people, especially children, safe online. These companies have new duties to protect UK users by assessing risks of harm, and taking steps to address them. All in-scope services with a significant number of UK users, or targeting the UK market, are covered by the new rules, regardless of where they are based.

While the onus is on companies to decide what safety measures they need given the risks they face, we expect implementation of the Act to ensure people in the UK are safer online by delivering four outcomes:

  • stronger safety governance in online firms;

  • online services designed and operated with safety in mind;

  • choice for users so they can have meaningful control over their online experiences; and

  • transparency regarding the safety measures services use, and the action Ofcom is taking to improve them, in order to build trust.

We are moving quickly to implement the new rules

Ofcom will give guidance and set out codes of practice on how in-scope companies can comply with their duties, in three phases, as set out in the Act.

Phase one: illegal harms duties

We will publish draft codes and guidance on these duties on 9 November 2023, including:

  • analysis of the causes and impacts of online harm, to support services in carrying out their risk assessments;

  • draft guidance on a recommended process for assessing risk;

  • draft codes of practice, setting out what services can do to mitigate the risk of harm; and

  • draft guidelines on Ofcom's approach to enforcement.

We will consult on these documents, and plan to publish a statement on our final decisions in Autumn 2024. The codes of practice will then be submitted to the Secretary of State for Science, Innovation and Technology, and subject to their approval, laid before Parliament.

Phase two: child safety, pornography and the protection of women and girls

Child protection duties will be set out in two parts. First, online pornography services and other interested stakeholders will be able to read and respond to our draft guidance on age assurance from December 2023. This will be relevant to all services in scope of Part 5 of the Online Safety Act.

Secondly, regulated services and other interested stakeholders will be able to read and respond to draft codes of practice relating to protection of children, in Spring 2024.

Alongside this, we expect to consult on:

  • analysis of the causes and impacts of online harm to children; and

  • draft risk assessment guidance focusing on children's harms.

We expect to publish draft guidance on protecting women and girls by Spring 2025, when we will have finalised our codes of practice on protection of children.

Phase three: transparency, user empowerment, and other duties on categorised services

A small proportion of regulated services will be designated Category 1, 2A or 2B services if they meet certain thresholds set out in secondary legislation to be made by Government. Our final stage of implementation focuses on additional requirements that fall only on these categorised services. Those requirements include duties to:

  • produce transparency reports;

  • provide user empowerment tools;

  • operate in line with terms of service;

  • protect certain types of journalistic content; and

  • prevent fraudulent advertising.

We now plan to issue a call for evidence regarding our approach to these duties in early 2024 and a consultation on draft transparency guidance in mid 2024.

Ofcom must produce a register of categorised services. We will advise Government on the thresholds for these categories in early 2024, and Government will then make secondary legislation on categorisation, which we currently expect to happen by summer 2024. Assuming this is achieved, we will:

  • publish the register of categorised services by the end of 2024;

  • publish draft proposals regarding the additional duties on these services in early 2025; and

  • issue transparency notices in mid 2025.




 
