
UK Internet Censorship


2020: Oct-Dec


 

Harming the internet...

The Government outlines its final plans to introduce new and wide ranging internet censorship laws


Link Here 15th December 2020

Digital Secretary Oliver Dowden and Home Secretary Priti Patel have announced the government's final decisions on new internet censorship laws.

  • New rules to be introduced for nearly all tech firms that allow users to post their own content or interact

  • Firms failing to protect people face fines of up to ten per cent of turnover or the blocking of their sites, and the government will reserve the power to hold senior managers liable

  • Popular platforms to be held responsible for tackling both legal and illegal harms

  • All platforms will have a duty of care to protect children using their services

  • Laws will not affect articles and comments sections on news websites, and there will be additional measures to protect free speech

The full government response to the Online Harms White Paper consultation sets out how the proposed legal duty of care on online companies will work in practice and gives them new responsibilities towards their users. The safety of children is at the heart of the measures.

Social media sites, websites, apps and other services which host user-generated content or allow people to talk to others online will need to remove and limit the spread of illegal content such as child sexual abuse, terrorist material and suicide content. The Government is also progressing work with the Law Commission on whether the promotion of self harm should be made illegal.

Tech platforms will need to do far more to protect children from being exposed to harmful content or activity such as grooming, bullying and pornography. This will help make sure future generations enjoy the full benefits of the internet with better protections in place to reduce the risk of harm.

The most popular social media sites, with the largest audiences and high-risk features, will need to go further by setting and enforcing clear terms and conditions which explicitly state how they will handle content which is legal but could cause significant physical or psychological harm to adults. This includes dangerous disinformation and misinformation about coronavirus vaccines, and will help bridge the gap between what companies say they do and what happens in practice.

Ofcom is now confirmed as the regulator with the power to fine companies failing in their duty of care up to £18 million or ten per cent of annual global turnover, whichever is higher. It will have the power to block non-compliant services from being accessed in the UK.
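
As a rough illustration of how that headline penalty works, the maximum fine is simply the larger of the two figures quoted above. The sketch below is illustrative only; the function name and the example turnover are hypothetical, not anything from the announcement.

```python
def max_fine_gbp(annual_global_turnover_gbp: float) -> float:
    """Illustrative only: the higher of a flat £18 million or
    10% of annual global turnover, per the announcement above."""
    return max(18_000_000.0, 0.10 * annual_global_turnover_gbp)

# Example: a platform with £5 billion annual global turnover would face
# a maximum fine of £500 million rather than £18 million.
print(f"£{max_fine_gbp(5_000_000_000):,.0f}")
```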

The legislation includes provisions to impose criminal sanctions on senior managers. The government will not hesitate to bring these powers into force should companies fail to take the new rules seriously - for example, if they do not respond fully, accurately and in a timely manner to information requests from Ofcom. This power would be introduced by Parliament via secondary legislation, and reserving the power to compel compliance follows similar approaches in other sectors such as financial services regulation.

The government plans to bring the laws forward in an Online Safety Bill next year and set the global standard for proportionate yet effective regulation. This will safeguard people's rights online and empower adult users to keep themselves safe while preventing companies arbitrarily removing content. It will defend freedom of expression and the invaluable role of a free press, while driving a new wave of digital growth by building trust in technology businesses.

Scope

The new regulations will apply to any company in the world hosting user-generated content online accessible by people in the UK or enabling them to privately or publicly interact with others online.

It includes social media, video sharing and instant messaging platforms, online forums, dating apps, commercial pornography websites, as well as online marketplaces, peer-to-peer services, consumer cloud storage sites and video games which allow online interaction. Search engines will also be subject to the new regulations.

The legislation will include safeguards for freedom of expression and pluralism online - protecting people's rights to participate in society and engage in robust debate.

Online journalism from news publishers' websites will be exempt, as will reader comments on such sites. Specific measures will be included in the legislation to make sure journalistic content is still protected when it is reshared on social media platforms.

Categorised approach

Companies will have different responsibilities for different categories of content and activity, under an approach focused on the sites, apps and platforms where the risk of harm is greatest.

All companies will need to take appropriate steps to address illegal content and activity such as terrorism and child sexual abuse. They will also be required to assess the likelihood of children accessing their services and, if so, provide additional protections for them. This could be, for example, by using tools that give age assurance to ensure children are not accessing platforms which are not suitable for them.

The government will make clear in the legislation the harmful content and activity that the regulations will cover and Ofcom will set out how companies can fulfil their duty of care in codes of practice.

A small group of companies with the largest online presences and high-risk features, likely to include Facebook, TikTok, Instagram and Twitter, will be in Category 1.

These companies will need to assess the risk of legal content or activity on their services with "a reasonably foreseeable risk of causing significant physical or psychological harm to adults". They will then need to make clear what type of "legal but harmful" content is acceptable on their platforms in their terms and conditions and enforce this transparently and consistently.

All companies will need mechanisms so people can easily report harmful content or activity while also being able to appeal the takedown of content. Category 1 companies will be required to publish transparency reports about the steps they are taking to tackle online harms.

Examples of Category 2 services are platforms which host dating services or pornography and private messaging apps. Less than three per cent of UK businesses will fall within the scope of the legislation and the vast majority of companies will be Category 2 services.

Exemptions

Financial harms will be excluded from this framework, including fraud and the sale of unsafe goods. This will mean the regulations are clear and manageable for businesses, focus action where there will be most impact, and avoid duplicating existing regulation.

Where appropriate, lower-risk services will be exempt from the duty of care to avoid putting disproportionate demands on businesses. This includes exemptions for retailers who only offer product and service reviews and software used internally by businesses. Email services will also be exempt.

Some types of advertising, including organic and influencer adverts that appear on social media platforms, will be in scope. Adverts placed on an in-scope service through a direct contract between an advertiser and an advertising service, such as Facebook or Google Ads, will be exempt because this is covered by existing regulation.

Private communications

The response will set out how the regulations will apply to communication channels and services where users expect a greater degree of privacy - for example online instant messaging services and closed social media groups, which are still in scope.

Companies will need to consider the impact on user privacy and ensure they understand how their systems and processes affect people's privacy. Firms could, for example, be required to make services safer by design by limiting the ability of anonymous adults to contact children.

Given the severity of the threat on these services, the legislation will enable Ofcom to require companies to use technology to monitor, identify and remove tightly defined categories of illegal material relating to child sexual exploitation and abuse. Recognising the potential impact on user privacy, the government will ensure this is only used as a last resort where alternative measures are not working. It will be subject to stringent legal safeguards to protect user rights.
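
The announcement does not say which technology Ofcom could mandate. In practice, platforms commonly match uploads against hash lists of known illegal imagery maintained by trusted bodies such as the Internet Watch Foundation; the sketch below illustrates only that general matching step. The hash list, the function names and the use of exact rather than perceptual hashing are all assumptions for illustration, not anything specified in the government response.

```python
import hashlib

# Hypothetical list of hashes of known illegal images, supplied by a
# trusted body. Real systems usually use a perceptual-hash list
# (PhotoDNA-style) so that re-encoded copies of an image still match.
KNOWN_BAD_HASHES: set[str] = set()

def upload_matches_known_material(upload_bytes: bytes) -> bool:
    """Return True if an uploaded file matches an entry on the hash list.
    A plain SHA-256 comparison is shown purely to illustrate the idea."""
    digest = hashlib.sha256(upload_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES
```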

 

 

Harming the internet...

The Government to unveil plans for its new internet censorship law this week


Link Here 13th December 2020

The Times is reporting that the government will announce plans for its upcoming Online Harms internet censorship law on Tuesday.

Ministers will announce plans for a statutory duty of care, which will be enforced by Ofcom, the broadcasting regulator. Companies that fail to meet the duty could face multimillion-pound fines or be blocked from operating in Britain.

However, the legislation will also include measures to protect freedom of speech after concerns were raised in Downing Street that the powers could prompt social media companies to take posts down unnecessarily.

It also seems that the bill will be titled Online Safety rather than Online Harms.

 

 

Harming the internet...

Ofcom consults about its plans to tool up for its new roles as the UK internet censor


Link Here 11th December 2020
Ofcom has opened a consultation on its plan to get ready for its likely role as the UK internet censor under the Government's Online Harms legislation. Ofcom writes:

We have today published our plan of work for 2021/22. This consultation sets out our goals for the next financial year, and how we plan to achieve them.

We are consulting on this plan of work to encourage discussion with companies, governments and the public.

As part of the Plan of Work publication, we are also holding some virtual events to invite feedback on our proposed plan. These free events are open to everyone, and offer an opportunity to comment and ask questions.

The consultation ends on 5th February 2021.

The key areas referencing internet censorship are:

Preparing to regulate online harms

3.26 The UK Government has given Ofcom new duties as the regulator for UK-established video-sharing platforms (VSPs) through the transposition of the European-wide Audiovisual Media Services Directive. VSPs are a type of online video service where users can upload and share videos with members of the public, such as YouTube and TikTok. Ofcom will not be responsible for regulating all VSPs as our duties only apply to services established in the UK and, as such, we anticipate that a relatively small number of services fall within our jurisdiction. Under the new regulations, which came into force on 1 November 2020, VSPs must have appropriate measures in place to protect children from potentially harmful content and all users from criminal content and incitement to hatred and violence. VSPs will also need to make sure certain advertising standards are met.

3.27 As well as appointing Ofcom as the regulator of UK-established VSPs, the Government has announced that it is minded to appoint Ofcom as the future regulator responsible for protecting users from harmful online content. With this in mind we are undertaking the following work:

  • Video-sharing platforms regulation. We have issued a short guide to the new requirements. On 19 November 2020 we issued draft scope and jurisdiction guidance for consultation to help providers self-assess whether they need to notify to Ofcom as a VSP under the statutory rules from April 2021. We will also consult in early 2021 on further guidance on the risk of harms and appropriate measures, as well as proposals for a co-regulatory relationship with the Advertising Standards Authority (ASA) with regards to VSP advertising. We intend to issue final versions of the guidance in summer 2021.

  • Preparing for the online harms regime. The UK Government has set out that it intends to put in place a regime to keep people safe online. In February 2020 it published an initial response to the 2019 White Paper setting out how it intends to develop the regime, which stated that it was minded to appoint Ofcom as the future regulator of online harms. If confirmed, these proposed new responsibilities would constitute a significant expansion to our remit, and preparing for them would be a major area of focus in 2021/22. We will continue to provide technical advice to the UK Government on its policy development process, and we will engage with Parliament as it considers legislative proposals.

3.29 We will continue work to deepen our understanding of online harms through a range of work:

  • Our Making Sense of Media programme. This programme will continue to provide insights on the needs, behaviours and attitudes of people online. Our other initiatives to research online markets and technologies will further our understanding of how online harms can be mitigated.

  • Stepping up our collaboration with other regulators. As discussed in the Developing strong partnerships section, we will continue our joint work through the Digital Regulators Cooperation Forum and strengthen our collaboration with regulators around the world who are also considering online harms.

  • Understanding VSPs. The introduction of regulation to UK-established VSPs will provide a solid foundation to inform and develop the broader future online harms regulatory framework. This interim regime is more limited in terms of the number of regulated companies and will cover a narrower range of harms compared to the online harms white paper proposals. However, should Ofcom be confirmed as the regulator, through our work on VSPs we will develop on-the-job experience working with newly regulated online services, developing the evidence base of online harm, and building our internal skills and expertise.

 

 

Offsite Article: Streaming Platforms: Age Ratings...


Link Here 8th December 2020
The House of Lords discusses how streaming services use age ratings

See article from hansard.parliament.uk

 

 

Shared video censorship...

House of Lords approves adoption of the EU's internet video sharing censorship laws into post-Brexit UK law


Link Here 29th November 2020
The House of Lords approved a statutory instrument that adopts the EU's Audio Visual Media Services Directive into post-Brexit UK law. This law describes state censorship requirements for internet video sharing platforms.

The law change was debated on 27th November 2020 with the government introducing the law as follows:

Baroness Barran, The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

My Lords, I am pleased to introduce this instrument, laid in both Houses on 15 October, which is being made under the European Union (Withdrawal) Act 2018. These regulations remedy certain failures of retained EU law arising from the withdrawal of the United Kingdom from the EU. This instrument seeks to maintain, but not expand, Ofcom's remit to regulate video-sharing platform services. This intervention is necessary to ensure the law remains operable beyond the end of the transition period.

The EU's audiovisual media services directive, known as the AVMS directive, governs the co-ordination of national legislation on audio-visual media services. The AVMS directive was initially implemented into UK law in 2010, primarily by way of amendments to UK broadcasting legislation. The directive was subsequently revised in 2018. The UK Audiovisual Media Services Regulations 2020, which transposed the revised AVMS directive, were made and laid in Parliament on 30 September. Those regulations came into force on 1 November and introduced, for the first time, rules for video-sharing platform services. The Government have appointed Ofcom as the regulator for these services. The new rules ensure that platforms falling within UK jurisdiction have appropriate systems and processes to protect the public, including minors, from illegal and harmful material.

There were three key requirements placed on video-sharing platforms under the regulations. These were: to take appropriate measures to protect minors under 18 from harmful content, to take appropriate measures to protect the general public from harmful and certain illegal content, and to introduce standards around advertising. I also draw the attention of the House to the report from the Secondary Legislation Scrutiny Committee considering this instrument, and I thank its members for their work.

I will now address the committee's concerns regarding jurisdiction. The AVMS directive sets out technical rules governing when a platform falls within a country's jurisdiction. First, there must be a physical presence, or a group undertaking, of the platform in the country. Where there is a physical presence in more than one country, jurisdiction is decided on the basis of factors such as whether the platform is established in that country, whether the platform's main economic activity is centred in that country, and the hierarchy of group undertakings as set out by the directive.

Under the revised AVMS directive, each EU member state and the UK is responsible for regulating only the video-sharing platforms that fall within its jurisdiction. There will be only one country that has jurisdiction for each platform at any one time. However, if a platform has no physical presence in any country covered by the AVMS directive, then no country will have jurisdiction over it, even if the platform provides services in those countries.

Through this instrument, we are seeking to maintain the same position for Ofcom's remit beyond the end of the transition period. This position allows Ofcom to regulate video-sharing platforms established in the UK and additionally regulate platforms that have a physical presence in the UK but not in any other country covered by the AVMS directive. Although Ofcom's remit will not be extended to include platforms established elsewhere in the EU, we believe UK users will indirectly benefit from the EU's regulation of platforms under the AVMS directive. The regulation under this regime is systems regulation, not content regulation. We therefore expect that as platforms based outside of the UK will set up and invest in systems to comply with the AVMS regulations, it is probable that these same systems will also be introduced for their UK subsidiaries.

In the absence of this instrument, Ofcom would no longer be able to regulate any video-sharing platforms. This would result in an unacceptable regulatory gap and a lack of protection for UK users using these services. Our approach also mitigates the small risk that a video-sharing platform offering services to countries covered by the AVMS directive, but not the UK, would establish itself in the UK in order to circumvent EU law.

While we recognise that most children have a positive experience online, the reality is that the impact of harmful content and activity online can be particularly damaging for children. Over three-quarters of UK adults also express a deep concern about the internet. The UK is one of only three countries to have transposed the revised directive thus far, evidencing our commitment to protecting users online.

These regulations also pave the way for the upcoming online harms regulatory regime. Given that the online harms regulatory framework shares broadly the same objectives as the video-sharing platform regime, it is the Government's intention that the regulation of video-sharing platforms in the UK will be superseded by the online harms legislation, once the latter comes into force. Further details on the plans for online harms regulation will be set out in the full government response to the consultation on the Online Harms White Paper, which is due to be published later this year, with draft legislation ready in early 2021. With that, I beg to move.
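
The jurisdiction rules described in the speech reduce to a fairly simple decision procedure. The sketch below is an editorial illustration of that logic as stated; the field names and the simplifications (it ignores the directive's tie-breaking tests for platforms with a presence in several countries) are assumptions, not wording from the directive or the statutory instrument.

```python
from dataclasses import dataclass

@dataclass
class VideoSharingPlatform:
    # Illustrative fields only; real determinations turn on the
    # directive's detailed tests for establishment and group structure.
    established_in_uk: bool
    physical_presence_in_uk: bool
    physical_presence_elsewhere_in_avmsd_area: bool  # any EU member state

def within_ofcom_remit(platform: VideoSharingPlatform) -> bool:
    # Ofcom regulates platforms established in the UK...
    if platform.established_in_uk:
        return True
    # ...and platforms with a physical presence in the UK but in no
    # other country covered by the AVMS directive.
    if (platform.physical_presence_in_uk
            and not platform.physical_presence_elsewhere_in_avmsd_area):
        return True
    # Otherwise the platform is either another country's responsibility
    # or, with no physical presence anywhere in the AVMSD area, outside
    # every regulator's jurisdiction even if it serves users there.
    return False
```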

 

 

A bitter pill...

The government consults on banning all advertising for food that tastes good enforced by onerous new censorship and red tape requirements that will strangle British companies whilst advantaging US corporate giants


Link Here 12th November 2020
The UK Government writes:

We want your views on our proposal for a total online advertising restriction for HFSS (high in fat, salt or sugar) products to reduce the amount of HFSS advertising children are exposed to online.

This consultation closes at

Consultation description

We're asking questions on:

  • what types of advertising will be restricted

  • who will be liable for compliance

  • enforcement of the restrictions

In 2019 the government consulted on restricting advertising of HFSS products on TV and online. It asked for views on whether to extend current advertising restrictions on broadcast TV and online media, including consulting on watershed restrictions. In July 2020 the government confirmed its intention to introduce a 9pm watershed on TV.

This new consultation goes further and looks at how a total HFSS advertising restriction could be implemented online. It should be read with the 2019 consultation.

 

 

Updated: Likes vs 'should' likes...

TikTok to explain their 'algorithms' to a UK parliamentary committee


Link Here10th November 2020
Full story: TikTok Censorship...Chinese ownership adds to the usual social media censorship
The up-and-coming social media app TikTok will allow UK politicians to review its algorithm, after MPs challenged the firm over censorship concerns and ties to the Chinese government.

TikTok's UK director of government relations and public policy, Elizabeth Kanter, said members of the Business Select Committee were welcome to visit its transparency centre, to review its algorithm and the way it moderates content.

Of course the very idea of algorithms has evolved into some sort of assumption that they are a sinister means of corrupting the weak minds of social media users. In reality they are probably closer to something simple like:

Give 'em more of what they like and don't bother wasting their time with 'worthy' content that they 'should' like, because they'll only skim over it anyway.
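
In code, that caricature amounts to little more than engagement-weighted ranking. The sketch below is a toy illustration of that general idea, not TikTok's actual recommendation system; the data shapes and field names are invented for the example.

```python
from collections import defaultdict

def rank_feed(candidates, watch_history):
    """Score each candidate video by how much time the user has spent on
    its topics before, then show the highest scorers first.
    A toy engagement-based ranker, invented purely for illustration."""
    topic_weight = defaultdict(float)
    for video in watch_history:
        for topic in video["topics"]:
            topic_weight[topic] += video["watch_seconds"]

    def score(video):
        return sum(topic_weight[t] for t in video["topics"])

    return sorted(candidates, key=score, reverse=True)

# Usage: history items look like {"topics": ["cats"], "watch_seconds": 30.0};
# candidates only need a "topics" list.
```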

Kanter claimed that the app no longer moderates content based on political sensitivities or affiliation. She said:

We do not censor content. I would encourage you to open the app and search for Tiananmen Square, search for Uyghur, search for Tibet -- you will find that content on TikTok.

Kanter reiterated the company's claim that it would not share any data with its Chinese parent company Bytedance or with the Chinese authorities.

TikTok has also announced that it is upping its censorship of political content. In a blog post, the app said it was expanding its policy to take into account coded language and symbols used to spread hateful ideologies:

TikTok already removes content related to neo-Nazism and white supremacy, but will now also ban similar ideologies such as white nationalism, white genocide theory, Identitarianism and male supremacy.

Update: And on the subject of the repression of Uyghur Muslims

10th November 2020. See article from digitalmusicnews.com

A TikTok executive admitted to UK lawmakers that the platform censors anti-Chinese content. The statement was made during a hearing held by the UK's Business, Energy, and Industrial Strategy Committee. Elizabeth Kanter, TikTok's Director of Government Relations and Public Policy, made the damning comments.

The UK hearing was held to determine whether businesses in the UK are exploiting forced labour in the Xinjiang camps. Kanter initially told the committee that TikTok does not censor content, but when pressed about previous incidents of censorship on the platform she admitted something different: those videos were removed in the early days of TikTok, when content was governed by different guidelines. She said:

The people who wrote the content guidelines took a decision to not allow conflict on the platform, and so there were some incidents where content was not allowed on the platform, specifically with regard to the Uyghur situation.

Kanter later backtracked on her comments, claiming that she had misspoken.

 

 

Commented: A disgraceful disregard of free speech...

Scottish MSPs point out that a new blasphemy bill will apply to speech in people's private homes


Link Here 2nd November 2020
Full story: Scotland stifles free speech...Hate Crime & Public Order Act
The disgraceful new Scottish hate crime and blasphemy bill will criminalise free speech in people's own homes, MSPs have been told.

MSPs questioned Scottish 'Justice' Secretary Humza Yousaf over the censorship legislation during an evidence session before the Holyrood Justice Committee. The proposed legislation will introduce a stirring-up of hatred offence covering characteristics including religion and sexual orientation.

However critics note that the Hate Crime and Public Order Bill, which centres around plans for a new offence of stirring up hatred, will stifle freedom of expression.

BBC Scotland, Catholic bishops, the Humanist Society of Scotland, and the Scottish Police Federation are amongst those to have raised concerns, along with Mr Bean star Rowan Atkinson and writer Val McDermid.

Because of this, Yousaf was forced to moderate the legislation and marginally change the controversial stirring-up offences section, which has been condemned by opponents. Stirring-up offences will now be limited to cases of intent relating to age, disability, religion, sexual orientation, transgender identity and variations in sex characteristics, and prosecutions could therefore only be brought on that basis.

Liam Kerr MSP, Scottish Conservative justice spokesman, said the Hate Crime Bill was a mess when the SNP first brought it to parliament and still contains serious issues that need to be fixed. He added:

Tinkering around the margins will not fix the most controversial bill in Scottish Parliament history.

This latest admission from the justice secretary confirms what so many respondents to the consultation have warned: that as drafted, this Bill means free speech could be criminalised within the home with friends you've invited over for a dinner party, and that Mr Yousaf is perfectly comfortable with that.

The SNP need to be clear with the Scottish public about exactly what they intend this Hate Crime Bill to do.

They can't keep trying to force through dangerous attacks on freedom of speech.

Update: Stronger free speech protection needed over hate crime bill urges the National Secular Society

31st October 2020. See article from secularism.org.uk

The National Secular Society has urged the Scottish government to ensure freedom of expression is adequately protected after ministers hinted at possible concessions in a bill on hate crime.

NSS chief executive Stephen Evans met with government representatives on Thursday and urged them to reconsider plans to criminalise stirring up hatred on various grounds, including religion.

Mr Evans warned that the vague and highly subjective wording in the bill risked chilling free speech and sending the message that the law was there to protect people from being offended.

Part of the bill, which is currently making its way through the Scottish parliament, would criminalise behaviour deemed threatening or abusive and intended to stir up hatred.

As part of its case the NSS argued that protections for free speech in the relevant section of the bill should be at least as strong as their equivalents in England and Wales.

Offsite Comment: Scotland is leading the way to totalitarianism

2nd November 2020. See article from unherd.com by Rod Dreher

 A bill brought forth by the SNP aims to police what citizens say at home

 

 

Harming hopes of a trade deal...

The Telegraph outlines the latest state of play in the government's upcoming internet censorship bill


Link Here 26th October 2020
The Telegraph has reported on the current government thinking about its new internet censorship bill, which it refers to as the Online Harms Bill.

Another update will be published after the US elections, suggesting that the government's plans for internet censorship are bound up in negotiations for a US trade deal and that the amount of scope for censorship will depend on whether Donald Trump or Joe Biden is in charge.

The Online Harms Bill is set to require websites and apps with user interaction to agree legally-binding terms and conditions that lock them into a rather vaguely defined 'duty of care'.

Culture Secretary Oliver Dowden -- who has presented the plan to Number 10 with Home Secretary Priti Patel -- has pledged that the firms' codes to tackle content such as self-harm and eating disorders will have to be meaningful and vetted by the new internet censor Ofcom to ensure they are proper and effective.

The current proposals are thought to stop short of criminal sanctions against the firms for breaches over legal but harmful content like self-harm videos, but named executives will be held accountable for companies' policies and face fines and disqualification for breaches. Criminal sanctions will be reserved for illegal online material such as child abuse and terrorism.

The proposals, set out as a response to the consultation on last year's white paper, are expected to be published after the US elections, once agreed by the Prime Minister.

The Government is expected to draft a tight duty of care bill early next year that will lay down the sanctions and investigative powers of the new regulator but leave the scope of the duty of care on legal harms to secondary legislation to be voted on by MPs.

 

 

Shared burdens...

Ofcom publishes its censorship guidelines to be applied to UK based video sharing platforms


Link Here 21st October 2020
Ofcom has published its burdensome censorship rules that will apply to video sharing platforms that are stupid enough to be based in the UK. In particular, the rules are quite vague about age verification requirements for the two adult video sharing sites that remain in the UK. Maybe Ofcom is a bit shy about requiring onerous and unviable red tape of British companies trying to compete with large numbers of foreign companies that operate with the massive commercial advantage of not having age verification.

Ofcom does, however, note that these censorship rules are a stopgap until a wider-scoped 'online harms' censorship regime starts up in the next couple of years.

Ofcom writes:

Video-sharing platforms (VSPs) are a type of online video service which allows users to upload and share videos with members of the public.

From 1 November 2020, UK-established VSPs will be required to comply with new rules around protecting users from harmful content.

The main purpose of the new regulatory regime is to protect consumers who engage with VSPs from the risk of viewing harmful content. Providers must have appropriate measures in place to protect minors from content which might impair their physical, mental or moral development; and to protect the general public from criminal content and material likely to incite violence or hatred.

Ofcom has published a short guide outlining the new statutory requirements on providers. The guide is intended to assist platforms to determine whether they fall in scope of the new regime and to understand what providers need to do to ensure their services are compliant.

The guide also explains how Ofcom expects to approach its new duties in the period leading up to the publication of further guidance on the risk of harms and appropriate measures, which we will consult on in early 2021.

Ofcom will also be consulting on guidance on scope and jurisdiction later in 2020. VSP providers will be required to notify their services to Ofcom from 6 April 2021 and we expect to have the final guidance in place ahead of this time.


