
Online Harms White Paper


UK Government seeks to censor social media


 

Comment: Vague rules defined so that the censors can then arbitrarily dictate what we are allowed to say...

Ofcom's vague censorship rules will encroach on free speech says the National Secular Society


Link Here 9th May 2021

Proposed rules which would protect users from material likely to incite hatred on video sharing platforms (VSPs) risk unduly encroaching on freedom of expression, the National Secular Society has said.

The internet censor Ofcom is currently consulting on draft guidance for VSP providers, which is designed to protect consumers from viewing harmful content.

The NSS has responded to the consultation to express concern that Ofcom has not ensured adequate protection for freedom of expression.

The guidance says providers must ensure appropriate measures are in place to protect children from material that might impair their development; and to protect the public from criminal content and material likely to incite hatred.

The NSS said it was reasonable for Ofcom to require VSPs to moderate restricted material, but warned that rules around material likely to incite hatred were too vague.

The guidance requires moderation of material likely to incite hatred on various grounds, including religion or belief and political or any other opinion. It only defines incitement to hatred by saying it should be understood as having its usual meaning in everyday language.

The NSS said hatred was a largely subjective term and different individuals would have different interpretations of its meaning. The society raised concerns that the guidance would enable religious censorship, and highlighted examples of religious groups attempting to use claims of inciting hatred to censor critical or satirical material.

The NSS also said platforms would be incentivised to err on the side of censorship. The guidance includes significant penalties for permitting content that violates the rules, while there are no equivalent penalties for failing to uphold freedom of expression.

The NSS also said the guidance could unreasonably curb religious groups' freedom of expression.

The society said the guidance should include clearer instruction on VSPs' duty to have due regard for freedom of speech and freedom of religion or belief. It added that it should include more comprehensive explanations about what does not constitute material that is considered likely to incite hatred with regard to religion or belief.

 

 

Offsite Article: In an age where one can share a year's worth of porn on a single memory stick...


Link Here 6th May 2021
German academic publishes a survey confirming that 16 and 17 year olds are keen porn viewers and suggests that they will surely find ways to work around age verification

See article from onlinelibrary.wiley.com

 

 

Offsite Article: Thought crimes against political correctness vs real crimes against real people...


Link Here 7th April 2021
Government can't tackle online harm without cracking down on online scams by Jeff Smith MP

See article from politicshome.com

 

 

No comments...

Government notes that porn websites without user comments or uploads will not be within the censorship regime of the upcoming Online Safety Bill


Link Here 27th March 2021
Written Question, answered on 24 March 2021

Baroness Grender, Liberal Democrat life peer

To ask Her Majesty's Government which commercial pornography companies will be in scope of the Online Safety Bill; and whether commercial pornography websites which

  1. do not host user-generated content, or

  2. allow private user communication, will also be in scope.

Baroness Barran, Conservative

The government is committed to ensuring children are protected from accessing online pornography through the new online safety framework. Where pornography sites host user-generated content or facilitate online user interaction such as video and image sharing, commenting and live streaming, they will be subject to the new duty of care. Commercial pornography sites which allow private user to user communication will be in scope. Where commercial pornography sites do not have user-generated functionality they will not be in scope. The online safety regime will capture both the most visited pornography sites and pornography on social media, therefore covering the majority of sites where children are most likely to be exposed to pornography.

We expect companies to use age assurance or age verification technologies to prevent children from accessing services which pose the highest risk of harm to children, such as online pornography. We are working closely with stakeholders across industry to establish the right conditions for the market to deliver age assurance and age verification technical solutions ahead of the legislative requirements coming into force.

 

 

MPs identified as totally uncaring for the safety of internet users...

MPs who don't like being insulted on Twitter line up to call for all users to hand over identifying personal details to the likes of Google, Facebook and the Chinese government


Link Here 24th March 2021
Online anonymity was debated in the House of Commons on Wednesday 13 January 2021.

The long debate was little more than a list of complaints from MPs aggrieved at aggressive comments on social media, often directed at the MPs themselves.

As always seems to be the case with parliamentary debate, it turned into a long series of calls that 'something must be done', with hardly any thought about the likely and harmful consequences of what they are calling for.

As an example, here is part of the complaint from the debate's opener, Siobhan Baillie:

The new legislative framework for tech companies will create a duty of care to their users. The legislation will require companies to prevent the proliferation of illegal content and activity online, and ensure that children who use their services are not exposed to harmful content. As it stands, the tech companies do not know who millions of their users are, so they do not know who their harmful operators are, either. By failing to deal with anonymity properly, any regulator or police force, or even the tech companies themselves, will still need to take extensive steps to uncover the person behind the account first, before they can tackle the issue or protect a user.

The Law Commission acknowledged that anonymity often facilitates and encourages abusive behaviours. It said that combined with an online disinhibition effect, abusive behaviours, such as pile-on harassment, are much easier to engage in on a practical level. The Online Harms White Paper focuses on regulation of platforms and the Law Commission's work addresses the criminal law provisions that apply for individuals. It is imperative, in my view, that the Law Commission's report and proposals are fully debated prior to the online harms Bill passing through Parliament. They should go hand in hand.

Standing in Parliament, I must mention that online abuse is putting people off going into public service and speaking up generally. One reason I became interested in this subject was the awful abuse I received for daring to have a baby and be an MP. Attacking somebody for being a mum or suggesting that a mum cannot do this job is misogynistic and, quite frankly, ridiculous, but I would be lying if I said that I did not find some of the comments stressful and upsetting, particularly given I had just had a baby.

Is there a greater impediment to freedom of expression than a woman being called a whore online or being told that she will be raped for expressing a view? It happens. It happens frequently and the authors are often anonymous. Fantastic groups like 50:50 Parliament, the Centenary Action Group, More United and Compassion in Politics are tackling this head on to avoid men and women being put off running for office. One of the six online harm asks from Compassion in Politics is to significantly reduce the prevalence and influence of anonymous accounts online.

The Open Rights Group said more about consequences in a short email than the MPs said in an hour of debate:

Mandatory ID verification would open a Pandora's Box of unintended consequences. A huge burden would be placed on site administrators big and small to own privatised databases of personally identifiable data. Large social media platforms would gain ever more advantage over small businesses, open source projects and startups that lack the resources to comply.

Requirements for formal documentation, such as a bank account, to verify ID would disenfranchise those on low incomes, the unbanked, the homeless, and people experiencing other forms of social exclusion. Meanwhile, the fate of countless accounts and astronomical amounts of legal content would be thrown into jeopardy overnight.

 

 

Offsite Article: Speech should be free but not of consequences...


Link Here 25th February 2021
Rather than genuinely tackling the thornier issues, we're seeing calls for more regulations online as a quick fix. By Ruth Smeeth

See article from indexoncensorship.org

 

 

And through the square window...

Floella Benjamin attempts to resuscitate internet porn age verification in a Domestic Abuse Bill


Link Here 11th February 2021
Campaigners for the revival of the deeply flawed and one-sided age verification for porn scheme have been continuing their efforts ever since it was abandoned by the Government in October 2019.

The Government was asked about the possibility of restoring it in January 2021 in the House of Commons. Caroline Dinenage responded for the government:

The Government announced in October 2019 that it will not commence the age verification provisions of Part 3 of the Digital Economy Act 2017 and instead deliver these protections through our wider online harms regulatory proposals.

Under our online harms proposals, we expect companies to use age assurance or age verification technologies to prevent children from accessing services which pose the highest risk of harm to children, such as online pornography. The online harms regime will capture both the most visited pornography sites and pornography on social media, therefore covering the vast majority of sites where children are most likely to be exposed to pornography. Taken together we expect this to bring into scope more online pornography currently accessible to children than would have been covered by the narrower scope of the Digital Economy Act.

We would encourage companies to take steps ahead of the legislation to protect children from harmful and age inappropriate content online, including online pornography. We are working closely with stakeholders across industry to establish the right conditions for the market to deliver age assurance and age verification technical solutions ahead of the legislative requirements coming into force.

In addition, Regulations transposing the revised Audiovisual Media Services Directive came into force on 1 November 2020 which require UK-established video sharing platforms to take appropriate measures to protect minors from harmful content. The Regulations require that the most harmful content is subject to the strongest protections, such as age assurance or more technical measures. Ofcom, as the regulatory authority, may take robust enforcement action against video sharing platforms which do not adopt appropriate measures.

Now, during the passage of the Domestic Abuse Bill in the House of Lords, Floella Benjamin attempted to revive the age verification requirement by proposing the following amendment:

Insert the following new Clause --

Impact of online pornography on domestic abuse

(1) Within three months of the day on which this Act is passed, the Secretary of State must commission a person appointed by the Secretary of State to investigate the impact of access to online pornography by children on domestic abuse.

(2) Within three months of their appointment, the appointed person must publish a report on the investigation which may include recommendations for the Secretary of State.

(3) As part of the investigation, the appointed person must consider the extent to which the implementation of Part 3 of the Digital Economy Act 2017 (online pornography) would prevent domestic abuse, and may make recommendations to the Secretary of State accordingly.

(4) Within three months of receiving the report, the Secretary of State must publish a response to the recommendations of the appointed person.

(5) If the appointed person recommends that Part 3 of the Digital Economy Act 2017 should be commenced, the Secretary of State must appoint a day for the coming into force of that Part under section 118(6) of the Act within the timeframe recommended by the appointed person.

Member's explanatory statement

This amendment would require an investigation into any link between online pornography and domestic abuse with a view to implementing recommendations to bring into effect the age verification regime in the Digital Economy Act 2017 as a means of preventing domestic abuse.

Floella Benjamin made a long speech supporting the censorship measure and was supported by a number of peers. Of course they all argued only from the 'think of the children' side of the argument and not one of them mentioned trashed adult businesses and the risk to porn viewers of being outed, scammed, blackmailed etc.

See Floella Benjamin's speech from hansard.parliament.uk

 

 

Harming the internet...

The Government outlines its final plans to introduce new and wide ranging internet censorship laws


Link Here 15th December 2020

Digital Secretary Oliver Dowden and Home Secretary Priti Patel have announced the government's final decisions on new internet censorship laws.

  • New rules to be introduced for nearly all tech firms that allow users to post their own content or interact

  • Firms failing to protect people face fines of up to ten per cent of turnover or the blocking of their sites and the government will reserve the power for senior managers to be held liable

  • Popular platforms to be held responsible for tackling both legal and illegal harms

  • All platforms will have a duty of care to protect children using their services

  • Laws will not affect articles and comments sections on news websites, and there will be additional measures to protect free speech

The full government response to the Online Harms White Paper consultation sets out how the proposed legal duty of care on online companies will work in practice and gives them new responsibilities towards their users. The safety of children is at the heart of the measures.

Social media sites, websites, apps and other services which host user-generated content or allow people to talk to others online will need to remove and limit the spread of illegal content such as child sexual abuse, terrorist material and suicide content. The Government is also progressing work with the Law Commission on whether the promotion of self harm should be made illegal.

Tech platforms will need to do far more to protect children from being exposed to harmful content or activity such as grooming, bullying and pornography. This will help make sure future generations enjoy the full benefits of the internet with better protections in place to reduce the risk of harm.

The most popular social media sites, with the largest audiences and high-risk features, will need to go further by setting and enforcing clear terms and conditions which explicitly state how they will handle content which is legal but could cause significant physical or psychological harm to adults. This includes dangerous disinformation and misinformation about coronavirus vaccines, and will help bridge the gap between what companies say they do and what happens in practice.

Ofcom is now confirmed as the regulator with the power to fine companies failing in their duty of care up to £18 million or ten per cent of annual global turnover, whichever is higher. It will have the power to block non-compliant services from being accessed in the UK.

The legislation includes provisions to impose criminal sanctions on senior managers. The government will not hesitate to bring these powers into force should companies fail to take the new rules seriously - for example, if they do not respond fully, accurately and in a timely manner to information requests from Ofcom. This power would be introduced by Parliament via secondary legislation, and reserving the power to compel compliance follows similar approaches in other sectors such as financial services regulation.

The government plans to bring the laws forward in an Online Safety Bill next year and set the global standard for proportionate yet effective regulation. This will safeguard people's rights online and empower adult users to keep themselves safe while preventing companies arbitrarily removing content. It will defend freedom of expression and the invaluable role of a free press, while driving a new wave of digital growth by building trust in technology businesses.

Scope

The new regulations will apply to any company in the world hosting user-generated content online accessible by people in the UK or enabling them to privately or publicly interact with others online.

It includes social media, video sharing and instant messaging platforms, online forums, dating apps, commercial pornography websites, as well as online marketplaces, peer-to-peer services, consumer cloud storage sites and video games which allow online interaction. Search engines will also be subject to the new regulations.

The legislation will include safeguards for freedom of expression and pluralism online - protecting people's rights to participate in society and engage in robust debate.

Online journalism from news publishers' websites will be exempt, as will reader comments on such sites. Specific measures will be included in the legislation to make sure journalistic content is still protected when it is reshared on social media platforms.

Categorised approach

Companies will have different responsibilities for different categories of content and activity, under an approach focused on the sites, apps and platforms where the risk of harm is greatest.

All companies will need to take appropriate steps to address illegal content and activity such as terrorism and child sexual abuse. They will also be required to assess the likelihood of children accessing their services and, if so, provide additional protections for them. This could be, for example, by using tools that give age assurance to ensure children are not accessing platforms which are not suitable for them.

The government will make clear in the legislation the harmful content and activity that the regulations will cover and Ofcom will set out how companies can fulfil their duty of care in codes of practice.

A small group of companies with the largest online presences and high-risk features, likely to include Facebook, TikTok, Instagram and Twitter, will be in Category 1.

These companies will need to assess the risk of legal content or activity on their services with "a reasonably foreseeable risk of causing significant physical or psychological harm to adults". They will then need to make clear what type of "legal but harmful" content is acceptable on their platforms in their terms and conditions and enforce this transparently and consistently.

All companies will need mechanisms so people can easily report harmful content or activity while also being able to appeal the takedown of content. Category 1 companies will be required to publish transparency reports about the steps they are taking to tackle online harms.

Examples of Category 2 services are platforms which host dating services or pornography and private messaging apps. Less than three per cent of UK businesses will fall within the scope of the legislation and the vast majority of companies will be Category 2 services.

Exemptions

Financial harms will be excluded from this framework, including fraud and the sale of unsafe goods. This will mean the regulations are clear and manageable for businesses, focus action where there will be most impact, and avoid duplicating existing regulation.

Where appropriate, lower-risk services will be exempt from the duty of care to avoid putting disproportionate demands on businesses. This includes exemptions for retailers who only offer product and service reviews and software used internally by businesses. Email services will also be exempt.

Some types of advertising, including organic and influencer adverts that appear on social media platforms, will be in scope. Adverts placed on an in-scope service through a direct contract between an advertiser and an advertising service, such as Facebook or Google Ads, will be exempt because this is covered by existing regulation.

Private communications

The response will set out how the regulations will apply to communication channels and services where users expect a greater degree of privacy - for example online instant messaging services and closed social media groups which are still in scope.

Companies will need to consider the impact on user privacy and ensure that they understand how their systems and processes affect people's privacy. Firms could, for example, be required to make services safer by design by limiting the ability of anonymous adults to contact children.

Given the severity of the threat on these services, the legislation will enable Ofcom to require companies to use technology to monitor, identify and remove tightly defined categories of illegal material relating to child sexual exploitation and abuse. Recognising the potential impact on user privacy, the government will ensure this is only used as a last resort where alternative measures are not working. It will be subject to stringent legal safeguards to protect user rights.

 

 

Harming the internet...

The Government to unveil plans for its new internet censorship law this week


Link Here 13th December 2020

The Times is reporting that the government will announce plans for its upcoming Online Harms internet censorship law on Tuesday.

Ministers will announce plans for a statutory duty of care, which will be enforced by Ofcom, the broadcasting regulator. Companies that fail to meet the duty could face multimillion-pound fines or be blocked from operating in Britain.

However, the legislation will also include measures to protect freedom of speech after concerns were raised in Downing Street that the powers could prompt social media companies to take posts down unnecessarily.

It also seems that the bill will be titled Online Safety rather than Online Harms.

 

 

Harming the internet...

Ofcom consults about its plans to tool up for its new roles as the UK internet censor


Link Here 11th December 2020
Ofcom has opened a consultation on its plan to get ready for its likely role as the UK internet censor under the Government's Online Harms legislation. Ofcom writes:

We have today published our plan of work for 2021/22. This consultation sets out our goals for the next financial year, and how we plan to achieve them.

We are consulting on this plan of work to encourage discussion with companies, governments and the public.

As part of the Plan of Work publication, we are also holding some virtual events to invite feedback on our proposed plan. These free events are open to everyone, and offer an opportunity to comment and ask questions.

The consultation ends on 5th February 2021.

The key areas referencing internet censorship are:

Preparing to regulate online harms

3.26 The UK Government has given Ofcom new duties as the regulator for UK-established video-sharing platforms (VSPs) through the transposition of the European-wide Audiovisual Media Services Directive. VSPs are a type of online video service where users can upload and share videos with members of the public, such as YouTube and TikTok. Ofcom will not be responsible for regulating all VSPs as our duties only apply to services established in the UK and as such, we anticipate that a relatively small number of services fall within our jurisdiction. Under the new regulations, which came into force on 1 November 2020, VSPs must have appropriate measures in place to protect children from potentially harmful content and all users from criminal content and incitement to hatred and violence. VSPs will also need to make sure certain advertising standards are met.

3.27 As well as appointing Ofcom as the regulator of UK-established VSPs, the Government has announced that it is minded to appoint Ofcom as the future regulator responsible for protecting users from harmful online content. With this in mind we are undertaking the following work:

  • Video-sharing platforms regulation. We have issued a short guide to the new requirements. On 19 November 2020 we issued draft scope and jurisdiction guidance for consultation to help providers self-assess whether they need to notify to Ofcom as a VSP under the statutory rules from April 2021. We will also consult in early 2021 on further guidance on the risk of harms and appropriate measures as well as proposals for a co-regulatory relationship with the Advertising Standards Authority (ASA) with regards to VSP advertising. We intend to issue final versions of the guidance in summer 2021.

  • Preparing for the online harms regime. The UK Government has set out that it intends to put in place a regime to keep people safe online. In February 2020 it published an initial response to the 2019 White Paper setting out how it intends to develop the regime, which stated that it was minded to appoint Ofcom as the future regulator of online harms. If confirmed, these proposed new responsibilities would constitute a significant expansion to our remit, and preparing for them would be a major area of focus in 2021/22. We will continue to provide technical advice to the UK Government on its policy development process, and we will engage with Parliament as it considers legislative proposals.

3.29 We will continue work to deepen our understanding of online harms through a range of work:

  • Our Making Sense of Media programme. This programme will continue to provide insights on the needs, behaviours and attitudes of people online. Our other initiatives to research online markets and technologies will further our understanding of how online harms can be mitigated.

  • Stepping up our collaboration with other regulators. As discussed in the Developing strong partnerships section, we will continue our joint work through the Digital Regulators Cooperation Forum and strengthen our collaboration with regulators around the world who are also considering online harms.

  • Understanding VSPs. The introduction of regulation to UK-established VSPs will provide a solid foundation to inform and develop the broader future online harms regulatory framework. This interim regime is more limited in terms of the number of regulated companies and will cover a narrower range of harms compared to the online harms white paper proposals. However, should Ofcom be confirmed as the regulator, through our work on VSPs we will develop on-the-job experience working with newly regulated online services, developing the evidence base of online harm, and building our internal skills and expertise.

 

 

Shared video censorship...

House of Lords approves adoption of the EU's internet video sharing censorship laws into post Brexit UK law


Link Here 29th November 2020
The House of Lords approved a statutory instrument that adopts the EU's Audio Visual Media Services Directive into post-Brexit UK law. This law describes state censorship requirements for internet video sharing platforms.

The law change was debated on 27th November 2020 with the government introducing the law as follows:

Baroness Barran, The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

My Lords, I am pleased to introduce this instrument, laid in both Houses on 15 October, which is being made under the European Union (Withdrawal) Act 2018. These regulations remedy certain failures of retained EU law arising from the withdrawal of the United Kingdom from the EU. This instrument seeks to maintain, but not expand, Ofcom's remit to regulate video-sharing platform services. This intervention is necessary to ensure the law remains operable beyond the end of the transition period.

The EU's audiovisual media services directive, known as the AVMS directive, governs the co-ordination of national legislation on audio-visual media services. The AVMS directive was initially implemented into UK law in 2010, primarily by way of amendments to UK broadcasting legislation. The directive was subsequently revised in 2018. The UK Audiovisual Media Services Regulations 2020, which transposed the revised AVMS directive, were made and laid in Parliament on 30 September. Those regulations came into force on 1 November and introduced, for the first time, rules for video-sharing platform services. The Government have appointed Ofcom as the regulator for these services. The new rules ensure that platforms falling within UK jurisdiction have appropriate systems and processes to protect the public, including minors, from illegal and harmful material.

There were three key requirements placed on video-sharing platforms under the regulations. These were: to take appropriate measures to protect minors under 18 from harmful content, to take appropriate measures to protect the general public from harmful and certain illegal content, and to introduce standards around advertising. I also draw the attention of the House to the report from the Secondary Legislation Scrutiny Committee considering this instrument, and I thank its members for their work.

I will now address the committee's concerns regarding jurisdiction. The AVMS directive sets out technical rules governing when a platform falls within a country's jurisdiction. First, there must be a physical presence, or a group undertaking, of the platform in the country. Where there is a physical presence in more than one country, jurisdiction is decided on the basis of factors such as whether the platform is established in that country, whether the platform's main economic activity is centred in that country, and the hierarchy of group undertakings as set out by the directive.

Under the revised AVMS directive, each EU member state and the UK is responsible for regulating only the video-sharing platforms that fall within its jurisdiction. There will be only one country that has jurisdiction for each platform at any one time. However, if a platform has no physical presence in any country covered by the AVMS directive, then no country will have jurisdiction over it, even if the platform provides services in those countries.

Through this instrument, we are seeking to maintain the same position for Ofcom's remit beyond the end of the transition period. This position allows Ofcom to regulate video-sharing platforms established in the UK and additionally regulate platforms that have a physical presence in the UK but not in any other country covered by the AVMS directive. Although Ofcom's remit will not be extended to include platforms established elsewhere in the EU, we believe UK users will indirectly benefit from the EU's regulation of platforms under the AVMS directive. The regulation under this regime is systems regulation, not content regulation. We therefore expect that as platforms based outside of the UK will set up and invest in systems to comply with the AVMS regulations, it is probable that these same systems will also be introduced for their UK subsidiaries.

In the absence of this instrument, Ofcom would no longer be able to regulate any video-sharing platforms. This would result in an unacceptable regulatory gap and a lack of protection for UK users using these services. Our approach also mitigates the small risk that a video-sharing platform offering services to countries covered by the AVMS directive, but not the UK, would establish itself in the UK in order to circumvent EU law.

While we recognise that most children have a positive experience online, the reality is that the impact of harmful content and activity online can be particularly damaging for children. Over three-quarters of UK adults also express a deep concern about the internet. The UK is one of only three countries to have transposed the revised directive thus far, evidencing our commitment to protecting users online.

These regulations also pave the way for the upcoming online harms regulatory regime. Given that the online harms regulatory framework shares broadly the same objectives as the video-sharing platform regime, it is the Government's intention that the regulation of video-sharing platforms in the UK will be superseded by the online harms legislation, once the latter comes into force. Further details on the plans for online harms regulation will be set out in the full government response to the consultation on the Online Harms White Paper, which is due to be published later this year, with draft legislation ready in early 2021. With that, I beg to move.

 

 

Harming hopes of a trade deal...

The Telegraph outlines the latest state of play in the government's upcoming internet censorship bill


Link Here 26th October 2020
The Telegraph has reported on current government thinking about its new internet censorship bill, which it refers to as the Online Harms Bill.

Another update will be published after the US elections, suggesting that the government's plans for internet censorship are bound up in negotiations for a US trade deal, and that the amount of scope for censorship will depend on whether Donald Trump or Joe Biden is in charge.

The Online Harms Bill is set to require websites and apps with user interaction to agree legally-binding terms and conditions that lock them into a rather vaguely defined 'duty of care'.

Culture Secretary Oliver Dowden -- who has presented the plan to Number 10 with Home Secretary Priti Patel -- has pledged that the firms' codes to tackle content such as self-harm and eating disorders will have to be meaningful and will be vetted by the new internet censor Ofcom to ensure they are proper and effective.

The current proposals are thought to stop short of criminal sanctions against the firms for breaches over legal but harmful content like self-harm videos, but named executives will be held accountable for companies' policies and face fines and disqualification for breaches. Criminal sanctions will be reserved for illegal online material such as child abuse and terrorism.

The proposals, set out as a response to the consultation on last year's white paper, are expected to be published after the US elections, once agreed by the Prime Minister.

The Government is expected to draft a tight duty of care bill early next year that will lay down the sanctions and investigative powers of the new regulator but leave the scope of the duty of care on legal harms to secondary legislation to be voted on by MPs.

 

 

Shared burdens...

Ofcom publishes its censorship guidelines to be applied to UK based video sharing platforms


Link Here 21st October 2020
Ofcom has published its burdensome censorship rules that will apply to video sharing platforms that are stupid enough to be based in the UK. In particular the rules are quite vague about age verification requirements for the two adult video sharing sites that remain in the UK. Maybe Ofcom is a bit shy about requiring onerous and unviable red tape of British companies trying to compete with large numbers of foreign companies that operate with a massive commercial advantage of not having age verification.

Ofcom do however note that these censorship rules are a stop gap until a wider scoped 'online harms' censorship regime which will start up in the next couple of years.

Ofcom writes:

Video-sharing platforms (VSPs) are a type of online video service which allows users to upload and share videos with members of the public.

From 1 November 2020, UK-established VSPs will be required to comply with new rules around protecting users from harmful content.

The main purpose of the new regulatory regime is to protect consumers who engage with VSPs from the risk of viewing harmful content. Providers must have appropriate measures in place to protect minors from content which might impair their physical, mental or moral development; and to protect the general public from criminal content and material likely to incite violence or hatred.

Ofcom has published a short guide outlining the new statutory requirements on providers. The guide is intended to assist platforms to determine whether they fall in scope of the new regime and to understand what providers need to do to ensure their services are compliant.

The guide also explains how Ofcom expects to approach its new duties in the period leading up to the publication of further guidance on the risk of harms and appropriate measures, which we will consult on in early 2021.

Ofcom will also be consulting on guidance on scope and jurisdiction later in 2020. VSP providers will be required to notify their services to Ofcom from 6 April 2021 and we expect to have the final guidance in place ahead of this time.

 

 

Offsite Article: New online harm legislation is a threat to free speech...


Link Here 28th September 2020
There is a problem online and it is causing real harm, but banning language rather than engaging in education sounds like a political fix rather than an actual solution. By Ruth Smeeth, former MP and CEO of Index on Censorship

See article from independent.co.uk

 

 

Offsite Article: Censorship is the greatest online harm...


Link Here 24th September 2020
The UK government is planning a shocking clampdown on free speech online. By Radomir Tylecote

See article from spiked-online.com

 

 

Extract: Why is the government pushing unprecedented online censorship?...

Official plans are an authoritarian threat to our freedom of speech, and would prove a nasty surprise to most internet users. By Radomir Tylecote


Link Here 18th September 2020

The UK Government's 'Online Harms' plans will lead to sweeping online censorship unprecedented in a democracy. Some of the harms the plans describe are vague, like unacceptable content and disinformation. The new regulations will prohibit material that may directly or indirectly cause harm even if it is not necessarily illegal.

In other words, the regulator will be empowered to censor lawful content, a huge infringement on our freedoms. The White Paper singled out offensive material, as if giving offence were a harm the state must protect the public from. In fact, the White Paper does not properly define harm or hate speech, but empowers a future regulator to do so. Failure to define harm means the definition may be outsourced to the most vocal activists, who see in the new regulator a chance to ban opinions they don't like.

The government claims its proposals are inspired by Germany's 2017 NetzDG law. But Human Rights Watch has said the law turns private companies into overzealous censors and called on Germany to scrap it. NetzDG's other fans include President Lukashenko of Belarus, who cited it to justify a 2017 clampdown on dissent. Vladimir Putin's United Russia Party cited NetzDG as the model for its internet law. So did Venezuela. Chillingly, the plans bear a striking similarity to some of Beijing's internet censorship policies. The Cyberspace Administration of China censors rumours because they cause social harms.

 

 

Newspaper censors...

The Daily Mail reports that the UK government intends to include newspaper websites in its proposed internet censorship regime


Link Here 30th August 2020
Up until now, the UK government has always indicated that newspaper websites would not be caught up in the new internet censorship regime proposed in the Government's Online Harms white paper.

However it now seems that the government has backtracked, lest every website claims to be a news service.

The Daily Mail reports that Julian Knight, chairman of the Commons Digital, Culture, Media and Sport Committee, has written to Culture Minister John Whittingdale over the proposed laws, after Home Office lawyers claimed that granting a 'publishers exemption' would create loopholes. One source close to the ministerial arguments over the proposed laws said:

Government lawyers are arguing that the publishers exemption would allow just anyone to claim it, so for instance you would have The Isis Times being able to distribute beheading videos.

The Tory MP Julian Knight told Whittingdale that Ministers in both DCMS and the Home Office should resolve the impasse by allowing an exemption for authenticated and reliable news sources.

The Government has yet to respond, amid concerns that any action may be delayed by wrangling over legislation to stop harmful online material and fears that antagonising powerful American-owned online platforms might jeopardise post-Brexit trade talks with the US.

 

 

Toxic culture...

Former culture secretary Jeremy Wright sets up parliamentary group to campaign for internet censorship


Link Here 22nd August 2020
Former culture secretary Jeremy Wright is setting up a parliamentary group (an all-party parliamentary group, APPG) to campaign for internet censorship.

Wright, who drew up the Government's white paper proposing strict sanctions on tech platforms that fail to protect users under a duty of care, is particularly calling for censorship powers to block, ban, fine or restrict apps and websites considered undesirable by the proposed internet censor, Ofcom. Wright said:

There needs to be a lot more clubs in the bag for the regulator than just fines, he said. I do think we need to consider criminal liability for individual (tech company) directors where it can be demonstrated.

He also felt the regulator should have powers of ISP blocking, which would effectively bar an app from the UK, in cases of companies repeatedly and egregiously refusing to comply with censorship rules. He said:

I do accept the chances of WhatsApp being turned off are remote. Although frankly, there may be circumstances where that may be the right thing to do and we shouldn't take it off the table.

Wright is founding the APPG alongside crossbench peer and children's digital rights campaigner Beeban Kidron, and the group has already attracted supporters, including three other former culture secretaries: Baroness Nicky Morgan, Karen Bradley and Maria Miller, as well as former Health and Foreign Secretary Jeremy Hunt.

 

 

Wrong think...

Labour demands the faster implementation of internet censorship


Link Here 28th July 2020
More censorship legislation is needed to protect people online after social media giants' failure to tackle hate speech on their websites, claims the Labour Party.

Jo Stevens, shadow secretary of state for digital, culture, media and sport, claimed the UK desperately needed legislation forcing platforms to act because self-regulation isn't working.

The Labour party is accusing the Government of delaying the introduction of an online harms bill to protect Internet users. It comes after politicians and campaigners condemned Twitter for being too slow to remove anti-Semitic tweets by rapper Wiley.

The Mayor of London Sadiq Khan said he has written to Instagram and Twitter to make it clear that they need to act immediately to remove social media posts that Labour does not like.

 

 

A Tale of Two Committees...

Another excellent blog post analysing current thinking as the government defines internet censorship to be introduced in the Online Harms Bill. By Graham Smith


Link Here 26th May 2020

 

 

 

Harmful censorship...

Ministers report on the timetable for the Online Harms internet censorship bill


Link Here 14th May 2020
The DCMS minister for censorship, Caroline Dinenage, and the Home Office minister in the House of Lords, Susan Williams, were quizzed by Parliament's home affairs committee on the progress of the Online Harms Bill.

Caroline Dinenage in particular gave the impression that the massive scope of the bill includes several issues that have not yet been fully thought through. The government does not yet seem able to provide a finalised timetable.

Dinenage told the home affairs committee that she could not commit to introducing the new laws in Parliament in the current session. She said it was an aspiration or intention rather than a commitment as pledged by her predecessor.

She said the government's final consultation response outlining its plans would probably not be published until the autumn, more than 18 months after the White Paper in 2019 and more than two and a half years after the green paper.

Julian Knight, Conservative chair of the culture committee, said:

If you don't do it in 2021, then it would have to go through the whole process and it could be 2023 before it is on the statute book, with implementation in 2024. Given we have been working on this through the last Parliament, that is not good enough.

The disinformation online about coronavirus underlines why we need this legislation. Unless we can get the architecture in place, we will see further instances of serious erosion of public trust and even damage to the fabric of society.

Dinenage disclosed that the new internet censor, probably Ofcom, would initially be paid for by the taxpayer before shifting all funding to the tech industry.

 

 

Presumably wary of the possibility of bad press about internet censorship...

Ofcom director claims that Ofcom aren't out to censor the press


Link Here 28th February 2020

 

 

 

Extract: Online harms harms trade negotiations...

Eye-watering fines or jailing directors for not protecting kids from perceived online social media harms isn't sitting comfortably with negotiating a free trade deal with the US


Link Here 23rd February 2020

A Times article has popped up under the headline Boris Johnson set to water down curbs on tech giants.

It had all the hallmarks of an insider briefing, opening with the following:

The prime minister is preparing to soften plans for sanctions on social media companies amid concerns about a backlash from tech giants.

There is a very pro-tech lobby in No 10, a well-placed source said. They got spooked by some of the coverage around online harms and raised concerns about the reaction of the technology companies. There is a real nervousness about it.

Read the full article from johncarr.blog

 

 

Offsite Article: This is state censorship of the internet...


Link Here 20th February 2020
UK government plans to tackle online harms pose a huge threat to free speech. By Andrew Tettenborn

See article from spiked-online.com

 

 

New government internet censors...

Oliver Dowden takes over as the Culture Secretary, Julian Knight takes over the chair of the DCMS Select Committee and Ofcom is appointed as the AVMS internet censor


Link Here 16th February 2020
Oliver Dowden was appointed Secretary of State for Digital, Culture, Media and Sport on 13 February 2020.

He was previously Paymaster General and Minister for the Cabinet Office, and before that, Parliamentary Secretary at the Cabinet Office. He was elected Conservative MP for Hertsmere in May 2015.

The previous Culture Secretary Nicky Morgan will now be spending more time with her family.

There's been no suggestion that Dowden will diverge from the government path on setting out a new internet censorship regime as outlined in its Online Harms white paper.

Perhaps another parliamentary appointment that may be relevant is that Julian Knight has taken over the Chair of the DCMS Select Committee, the Parliamentary scrutiny body overseeing the DCMS.

Knight seems quite keen on the internet censorship idea and will surely be spurring on the DCMS.

And finally, one more censorship appointment was announced by the Government: Ofcom has been appointed to regulate video-sharing platforms under the Audiovisual Media Services Directive.

 Matt Warman, The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport announced:

We also yesterday appointed Ofcom to regulate video-sharing platforms under the audiovisual media services directive, which aims to reduce harmful content on these sites. That will provide quicker protection for some harms and activities and will act as a stepping stone to the full online harms regulatory framework.

In fact this censorship process is set to start in September 2020, and Ofcom has already produced its solution, which shadows the age verification requirements of the Digital Economy Act. It may now need rethinking, as some of the enforcement mechanisms, such as ISP blocking, are no longer on the table. The mechanism also only applies to British based online adult companies providing online video, of which there are hardly any left after previously being destroyed by the ATVOD regime.

 

 

Offsite Article: The Porn Block Failed...


Link Here 15th February 2020
Now the Next Ofcom Censorship Bandwagon Begins. By Jerry Barnett

See article from sexandcensorship.org

 

 

The fundamental online harm is for British people to speak freely amongst themselves...

The Government will effectively ban British websites from having forums or comment sections by imposing onerous, vague and expensive censorship requirements on those that defiantly continue.


Link Here 12th February 2020
The Government has signalled its approach to introducing internet censorship in a government response to consultation contributions about the Online Harms white paper. A more detailed paper will follow in the spring.

The Government has outlined onerous, vague and expensive censorship requirements on any British website that lets its users post content including speech. Any website that takes down its forums and comment sections etc will escape the nastiness of the new law.

The idea seems to be to force all speech onto a few US and Chinese social media websites that can handle the extensive censorship requirements of the British Government. No doubt this will give a market opportunity for the US and Chinese internet giants to start charging for forcibly moderated and censored interaction.

The Government has more or less committed to appointing Ofcom as the state internet censor who will be able to impose massive fines on companies and their fall guy directors who allow speech that the government doesn't like.

On a slightly more positive note, the government seems to have narrowed down its censorship scope from any conceivable thing that could be considered a harm to someone somewhere into a more manageable set that can be defined as harms to children.

The introductory sections of the document read:

Executive summary

1. The Online Harms White Paper set out the intention to improve protections for users online through the introduction of a new duty of care on companies and an independent regulator responsible for overseeing this framework. The White Paper proposed that this regulation follow a proportionate and risk-based approach, and that the duty of care be designed to ensure that all companies have appropriate systems and processes in place to react to concerns over harmful content and improve the safety of their users - from effective complaint mechanisms to transparent decision-making over actions taken in response to reports of harm.

2. The consultation ran from 8 April 2019 to 1 July 2019. It received over 2,400 responses ranging from companies in the technology industry including large tech giants and small and medium sized enterprises, academics, think tanks, children's charities, rights groups, publishers, governmental organisations and individuals. In parallel to the consultation process, we have undertaken extensive engagement over the last 12 months with representatives from industry, civil society and others. This engagement is reflected in the response.

3. This initial government response provides an overview of the consultation responses and wider engagement on the proposals in the White Paper. It includes an in-depth breakdown of the responses to each of the 18 consultation questions asked in relation to the White Paper proposals, and an overview of the feedback in response to our engagement with stakeholders. This document forms an iterative part of the policy development process. We are committed to taking a deliberative and open approach to ensure that we get the detail of this complex and novel policy right. While it does not provide a detailed update on all policy proposals, it does give an indication of our direction of travel in a number of key areas raised as overarching concern across some responses.

4. In particular, while the risk-based and proportionate approach proposed by the White Paper was positively received by those we consulted with, written responses and our engagement highlighted questions over a number of areas, including freedom of expression and the businesses in scope of the duty of care. Having carefully considered the information gained during this process, we have made a number of developments to our policies. These are clarified in the 'Our Response' section below.

5. This consultation has been a critical part of the development of this policy and we are grateful to those who took part. This feedback is being factored into the development of this policy, and we will continue to engage with users, industry and civil society as we continue to refine our policies ahead of publication of the full policy response. We believe that an agile and proportionate approach to regulation, developed in collaboration with stakeholders, will strengthen a free and open internet by providing a framework that builds public trust, while encouraging innovation and providing confidence to investors.

Our response Freedom of expression

1. The consultation responses indicated that some respondents were concerned that the proposals could impact freedom of expression online. We recognise the critical importance of freedom of expression, both as a fundamental right in itself and as an essential enabler of the full range of other human rights protected by UK and international law. As a result, the overarching principle of the regulation of online harms is to protect users' rights online, including the rights of children and freedom of expression. Safeguards for freedom of expression have been built in throughout the framework. Rather than requiring the removal of specific pieces of legal content, regulation will focus on the wider systems and processes that platforms have in place to deal with online harms, while maintaining a proportionate and risk-based approach.

2. To ensure protections for freedom of expression, regulation will establish differentiated expectations on companies for illegal content and activity, versus conduct that is not illegal but has the potential to cause harm. Regulation will therefore not force companies to remove specific pieces of legal content. The new regulatory framework will instead require companies, where relevant, to explicitly state what content and behaviour they deem to be acceptable on their sites and enforce this consistently and transparently. All companies in scope will need to ensure a higher level of protection for children, and take reasonable steps to protect them from inappropriate or harmful content.

3. Services in scope of the regulation will need to ensure that illegal content is removed expeditiously and that the risk of it appearing is minimised by effective systems. Reflecting the threat to national security and the physical safety of children, companies will be required to take particularly robust action to tackle terrorist content and online child sexual exploitation and abuse.

4. Recognising concerns about freedom of expression, the regulator will not investigate or adjudicate on individual complaints. Companies will be able to decide what type of legal content or behaviour is acceptable on their services, but must take reasonable steps to protect children from harm. They will need to set this out in clear and accessible terms and conditions and enforce these effectively, consistently and transparently. The proposed approach will improve transparency for users about which content is and is not acceptable on different platforms, and will enhance users' ability to challenge removal of content where this occurs.

5. Companies will be required to have effective and proportionate user redress mechanisms which will enable users to report harmful content and to challenge content takedown where necessary. This will give users clearer, more effective and more accessible avenues to question content takedown, which is an important safeguard for the right to freedom of expression. These processes will need to be transparent, in line with terms and conditions, and consistently applied.

Ensuring clarity for businesses

6. We recognise the need for businesses to have certainty, and will ensure that guidance is provided to help businesses understand potential risks arising from different types of service, and the actions that businesses would need to take to comply with the duty of care as a result. We will ensure that the regulator consults with relevant stakeholders to ensure the guidance is clear and practicable.

Businesses in scope

7. The legislation will only apply to companies that provide services or use functionality on their websites which facilitate the sharing of user generated content or user interactions, for example through comments, forums or video sharing. Our assessment is that only a very small proportion of UK businesses (estimated to account for less than 5%) fit within that definition. To ensure clarity, guidance will be provided by the regulator to help businesses understand whether or not the services they provide, or the functionality contained on their websites, would fall within the scope of the regulation.

8. A business is not brought into scope of the regulation simply because it has a social media page. Equally, a business would not be brought into scope purely by providing referral or discount codes on its website to be shared with other potential customers on social media. It would be the social media platform hosting the content that is in scope, not the business using its services to advertise or promote its company. To be in scope, a business would have to operate its own website with the functionality to enable the sharing of user-generated content or user interactions. We will introduce this legislation proportionately, minimising the regulatory burden on small businesses. Most small businesses, where there is a lower risk of harm occurring, will not have to make disproportionately burdensome changes to their services to be compliant with the proposed regulation.

9. Regulation must be proportionate and based on evidence of risk of harm and what can feasibly be expected of companies. We anticipate that the regulator would assess the business impacts of any new requirements it introduces. Final policy positions on proportionality will, therefore, align with the evidence of risk of harm and impact to business. Business-to-business services have very limited opportunities to prevent harm occurring to individuals and as such will be out of scope of regulation.

Identity of the regulator

11. We are minded to make Ofcom the new regulator, in preference to giving this function to a new body or to another existing organisation. This preference is based on its organisational experience, robustness, and experience of delivering challenging, high-profile remits across a range of sectors. Ofcom is a well-established and experienced regulator, recently assuming high profile roles such as regulation of the BBC. Ofcom's focus on the communications sector means it already has relationships with many of the major players in the online arena, and its spectrum licensing duties mean that it is practised at dealing with large numbers of small businesses.

12. We judge that such a role is best served by an existing regulator with a proven track record of experience, expertise and credibility. We think that the best fit for this role is Ofcom, both in terms of policy alignment and organisational experience - for instance, in their existing work, Ofcom already takes the risk-based approach that we expect the online harms regulator will need to employ.

Transparency

13. Effective transparency reporting will help ensure that content removal is well-founded and freedom of expression is protected. In particular, increasing transparency around the reasons behind, and prevalence of, content removal may address concerns about some companies' existing processes, which have in some cases been criticised for being opaque and hard to challenge.

14. The government is committed to ensuring that conversations about this policy are ongoing, and that stakeholders are being engaged to mitigate concerns. In order to achieve this, we have recently established a multi-stakeholder Transparency Working Group chaired by the Minister for Digital and Broadband which includes representation from all sides of the debate, including from industry and civil society. This group will feed into the government's transparency report, which was announced in the Online Harms White Paper and which we intend to publish in the coming months.

15. Some stakeholders expressed concerns about a potential 'one size fits all' approach to transparency, and the material costs for companies associated with reporting. In line with the overarching principles of the regulatory framework, the reporting requirements that a company may have to comply with will also vary in proportion with the type of service that is being provided, and the risk factors involved. To maintain a proportionate and risk-based approach, the regulator will apply minimum thresholds in determining the level of detail that an in-scope business would need to provide in its transparency reporting, or whether it would need to produce reports at all.

Ensuring that the regulator acts proportionately

16. The consideration of freedom of expression is at the heart of our policy development, and we will ensure that appropriate safeguards are included throughout the legislation. By taking action to address harmful online behaviours, we are confident that our approach will support more people to enjoy their right to freedom of expression and participate in online discussions.

17. At the same time, we also remain confident that the proposals will not place an undue burden on business. Companies will be expected to take reasonable and proportionate steps to protect users. This will vary according, first and foremost, to the organisation's size and the resources available to it, as well as to the risk associated with the service provided. To ensure clarity about how the duty of care could be fulfilled, we will ensure the regulation and codes of practice are sufficiently clear about the applicable expectations on business, including where businesses are exempt from certain requirements due to their size or risk.

18. This will help companies to comply with the legislation, and to feel confident that they have done so appropriately.

Enforcement

19. We recognise the importance of the regulator having a range of enforcement powers that it uses in a fair, proportionate and transparent way. It is equally essential that company executives are sufficiently incentivised to take online safety seriously and that the regulator can take action when they fail to do so. We are considering the responses to the consultation on senior management liability and business disruption measures and will set out our final policy position in the Spring.

Protection of children

20. Under our proposals we expect companies to use a proportionate range of tools including age assurance, and age verification technologies to prevent children from accessing age-inappropriate content and to protect them from other harms. This would achieve our objective of protecting children from online pornography, and would also fulfil the aims of the Digital Economy Act.

 

 

I don't believe the government's new internet harm vaccine will work!...

The UK government has been briefing the press about its upcoming internet censorship bill


Link Here6th February 2020
Full story: Online Harms White Paper...UK Government seeks to censor social media
The UK government has hinted at its thoughts on its internet censorship plans and has also been giving clues about the schedule.

A first announcement seems to be due this month. It seems that the government is planning a summer bill and implementation within about 18 months.

The plans are set to be discussed in Cabinet on Thursday and are due to be launched to coincide with Safer Internet Day next Tuesday when Baroness Morgan will also publish results of a consultation on last year's White Paper on online harms.

The unelected Nicky Morgan proposes that the new regime should mirror regulation in the financial sector, known as senior management liability, where firms have to appoint a fall guy director to take personal responsibility for ensuring they meet their legal duties. They face fines and criminal prosecution for breaches.

Ofcom will advise on potential sanctions against the directors, ranging from enforcement notices and professional disqualification to fines and criminal prosecution. Under the plans, Ofcom will also draw up legally enforceable codes of practice setting out what the social media firms will be expected to do to protect users from loosely defined online harms that may not even be illegal.

Other legal harms to be covered by codes are expected to include disinformation that causes public harm such as anti-vaccine propaganda, self-harm, harassment, cyberbullying, violence and pornography where there will be tougher rules on age verification to bar children.

Tellingly proposals to include real and actual financial harms such as fraud in the codes have been dropped.

Ministers have yet to decide whether to give the internet censor the power to block website access for UK internet users, but this option seems out of favour, maybe because it results in massive numbers of people moving to the encrypted internet, which makes it harder for the authorities to snoop on people's internet activity.

 

 

Offsite Article: Harmful government...


Link Here26th January 2020
Full story: Online Harms White Paper...UK Government seeks to censor social media
Internet regulation is necessary but an overzealous Online Harms bill could harm our rights. By Michael Drury and Julian Hayes

See article from euronews.com

 

 

Harmful haste when previous law failures suggest more careful consideration is required...

Sky boss supports a bill just launched in the House of Lords to hasten the appointment of Ofcom as the UK's internet censor


Link Here17th January 2020
Full story: Online Harms White Paper...UK Government seeks to censor social media
Sky's chief executive Jeremy Darroch has urged the Government to speed up its plans for an online censor as a bill to appoint Ofcom to the job was introduced in the House of Lords on Tuesday.

Darroch has written to all MPs asking for their support in establishing an internet censor that will tackle supposed online harms.

Darroch's beef seems to be what he described as the prolific spread of misinformation, online abuse and fake news in last month's general election. He claimed it had shown the damage that unregulated online platforms are doing to our society.

A DCMS spokesperson declined to say how soon it may be before a draft bill is introduced, but Culture Secretary Nicky Morgan pledged in a speech yesterday to develop a media literacy strategy to be published in the summer, which is expected to come before the online harms legislation.

Johnson plans to precede the online harms bill with interim codes of practice ordering tech companies to clamp down on use of their platforms by terrorists and those engaged in child sexual abuse and exploitation.

Tom McNally's Online Harms Reduction Regulator (Report) Bill started its journey in the House of Lords on Tuesday. He has said he prepared the bill to keep up a momentum I fear may be lost and to provide a platform for wider public debate. The bill appoints Ofcom as the UK's internet censor and tasks it with preparing for the introduction of a statutory duty of care obligation for online platforms. Ofcom would have to prepare a report with recommendations on how this should be done, and the Government would be forced to produce its draft bill within a year of the publication of this report.
 

 

 

Offsite Article: Britain's Digital Nanny State...


Link Here 7th January 2020
Full story: Online Harms White Paper...UK Government seeks to censor social media
The way in which the UK is approaching the regulation of social media will undermine privacy and freedom of expression and have a chilling effect on Internet use by everyone in Britain. By Bill Dutton

See article from billdutton.me

 

 

The Fall Guy...

The Government's Online Harms bill will require foreign social media companies to appoint a token fall guy in Britain who will be jailed should the company fail in its duty of care. I wonder what the salary will be?


Link Here31st December 2019
Full story: Online Harms White Paper...UK Government seeks to censor social media
The government is pushing forward with an internet censorship bill which will punish people and companies for getting it wrong, without the expense and trouble of trying to dictate rules on what is allowed.

In an interesting development, the Times is reporting that the government wants to introduce a "senior management liability", under which executives could be held personally responsible for breaches of standards. US tech giants would be required to appoint a British-based director, who would be accountable for any breaches of the censorship rules.

It seems a little unjust to prosecute a token fall guy who is likely to have absolutely no say in the day to day decisions made by a foreign company. Still it should be a very well paid job which hopefully includes lots of coverage for legal bills and a zero notice period allowing instant resignation at the first hint of trouble.

 

 

Queen's speech: the government seeks to dictate everybody's speech...

Britain's new government continues with its internet censorship policy outlined in the Online Harms white paper


Link Here19th December 2019
Full story: Online Harms White Paper...UK Government seeks to censor social media
The Government has reiterated its plans as outlined in the Online Harms white paper. It seems that the measures have been advanced somewhat as previous references to pre-legislative scrutiny have been deleted.

The Queen's Speech briefing paper details the government's legislative plan for the next two years:

Online Harms

“My ministers will develop legislation to improve internet safety for all.”

  • Britain is leading the world in developing a comprehensive regulatory regime to keep people safe online, protect children and other vulnerable users and ensure that there are no safe spaces for terrorists online.

  • The April 2019 Online Harms White Paper set out the Government’s plan for world-leading legislation to make the UK the safest place in the world to be online. The Government will continue work to develop this legislation, alongside ensuring that the UK remains one of the best places in the world for technology companies to operate.

  • The proposals, as set out in the White Paper were:

    ○ A new duty of care on companies towards their users, with an independent regulator to oversee this framework.
    ○ The Government wants to keep people safe online, but we want to do this in a proportionate way, ensuring that freedom of expression is upheld and promoted online, and that the value of a free and independent press is preserved.
    ○ The Government is seeking to do this by ensuring that companies have the right processes and systems in place to fulfil their obligations, rather than penalising them for individual instances of unacceptable content.

  • The public consultation on this has closed and the Government is analysing the responses and considering the issues raised. The Government is working closely with a variety of stakeholders, including technology companies and civil society groups, to understand their views.

  • The Government will prepare legislation to implement the final policy in response to the consultation.

  • Ahead of this legislation, the Government will publish interim codes of practice on tackling the use of the internet by terrorists and those engaged in child sexual abuse and exploitation. This will ensure companies take action now to tackle content that threatens our national security and the physical safety of children.

  • The Government will publish a media literacy strategy to empower users to stay safe online.

  • The Government will help start-ups and businesses to embed safety from the earliest stages of developing or updating their products and services, by publishing a Safety by Design framework.

  • The Government will carry out a review of the Gambling Act, with a particular focus on tackling issues around online loot boxes and credit card misuse.

  • What the Government has done so far:
     

    • The joint DCMS-Home Office Online Harms White Paper was published in April 2019. The Government also published the Social Media Code of Practice, setting out the actions that social media platforms should take to prevent bullying, insulting, intimidating and humiliating behaviours on their sites.

    • In November 2018 the Government established a new UK Council for Internet Safety. This expanded the scope of the UK Council for Child Internet Safety, and was guided by the Government's Internet Safety Strategy.

    • The UK has been championing international action on online safety. The Prime Minister used his speech at the United Nations General Assembly to champion the UK's work on online safety.

 

 

Offsite Article: UK shows how not to regulate tech...


Link Here 12th November 2019
Full story: Online Harms White Paper...UK Government seeks to censor social media
Nobody said it was easy. But it shouldn't be this hard either. By Amol Rajan BBC Media editor

See article from bbc.com

 

 

Maybe Google and its AI can do it cheaper...

Ofcom sets out its stall in a report finding that internet censorship, as per the Online Harms Bill, will be very vague, very open to unintended consequences, and presumably very expensive


Link Here 28th October 2019
Full story: Online Harms White Paper...UK Government seeks to censor social media
Ofcom writes:

We have today published an economic perspective on the challenges and opportunities in regulating online services.

Online services have revolutionised people's personal and working lives, generating significant benefits. But some of their features have the potential to cause harms to individuals and society. These can include exposure to harmful content or conduct, loss of privacy, data or security breaches, lack of competition, unfair business practices or harm to wellbeing. In May, our Online Nation report set out the benefits to consumers of being online and their concerns about potential online harm.

Today's paper aims to contribute to the discussion on how to address these harms effectively, drawing on Ofcom's experience as the UK communications regulator. It looks at the sources of online harms from an economic perspective, which can inform the broader policy assessment that policymakers and regulators may use to evidence and address them.

 

 

My Ministers will continue to develop proposals to extend internet censorship...

A summary of the Online Harms Bill as referenced in the Queen's Speech


Link Here15th October 2019
Full story: Online Harms White Paper...UK Government seeks to censor social media

The April 2019 Online Harms White Paper set out the Government's plan for world-leading legislation to make the UK the safest place in the world to be online.

The proposals, as set out in the White Paper were:

  • A new duty of care on companies towards their users, with an independent regulator to oversee this framework.

  • We want to keep people safe online, but we want to do this in a proportionate way, ensuring that freedom of expression is upheld and promoted online, and businesses do not face undue burdens.

  • We are seeking to do this by ensuring that companies have the right processes and systems in place to fulfil their obligations, rather than penalising them for individual instances of unacceptable content.

  • Our public consultation on this has closed and we are analysing the responses and considering the issues raised. We are working closely with a variety of stakeholders, including technology companies and civil society groups, to understand their views.

Next steps:

  • We will publish draft legislation for pre-legislative scrutiny.

  • Ahead of this legislation, the Government will publish work on tackling the use of the internet by terrorists and those engaged in child sexual abuse and exploitation, to ensure companies take action now to tackle content that threatens our national security and the physical safety of children.

  • We are also taking forward additional measures, including a media literacy strategy, to empower users to stay safe online. A Safety by Design framework will help start-ups and small businesses to embed safety during the development or update of their products and services.

 

 

Culture of censorship...

Culture Secretary makes a speech about censoring our internet along the lines of TV


Link Here19th September 2019
Full story: Online Harms White Paper...UK Government seeks to censor social media
Culture Secretary Nicky Morgan made the keynote address to the Royal Television Society at the University of Cambridge. She took the opportunity to announce that the government is considering how to censor the internet more in line with strict TV censorship laws.

She set the background by noting how toxic the internet has become. Politicians never seem to consider that a toxic response to politicians may be totally justified by the dreadful legislation being passed to marginalise, disempower and impoverish British people. She noted:

And this Government is determined to see a strong and successful future for our public service broadcasters and commercial broadcasters alike.

I really value the important contribution that they all make to our public life, at a time when our civil discourse is increasingly under strain.

Disinformation, fuelled by hermetically sealed online echo chambers, is threatening the foundations of truth that we all rely on.

And the tenor of public conversations, especially those on social media, has become increasingly toxic and hostile.

Later she spoke of work in progress to censor the internet along the lines of TV. She said:

The second area where we need to adapt is the support offered by the Government and regulators.

We need to make sure that regulations, some of which were developed in the analogue age, are fit for the new ways that people create and consume content.

While I welcome the growing role of video on demand services and the investment and consumer choice they bring, it is important that we have regulatory frameworks that reflect this new environment.

For example, whereas a programme airing on linear TV is subject to Ofcom's Broadcasting Code, and the audience protections it contains, a programme going out on most video on demand services is not subject to the same standards.

This does not provide the clarity and consistency that consumers would expect.

So I am interested in considering how regulation should change to reflect a changing sector.

 

 

A Bully's Charter...

MPs and campaigners call for 'misogyny' to be defined as an 'online harm' requiring censorship by social media. What could go wrong?


Link Here7th September 2019
Full story: Online Harms White Paper...UK Government seeks to censor social media

MPs and activists have urged the government to protect women through censorship. They write in a letter:

Women around the world are 27 times more likely to be harassed online than men. In Europe, 9 million girls have experienced some kind of online violence by the time they are 15 years old. In the UK, 21% of women have received threats of physical or sexual violence online. The basis of this abuse is often, though not exclusively, misogyny.

Misogyny online fuels misogyny offline. Abusive comments online can lead to violent behaviour in real life. Nearly a third of respondents to a Women's Aid survey said where threats had been made online from a partner or ex-partner, they were carried out. Along with physical abuse, misogyny online has a psychological impact. Half of girls aged 11-21 feel less able to share their views due to fear of online abuse, according to Girlguiding UK .

The government wants to make Britain the safest place in the world to be online, yet in the online harms white paper, abuse towards women online is categorised as harassment, with no clear consequences, whereas similar abuse on the grounds of race, religion or sexuality would trigger legal protections.

If we are to eradicate online harms, far greater emphasis in the government's efforts should be directed to the protection and empowerment of the internet's single largest victim group: women. That is why we back the campaign group Empower's calls for the forthcoming codes of practice to include and address the issue of misogyny by name, in the same way as they would address the issue of racism by name. Violence against women and girls online is not harassment. Violence against women and girls online is violence.

Ali Harris Chief executive, Equally Ours
Angela Smith MP Independent
Anne Novis Activist
Lorely Burt Liberal Democrat, House of Lords
Ruth Lister Labour, House of Lords
Barry Sheerman MP Labour
Caroline Lucas MP Green
Daniel Zeichner MP Labour
Darren Jones MP Labour
Diana Johnson MP Labour
Flo Clucas Chair, Liberal Democrat Women
Gay Collins Ambassador, 30% Club
Hannah Swirsky Campaigns officer, René Cassin
Joan Ryan MP Independent Group for Change
Joe Levenson Director of communications and campaigns, Young Women's Trust
Jonathan Harris House of Lords, Labour
Luciana Berger MP Liberal Democrats
Mandu Reid Leader, Women's Equality Party
Maya Fryer WebRoots Democracy
Preet Gill MP Labour
Sarah Mann Director, Friends, Families and Travellers
Siobhan Freegard Founder, Channel Mum
Jacqui Smith Empower

Offsite Patreon Comment: What will go wrong?

See subscription article from patreon.com

 

 

A new blasphemy law...

The National Secular Society warns that the Government's online harm plans could curb free expression


Link Here6th July 2019
Full story: Online Harms White Paper...UK Government seeks to censor social media

The National Secular Society has warned that government plans to require social media companies to censor hateful and offensive content could act as a de facto blasphemy law.

In its response to the government's white paper on online harms , the NSS said efforts to confront and challenge hateful speech and behaviour must not undermine free speech on religion.

The white paper outlines plans to create a regulator with the power to fine online platforms and block websites. The regulator will be required to create guidance for social media companies, outlining what constitutes hateful content online.

The guidance would include within that definition content which is not necessarily illegal, content which may directly or indirectly cause harm to other users, and some offensive material.

The NSS said censoring content that could be considered offensive would severely restrict freedom of expression, including the freedom to criticise or satirise religion. The society added that the question of offence was an entirely subjective matter.

The NSS also noted that a requirement on companies to demonstrate 'continuous improvement' in tackling hateful material could encourage them to be more censorious.

The NSS also challenged a claim in the white paper that offending online is just as serious as that occurring offline. The NSS said this line lowered the threshold for hate crimes, because people's ability to commit such crimes is much more limited online than offline.

The society raised the example of a man who was recently arrested on suspicion of hate crime after publishing a video on Facebook of himself mocking Islamic prayer in a hospital prayer room. The NSS noted that threats of death and violence were made towards the man and were reported to the police, but no action appeared to have been taken against the perpetrators to date.

The NSS also criticised the government's definition of hate crime which is contained within the white paper. The definition says hate crimes include crimes demonstrating hostility on the grounds of an individual's actual or perceived race, religion, sexual orientation, disability or transgender identity.

The NSS said this definition was too broad, meaning any incident in which an individual demonstrates hostility toward another individual based on the listed characteristics could be treated as a hate crime.

The society said strong critics of religion or Christians who preach that gay people will go to Hell were examples of those who risk being charged with hate crimes.

NSS spokesperson Megan Manson said the white paper had given too much ground to those who attempt to shut down legitimate expression, including on religion.

The government should treat the fundamental right to free expression as a positive value in its attempts to promote social cohesion. Instead it has proposed cracking down on what people can say on social media, based largely on vague and broad definitions of what constitutes 'hateful' material. In the process it risks significantly undermining free expression for all and stirring social resentment.

Ministers must not treat the civil liberties of British citizens as an afterthought in their efforts to tackle online harms.

 

 

Offsite Article: How about exemptions from Government censorship for ALL internet forums?...


Link Here3rd July 2019
Full story: Online Harms White Paper...UK Government seeks to censor social media
Press Gazette opposes the government's internet censorship proposals in the Online Harms white paper but only calls for exemptions for the press itself

See article from pressgazette.co.uk

 

 

Offsite Article: The UK Is Trying to Censor the Internet...


Link Here 2nd July 2019
Full story: Online Harms White Paper...UK Government seeks to censor social media
Here's what the Online Harms White Paper means for UK internet users

See article from pcmag.com

 

 

Government consultation on its internet censorship plans...

Monday is the last day to respond and the Open Rights Group makes some suggestions


Link Here30th June 2019
Full story: Online Harms White Paper...UK Government seeks to censor social media
The Government is accepting public feedback on their plan until Monday 1 July. Send a message to their consultation using the Open Rights Group's tool before the end of Monday!

The Open Rights Group comments on the government censorship plans:

Online Harms: Blocking websites doesn't work -- use a rights-based approach instead

Blocking websites isn't working. It's not keeping children safe and it's stopping vulnerable people from accessing information they need. It's not the right approach to take on Online Harms.

This is the finding from our recent research into website blocking by mobile and broadband Internet providers. And yet, as part of its Internet regulation agenda, the UK Government wants to roll out even more blocking.

The Government's Online Harms White Paper is focused on making online companies fulfil a "duty of care" to protect users from "harmful content" -- two terms that remain troublingly ill-defined.

The paper proposes giving a regulator various punitive measures to use against companies that fail to fulfil this duty, including powers to block websites.

If this scheme comes into effect, it could lead to widespread automated blocking of legal content for people in the UK.

Mobile and broadband Internet providers have been blocking websites with parental control filters for five years. But through our Blocked project -- which detects incorrect website blocking -- we know that systems are still blocking far too many sites and far too many types of sites by mistake.

Thanks to website blocking, vulnerable people and under-18s are losing access to crucial information and support from websites including counselling, charity, school, and sexual health websites. Small businesses are losing customers. And website owners often don't know this is happening.

We've seen with parental control filters that blocking websites doesn't have the intended outcomes. It restricts access to legal, useful, and sometimes crucial information. It also does nothing to prevent people who are determined to get access to material on blocked websites, who often use VPNs to get around the filters. Other solutions like filters applied by a parent to a child's account on a device are more appropriate.

Unfortunately, instead of noting these problems inherent to website blocking by Internet providers and rolling back, the Government is pressing ahead with website blocking in other areas.

Blocking by Internet providers may not work for long. We are seeing a technical shift towards encrypted website address requests that will make this kind of website blocking by Internet providers much more difficult.

When I type a human-friendly web address such as openrightsgroup.org into a web browser and hit enter, my computer asks a Domain Name System (DNS) resolver for that website's computer-friendly IP address, which will look something like 46.43.36.233. My web browser can then use that computer-friendly address to load the website.

At the moment, most DNS requests are unencrypted. This allows mobile and broadband Internet providers to see which website I want to visit. If a website is on a blocklist, the system won't return the actual IP address to my computer. Instead, it will tell me that that site is blocked, or will tell my computer that the site doesn't exist. That stops me visiting the website and makes the block effective.

Increasingly, though, DNS requests are being encrypted. This provides much greater security for ordinary Internet users. It also makes website blocking by Internet providers incredibly difficult. Encrypted DNS is becoming widely available through Google's Android devices, on Mozilla's Firefox web browser and through Cloudflare's mobile application for Android and iOS. Other encrypted DNS services are also available.
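
As a rough sketch of the mechanism described above (our illustration, not part of the Open Rights Group's report, assuming Python 3 and Cloudflare's publicly documented dns-query JSON endpoint, with openrightsgroup.org used purely as an example name), the snippet below compares a plain DNS lookup, which is answered by whichever resolver the ISP has configured and can therefore be intercepted and rewritten, with a DNS-over-HTTPS query that travels inside an ordinary encrypted HTTPS connection:

    import json
    import socket
    import urllib.request

    DOMAIN = "openrightsgroup.org"  # example name only

    # Plain DNS: the query goes unencrypted to the ISP-configured resolver,
    # which can see the requested name and could answer with a block page
    # address or "no such domain" if the site is on a blocklist.
    print("Plain DNS answer:", socket.gethostbyname(DOMAIN))

    # DNS over HTTPS: the same question is sent inside an encrypted HTTPS
    # request to a third-party resolver (here Cloudflare's JSON endpoint),
    # so the ISP's resolver never sees it and cannot substitute an answer.
    url = "https://cloudflare-dns.com/dns-query?name=" + DOMAIN + "&type=A"
    request = urllib.request.Request(url, headers={"accept": "application/dns-json"})
    with urllib.request.urlopen(request, timeout=10) as response:
        reply = json.loads(response.read().decode("utf-8"))

    # Keep only A records (type 1) from the JSON answer section.
    answers = [record["data"] for record in reply.get("Answer", []) if record.get("type") == 1]
    print("DoH answers:", answers)

In the unencrypted case the block is effective because the resolver controls the answer; in the encrypted case the ISP would have to block the resolver itself or the destination IP address, which is exactly the technical shift described here.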

Our report DNS Security - Getting it Right discusses issues around encrypted DNS in more detail.

Blocking websites may be the Government's preferred tool to deal with social problems on the Internet but it doesn't work, both in policy terms and increasingly at a technical level as well.

The Government must accept that website blocking by mobile and broadband Internet providers is not the answer. They should concentrate instead on a rights-based approach to Internet regulation and on educational and social approaches that address the roots of complex societal issues.

Offsite Article: CyberLegal response to the Online Harms Consultation

30th June 2019. See article from cyberleagle.com

Speech is not a tripping hazard

 

 

UK Internet Regulation Part II...

Open Rights Group reports on how the Online Harms Bill will harm free speech, justice and liberty


Link Here 18th June 2019
Full story: Online Harms White Paper...UK Government seeks to censor social media

This report follows our research into current Internet content regulation efforts, which found a lack of accountable, balanced and independent procedures governing content removal, both formally and informally by the state.

There is a legacy of Internet regulation in the UK that does not comply with due process, fairness and fundamental rights requirements. This includes: bulk domain suspensions by Nominet at police request without prior authorisation; the lack of an independent legal authorisation process for Internet Watch Foundation (IWF) blocking at Internet Service Providers (ISPs) and in the future by the British Board of Film Classification (BBFC), as well as for Counter-Terrorism Internet Referral Unit (CTIRU) notifications to platforms of illegal content for takedown. These were detailed in our previous report.

The UK government now proposes new controls on Internet content, claiming that it wants to ensure the same rules online as offline. It says it wants harmful content removed, while respecting human rights and protecting free expression.

Yet proposals in the DCMS/Home Office White Paper on Online Harms will create incentives for Internet platforms such as Google, Twitter and Facebook to remove content without legal processes. This is not the same rules online as offline. It instead implies a privatisation of justice online, with the assumption that corporate policing must replace public justice for reasons of convenience. This goes against the advice of human rights standards that government has itself agreed to and against the advice of UN Special Rapporteurs.

The government as yet has not proposed any means to define the harms it seeks to address, nor identified any objective evidence base to show what in fact needs to be addressed. It instead merely states that various harms exist in society. The harms it lists are often vague and general. The types of content specified may be harmful in certain circumstances, but even with an assumption that some content is genuinely harmful, there remains no attempt to show how any restriction on that content might work in law. Instead, it appears that platforms will be expected to remove swathes of legal-but-unwanted content, with an as-yet-unidentified regulator given a broad duty to decide if a risk of harm exists. Legal action would follow non-compliance by a platform. The result is the state proposing censorship and sanctions for actors publishing material that it is legal to publish.

 

 

Offsite Article: Christian Concerns...


Link Here15th June 2019
Full story: Online Harms White Paper...UK Government seeks to censor social media
Who'd have thought that a Christian Campaign Group would be calling on its members to criticise the government's internet censorship bill in a consultation

See article from christianconcern.com

 

 

Updated: Tech companies criticise the government's Online Harms white paper...

The harms will be that British tech businesses will be destroyed so that politicians can look good for 'protecting the children'


Link Here2nd June 2019
Full story: Online Harms White Paper...UK Government seeks to censor social media
A scathing new report, seen by City A.M. and authored by the Internet Association (IA), which represents online firms including Google, Facebook and Twitter, has outlined a string of major concerns with plans laid out in the government Online Harms white paper last month.

The Online Harms white paper outlines a large number of internet censorship proposals hiding under the vague terminology of 'duties of care'.

Under the proposals, social media sites could face hefty fines or even a ban if they fail to tackle online harms such as age-inappropriate content, insults, harassment, terrorist content and of course 'fake news'.

But the IA has branded the measures unclear and warned they could damage the UK's booming tech sector, with smaller businesses disproportionately affected.  IA executive director Daniel Dyball said:

Internet companies share the ambition to make the UK one of the safest places in the world to be online, but in its current form the online harms white paper will not deliver that.

The proposals present real risks and challenges to the thriving British tech sector, and will not solve the problems identified.

The IA slammed the white paper over its use of the term duty of care, which it said would create legal uncertainty and be unmanageable in practice.

The lobby group also called for a more precise definition of which online services would be covered by regulation and greater clarity over what constitutes an online harm. In addition, the IA said the proposed measures could raise serious unintended consequences for freedom of expression.

And while most internet users favour tighter rules in some areas, particularly social media, people also recognise the importance of protecting free speech -- which is one of the internet's great strengths.

Update: Main points

2nd June 2019. See article from uk.internetassociation.org

The Internet Association paper sets out five key concerns held by internet companies:

  • "Duty of Care" has a specific legal meaning that does not align with the obligations proposed in the White Paper, creating legal uncertainty, and would be unmanageable;
  • The scope of the services covered by regulation needs to be defined differently, and more closely related to the harms to be addressed;
  • The category of "harms with a less clear definition" raises significant questions and concerns about clarity and democratic process;
  • The proposed code of practice obligations raise potentially dangerous unintended consequences for freedom of expression;
  • The proposed measures will damage the UK digital sector, especially start-ups, micro-businesses and small- and medium-sized enterprises (SMEs), and slow innovation.

 

 

Offsite Article: Careless lawmaking...


Link Here6th May 2019
Full story: Online Harms White Paper...UK Government seeks to censor social media
Detailed legal analysis of Online Harms white paper does not impress

See article from cyberleagle.com

 

 

Extract: Lords of Censorship...

Lords debate about Online Harms sees peers line up as supporters of internet censorship and each adds their own little pet suggestions for even more censorship


Link Here1st May 2019
Full story: Online Harms White Paper...UK Government seeks to censor social media
The House of Lords saw a pre-legislation debate about the government's Online Harms white paper. Peers from all parties queued up to add their praise for internet censorship. And don't even think that maybe the LibDems may be a little more appreciative of free speech and a little less in favour of state censorship. Don't dream! All the lords that spoke were gagging for it...censorship that is.

And support for the internet censorship in the white paper wasn't enough. Many of the speakers presumed to add on their own pet ideas for even more censorship.

I did spot one piece of information that was new to me. It seems that the IWF have extended their remit to include cartoon child porn as material they work against.

Elspeth Howe said during the debate:

I am very pleased that, since the debates at the end of last year, the Internet Watch Foundation has adopted a new non-photographic images policy and URL block list, so that websites that contain these images can be blocked by IWF members. It allows for network blocking of non-photographic images to be applied to filtering solutions, and it can prevent pages containing non-photographic images being shown in online search engine results. In 2017, 3,471 reports of alleged non-photographic images of child sexual abuse were made to the IWF; the figure for 2018 was double that, at 7,091 alleged reports. The new IWF policy was introduced only in February, so it is early days to see whether this will be a success. The IWF is unable to remove content unless that content originates in the UK, which of course is rare. The IWF offers this list on a voluntary basis, not a statutory basis as would occur under the Digital Economy Act. Can the Minister please keep the House informed about the success of the new policy and, if necessary, address the loopholes in the legislative proposal arising from this White Paper?

Anyway read the full debate from hansard.parliament.uk

 

 

Offsite Article: User's Behaving Badly...


Link Here20th April 2019
Full story: Online Harms White Paper...UK Government seeks to censor social media
An interesting look at the government's Online Harms white paper proposing extensive internet censorship for the UK

See article from cyberleagle.com

 

 

More like China, Russia or North Korea...

Tory MPs line up to criticise their own government's totalitarian-style internet censorship proposals


Link Here 14th April 2019
Full story: Online Harms White Paper...UK Government seeks to censor social media

Ministers are facing a growing and deserved backlash against draconian new web laws which will lead to totalitarian-style censorship.

The stated aim of the Online Harms White Paper is to target offensive material such as terrorists' beheading videos. But under the document's provisions, the UK internet censor would have complete discretion to decide what is harmful, hateful or bullying -- potentially including coverage of contentious issues such as transgender rights.

After MPs lined up to demand a rethink, Downing Street has put pressure on Culture Secretary Jeremy Wright to narrow the definition of harm in order to exclude typical editorial content.

MPs have been led by Jacob Rees-Mogg, who said last night that while it was obviously a worthwhile aim to rid the web of the evils of terrorist propaganda and child pornography, it should not be at the expense of crippling a free Press and gagging healthy public expression. He added that the regulator could be used as a tool of repression by a future Jeremy Corbyn-led government, saying:

Sadly, the Online Harms White Paper appears to give the Home Secretary of the day the power to decide the rules as to which content is considered palatable. Who is to say that less scrupulous governments in the future would not abuse this new power?

I fear this could have the unintended consequence of reputable newspaper websites being subjected to quasi-state control. British newspapers' freedom to hold authority to account is an essential bulwark of our democracy.

We must not now allow what amounts to a Leveson-style state-controlled regulator for the Press by the back door.

He was backed by Charles Walker, vice-chairman of the Tory Party's powerful backbench 1922 Committee, who said:

We need to protect people from the well-documented evils of the internet -- not in order to suppress views or opinions to which they might object.

In last week's Mail on Sunday, former Culture Secretary John Whittingdale warned that the legislation was more usually associated with autocratic regimes including those in China, Russia or North Korea.

Tory MP Philip Davies joined the criticism last night, saying:

Of course people need to be protected from the worst excesses of what takes place online. But equally, free speech in a free country is very, very important too. It's vital we strike the right balance. While I have every confidence that Sajid Javid as Home Secretary would strike that balance, can I have the same confidence that a future Marxist government would not abuse the proposed new powers?

And Tory MP Martin Vickers added:

While we must take action to curb the unregulated wild west of the internet, we must not introduce state control of the Press as a result.

 

 

Updated Comments: The UK Government harms the British people...

The press and campaigners call out the Online Harms white paper for what it is...censorship


Link Here 12th April 2019
Full story: Online Harms White Paper...UK Government seeks to censor social media
Newspapers and the press have generally given the new internet censorship proposals a justifiably negative reception:

The Guardian

See Internet crackdown raises fears for free speech in Britain from theguardian.com

Critics of the government's flagship internet regulation policy are warning it could lead to a North Korean-style censorship regime, where regulators decide which websites Britons are allowed to visit, because of how broad the proposals are.

The Daily Mail

See New internet regulation laws will lead to widespread censorship from dailymail.co.uk

Critics brand new internet regulation laws the most draconian crackdown in the Western democratic world as they warn it could threaten the freedom of speech of millions of Britons

The Independent

See UK's new internet plans could bring state censorship of the internet, campaigners warn from independent.co.uk

The government's new proposals to try and protect people from harm on the internet could actually create a huge censorship operation, campaigners have warned.

Index on Censorship

See Online harms proposals pose serious risks to freedom of expression from indexoncensorship.org

Index on Censorship has raised strong concerns about the government's focus on tackling unlawful and harmful online content, particularly since the publication of the Internet Safety Strategy Green Paper in 2017. In October 2018, Index published a joint statement with Global Partners Digital and Open Rights Group noting that any proposals that regulate content are likely to have a significant impact on the enjoyment and exercise of human rights online, particularly freedom of expression.

We have also met with officials from the Department for Digital, Culture, Media and Sport, as well as from the Home Office, to raise our thoughts and concerns.

With the publication of the Online Harms White Paper , we would like to reiterate our earlier points.

While we recognise the government's desire to tackle unlawful content online, the proposals mooted in the white paper -- including a new duty of care on social media platforms , a regulatory body , and even the fining and banning of social media platforms as a sanction -- pose serious risks to freedom of expression online.

These risks could put the United Kingdom in breach of its obligations to respect and promote the right to freedom of expression and information as set out in Article 19 of the International Covenant on Civil and Political Rights and Article 10 of the European Convention on Human Rights, amongst other international treaties.

Social media platforms are a key means for tens of millions of individuals in the United Kingdom to search for, receive, share and impart information, ideas and opinions. The scope of the right to freedom of expression includes speech which may be offensive, shocking or disturbing . The proposed responses for tackling online safety may lead to disproportionate amounts of legal speech being curtailed, undermining the right to freedom of expression.

In particular, we raise the following concerns related to the white paper:

  • Lack of evidence base

The wide range of different harms which the government is seeking to tackle in this policy process require different, tailored responses. Measures proposed must be underpinned by strong evidence, both of the likely scale of the harm and the measures' likely effectiveness. The evidence which formed the base of the Internet Safety Strategy Green Paper was highly variable in its quality. Any legislative or regulatory measures should be supported by clear and unambiguous evidence of their need and effectiveness.

  • Duty of care concerns/ problems with 'harm' definition

Index is concerned at the use of a duty of care regulatory approach. Although social media has often been compared to the public square, the duty of care model is not an exact fit because this would introduce regulation -- and restriction -- of speech between individuals based on criteria that are far broader than current law. A failure to accurately define "harmful" content risks incorporating legal speech, including political expression, expressions of religious views, expressions of sexuality and gender, and expression advocating on behalf of minority groups.

  • Risks in linking liability/sanctions to platforms over third party content

While well-meaning, proposals such as these contain serious risks, such as requiring or incentivising wide-sweeping removal of lawful and innocuous content. The imposition of time limits for removal, heavy sanctions for non-compliance or incentives to use automated content moderation processes only heighten this risk, as has been evidenced by the approach taken in Germany via its Network Enforcement Act (or NetzDG), where there is evidence of the over-removal of lawful content.

  • Lack of sufficient protections for freedom of expression.

The obligation to protect users' rights online that is included in the white paper gives insufficient weight to freedom of expression. A much clearer obligation to protect freedom of expression should guide development of future regulation.

In recognition of the UK's commitment to the multistakeholder model of internet governance, we hope all relevant stakeholders, including civil society experts on digital rights and freedom of expression, will be fully engaged throughout the development of the Online Harms bill.

Privacy International

See  PI's take on the UK government's new proposal to tackle "online harms" from privacyinternational.org

PI welcomes the UK government's commitment to investigating and holding companies to account. When it comes to regulating the internet, however, we must move with care. Failure to do so will introduce, rather than reduce, "online harms". A 12-week consultation on the proposals has also been launched today. PI plans to file a submission to the consultation as it relates to our work. Given the breadth of the proposals, PI calls on others respond to the consultation as well.

Here are our initial suggestions:

  • proceed with care: proposals of regulation of content on digital media platforms should be very carefully evaluated, given the high risks of negative impacts on expression, privacy and other human rights. This is a very complex challenge and we support the need for broad consultation before any legislation is put forward in this area.

  • do not lose sight of how data exploitation facilitates the harms identified in the report and ensure any new regulator works closely with others working to tackle these issues.

  • assess carefully the delegation of sole responsibility to companies as adjudicators of content. This would empower corporate judgment over content, which would have implications for human rights, particularly freedom of expression and privacy.

  • require that judicial or other independent authorities, rather than government agencies, are the final arbiters of decisions regarding what is posted online and enforce such decisions in a manner that is consistent with human rights norms.

  • assess the privacy implications of any demand for "proactive" monitoring of content in digital media platforms.

  • ensure that any requirement or expectation of deploying automated decision making/AI is in full compliance with existing human rights and data protection standards (which, for example, prohibit, with limited exceptions, relying on solely automated decisions, including profiling, when they significantly affect individuals).

  • ensure that company transparency reports include information related to how the content was targeted at users.

  • require companies to provide efficient reporting tools in multiple languages, to report on action taken with regard to content posted online. Reporting tools should be accessible, user-friendly, and easy to find. There should be full transparency regarding the complaint and redress mechanisms available and opportunities for civil society to take action.

Offsite Comment: Ridiculous Plan

10th April 2019. See article from techdirt.com

UK Now Proposes Ridiculous Plan To Fine Internet Companies For Vaguely Defined Harmful Content

Last week Australia rushed through a ridiculous bill to fine internet companies if they happen to host any abhorrent content. It appears the UK took one look at that nonsense and decided it wanted some too. On Monday it released a white paper calling for massive fines for internet companies for allowing any sort of online harms. To call the plan nonsense is being way too harsh to nonsense.

The plan would result in massive, widespread, totally unnecessary censorship solely for the sake of pretending to do something about the fact that some people sometimes do not so nice things online. And it will place all of the blame on the internet companies for the (vaguely defined) not so nice things that those companies' users might do online.

Read the full article from techdirt.com

Offsite Comment: Sajid Javid's new internet rules will have a chilling effect on free speech

11th April 2019. See article from spectator.co.uk by Toby Young

How can the government prohibit comments that might cause harm without defining what harm is?

Offsite Comment: Plain speaking from Chief Censor Sajid Javid

11th April 2019. See tweet from twitter.com

Letter to the Guardian: Online Harms white paper would make Chinese censors proud

11th April 2019. See article from theguardian.com

We agree with your characterisation of the online harms white paper as a flawed attempt to deal with serious problems (Regulating the internet demands clear thought about hard problems, Editorial, 9 April). However, we would draw your attention to several fundamental problems with the proposal which could be disastrous if it proceeds in its current form.

Firstly, the white paper proposes to regulate literally the entire internet, and censor anything non-compliant. This extends to blogs, file services, hosting platforms, cloud computing; nothing is out of scope.

Secondly, there are a number of undefined harms with no sense of scope or evidence thresholds to establish a need for action. The lawful speech of millions of people would be monitored, regulated and censored.

The result is an approach that would make China's state censors proud. It would be very likely to face legal challenge. It would give the UK the widest and most prolific internet censorship in an apparently functional democracy. A fundamental rethink is needed.

Antonia Byatt Director, English PEN,
Silkie Carlo Big Brother Watch
Thomas Hughes Executive director, Article 19
Jim Killock Executive director, Open Rights Group
Joy Hyvarinen Head of advocacy, Index on Censorship

Comment: The DCMS Online Harms Strategy must design in fundamental rights

12th April 2019. See article from openrightsgroup.org

Increasingly over the past year, DCMS has become fixated on the idea of imposing a duty of care on social media platforms, seeing this as a flexible and de-politicised way to emphasise the dangers of exposing children and young people to certain online content and make Facebook in particular liable for the uglier and darker side of its user-generated material.

DCMS talks a lot about the 'harm' that social media causes. But its proposals fail to explain how harmful impacts on free expression would be avoided.

On the positive side, the paper lists free expression online as a core value to be protected and addressed by the regulator. However, despite the apparent prominence of this value, the mechanisms to deliver this protection and the issues at play are not explored in any detail at all.

In many cases, online platforms already act as though they have a duty of care towards their users. Though the efficacy of such measures in practice is open to debate, terms and conditions, active moderation of posts and algorithmic choices about what content is pushed or downgraded are all geared towards ousting illegal activity and creating open and welcoming shared spaces. DCMS has not elaborated in the White Paper on what its proposed duty would entail. If it's drawn narrowly, so that it only bites when there is clear evidence of real, tangible harm and a reason to intervene, nothing much will change. However, if it's drawn widely, sweeping up too much content, it will start to act as a justification for widespread internet censorship.

If platforms are required to prevent potentially harmful content from being posted, this incentivises widespread prior restraint. Platforms can't always know in advance the real-world harm that online content might cause, nor can they accurately predict what people will say or do when on their platform. The only way to avoid liability is to impose wide-sweeping upload filters. Scaled implementation of this relies on automated decision-making and algorithms, which risks even greater speech restrictions given that machines are incapable of making nuanced distinctions or recognising parody or sarcasm.

DCMS's policy is underpinned by societally-positive intentions, but in its drive to make the internet "safe", the government seems not to recognise that ultimately its proposals don't regulate social media companies, they regulate social media users. The duty of care is ostensibly aimed at shielding children from danger and harm but it will in practice bite on adults too, wrapping society in cotton wool and curtailing a whole host of legal expression.

Although the scheme will have a statutory footing, its detail will depend on codes of practice drafted by the regulator. This makes it difficult to assess how the duty of care framework will ultimately play out.

The duty of care seems to be broadly about whether systemic interventions reduce overall "risk". But must the risk be always to an identifiable individual, or can it be broader - to identifiable vulnerable groups? To society as a whole? What evidence of harm will be required before platforms should intervene? These are all questions that presently remain unanswered.

DCMS's approach appears to be that it will be up to the regulator to answer these questions. But whilst a sensible regulator could take a minimalist view of the extent to which commercial decisions made by platforms should be interfered with, allowing government to distance itself from taking full responsibility over the fine detailing of this proposed scheme is a dangerous principle. It takes conversations about how to police the internet out of public view and democratic forums. It enables the government to opt not to create a transparent, judicially reviewable legislative framework. And it permits DCMS to light the touch-paper on a deeply problematic policy idea without having to wrestle with the practical reality of how that scheme will affect UK citizens' free speech, both in the immediate future and for years to come.

How the government decides to legislate and regulate in this instance will set a global norm.

The UK government is clearly keen to lead international efforts to regulate online content. It knows that if the outcome of the duty of care is to change the way social media platforms work, that change will apply worldwide. But to be a global leader, DCMS needs to stop basing policy on isolated issues and anecdotes and engage with a broader conversation about how we as a society want the internet to look. Otherwise, governments both repressive and democratic are likely to use the policy and regulatory model that emerges from this process as a blueprint for more widespread internet censorship.

The House of Lords report on the future of the internet, published in early March 2019, set out ten principles it considered should underpin digital policy-making, including the importance of protecting free expression. The consultation that this White Paper introduces offers a positive opportunity to collectively reflect, across industry, civil society, academia and government, on how the negative aspects of social media can be addressed and risks mitigated. If the government were to use this process to emphasise its support for the fundamental right to freedom of expression - and in a way that goes beyond mere expression of principle - this would also reverberate around the world, particularly at a time when press and journalistic freedom is under attack.

The White Paper expresses a clear desire for tech companies to "design in safety". As the process of consultation now begins, we call on DCMS to "design in fundamental rights". Freedom of expression is itself a framework, and must not be lightly glossed over. We welcome the opportunity to engage with DCMS further on this topic: before policy ideas become entrenched, the government should consider deeply whether these will truly achieve outcomes that are good for everyone.

 

 

Ensuring that the UK is the most censored place in the western world to be online...

Government introduces an enormous package of internet censorship proposals


Link Here 8th April 2019
Full story: Online Harms White Paper...UK Government seeks to censor social media
  The Government writes:

In the first online safety laws of their kind, social media companies and tech firms will be legally required to protect their users and face tough penalties if they do not comply.

As part of the Online Harms White Paper, a joint proposal from the Department for Digital, Culture, Media and Sport and Home Office, a new independent regulator will be introduced to ensure companies meet their responsibilities.

This will include a mandatory 'duty of care', which will require companies to take reasonable steps to keep their users safe and tackle illegal and harmful activity on their services. The regulator will have effective enforcement tools, and we are consulting on powers to issue substantial fines, block access to sites and potentially to impose liability on individual members of senior management.

A range of harms will be tackled as part of the Online Harms White Paper, including inciting violence and violent content, encouraging suicide, disinformation, cyber bullying and children accessing inappropriate material.

There will be stringent requirements for companies to take even tougher action to ensure they tackle terrorist and child sexual exploitation and abuse content.

The new proposed laws will apply to any company that allows users to share or discover user generated content or interact with each other online. This means a wide range of companies of all sizes are in scope, including social media platforms, file hosting sites, public discussion forums, messaging services, and search engines.

A regulator will be appointed to enforce the new framework. The Government is now consulting on whether the regulator should be a new or existing body. The regulator will be funded by industry in the medium term, and the Government is exploring options such as an industry levy to put it on a sustainable footing.

A 12 week consultation on the proposals has also been launched today. Once this concludes we will then set out the action we will take in developing our final proposals for legislation.

Tough new measures set out in the White Paper include:

  • A new statutory 'duty of care' to make companies take more responsibility for the safety of their users and tackle harm caused by content or activity on their services.

  • Further stringent requirements on tech companies to ensure child abuse and terrorist content is not disseminated online.

  • Giving a regulator the power to force social media platforms and others to publish annual transparency reports on the amount of harmful content on their platforms and what they are doing to address this.

  • Making companies respond to users' complaints, and act to address them quickly.

  • Codes of practice, issued by the regulator, which could include measures such as requirements to minimise the spread of misleading and harmful disinformation with dedicated fact checkers, particularly during election periods.

  • A new "Safety by Design" framework to help companies incorporate online safety features in new apps and platforms from the start.

  • A media literacy strategy to equip people with the knowledge to recognise and deal with a range of deceptive and malicious behaviours online, including catfishing, grooming and extremism.

The UK remains committed to a free, open and secure Internet. The regulator will have a legal duty to pay due regard to innovation, and to protect users' rights online, being particularly mindful to not infringe privacy and freedom of expression.

Recognising that the Internet can be a tremendous force for good, and that technology will be an integral part of any solution, the new plans have been designed to promote a culture of continuous improvement among companies. The new regime will ensure that online firms are incentivised to develop and share new technological solutions, like Google's "Family Link" and Apple's Screen Time app, rather than just complying with minimum requirements. Government has balanced the clear need for tough regulation with its ambition for the UK to be the best place in the world to start and grow a digital business, and the new regulatory framework will provide strong protection for our citizens while driving innovation by not placing an impossible burden on smaller companies.

 

 

Offsite Article: Why an internet regulator is a bad idea...


Link Here 20th March 2019
Full story: Online Harms White Paper...UK Government seeks to censor social media
We should be stripping away curbs on speech -- not adding more. By Andrew Tettenborn

See article from spiked-online.com

 

 

Six shooters...

Internet giants respond to impending government internet censorship laws with six principles that should be followed


Link Here 1st March 2019
Full story: Online Harms White Paper...UK Government seeks to censor social media
The world's biggest internet companies, including Facebook, Google and Twitter, are represented by a trade group called The Internet Association. This organisation has written to UK government ministers to outline how they believe harmful online activity should be regulated.

The letter has been sent to the culture, health and home secretaries. It will be seen as a pre-emptive move in the coming negotiation over new rules to govern the internet. The government is due to publish a delayed White Paper on online harms in the coming weeks.

The letter outlines six principles:

  • "Be targeted at specific harms, using a risk-based approach
  • "Provide flexibility to adapt to changing technologies, different services and evolving societal expectations
  • "Maintain the intermediary liability protections that enable the internet to deliver significant benefits for consumers, society and the economy
  • "Be technically possible to implement in practice
  • "Provide clarity and certainty for consumers, citizens and internet companies
  • "Recognise the distinction between public and private communication"

Many leading figures in the UK technology sector fear a lack of expertise in government, and hardening public sentiment against the excesses of the internet, will push the Online Harms paper in a more radical direction.

Three of the key areas of debate are the definition of online harm, the lack of liability for third-party content, and the difference between public and private communication.

The companies insist that government should recognise the distinction between clearly illegal content and content which is harmful but not illegal. But if, as these leading tech companies believe, the government's definition of harm is too broad, their insistence on a distinction between illegal and harmful content may be superseded by another set of problems.

The companies also defend the principle that platforms such as YouTube permit users to post and share information without fear that those platforms will be held liable for third-party content. Another area which will be of particular interest to the Home Office is the insistence that care should be taken to avoid regulation encroaching into the surveillance of private communications.

 

 

Putting Zuckerberg behind bars...

The Telegraph reports on the latest government thoughts about setting up a social media censor


Link Here 23rd February 2019
Full story: Online Harms White Paper...UK Government seeks to censor social media

Social media companies face criminal sanctions for failing to protect children from online harms, according to drafts of the Government's White Paper circulating in Whitehall.

Civil servants are proposing a new corporate offence as an option in the White Paper plans for a tough new censor with the power to force social media firms to take down illegal content and to police legal but harmful material.

They see criminal sanctions as desirable and as an important part of a regulatory regime, said one source, who added that there is a recognition, particularly on the Home Office side, that this needs to be a regulator with teeth. The main issue on which ministers need to be satisfied is extra-territoriality: can this be applied to non-UK companies like Facebook and YouTube? The belief is that it can.

The White Paper, which is due to be published mid-March followed by a summer consultation, is not expected to lay out as definitive a plan as previously thought. A decision on whether to create a brand new censor or use Ofcom is expected to be left open. A Whitehall source said:

Criminal sanctions are going to be put into the White Paper as an option. We are not necessarily saying we are going to do it but these are things that are open to us. They will be allied to a system of fines amounting to 4% of global turnover or Euros 20m, whichever is higher.

Government minister Jeremy Wright told the Telegraph this week he was especially focused on ensuring that technology companies enforce minimum age standards. He also indicated the Government would fulfil a manifesto commitment to a levy on social media firms, which could fund the new censor.

 

 

Wider definition of harm can be manipulated to restrict media freedom...

Index on Censorship responds to government plans to create a UK internet censor


Link Here 22nd February 2019
Full story: Online Harms White Paper...UK Government seeks to censor social media

Index on Censorship welcomes a report by the House of Commons Digital, Culture, Media and Sport select committee into disinformation and fake news that calls for greater transparency on social media companies' decision-making processes, on who posts political advertising and on the use of personal data. However, we remain concerned about attempts by government to establish systems that would regulate harmful content online, given that there remains no agreed definition of harm in this context beyond content that is already illegal.

A number of reports, including the government's Internet Safety Strategy green paper, have examined the issue over the past year, yet none has been able to come up with a definition of harmful content that goes beyond speech and expression that is already illegal. DCMS recognises this in its report when it quotes the Secretary of State Jeremy Wright discussing the difficulties surrounding the definition. Despite acknowledging this, the report's authors nevertheless expect technical experts to be able to set out what constitutes harmful content, to be overseen by an independent regulator.

International experience shows that in practice it is extremely difficult to define harmful content in such a way that would target only bad speech. Last year, for example, activists in Vietnam wrote an open letter to Facebook complaining that Facebook's system of automatically pulling content if enough people complained could silence human rights activists and citizen journalists in Vietnam , while Facebook has shut down the livestreams of people in the United States using the platform as a tool to document their experiences of police violence.

Index on Censorship chief executive Jodie Ginsberg said:

It is vital that any new system created for regulating social media protects freedom of expression, rather than introducing new restrictions on speech by the back door. We already have laws to deal with harassment, incitement to violence, and incitement to hatred. Even well-intentioned laws meant to tackle hateful views online often end up hurting the minority groups they are meant to protect, stifle public debate, and limit the public's ability to hold the powerful to account.

The select committee report provides the example of Germany as a country that has legislated against harmful content on tech platforms. However, it fails to mention that the German Network Enforcement Act legislated only against content that was already considered illegal, or the widespread criticism of the law from, among others, the UN rapporteur on freedom of expression and groups such as Human Rights Watch. It also cites the fact that one in six of Facebook's moderators now works in Germany as practical evidence that legislation can work. Ginsberg said:

The existence of more moderators is not evidence that the laws work. Evidence would be if more harmful content had been removed and if lawful speech flourished. Given that there is no effective mechanism for challenging decisions made by operators, it is impossible to tell how much lawful content is being removed in Germany. But the fact that Russia, Singapore and the Philippines have all cited the German law as a positive example of ways to restrict content online should give us pause.

Index has reported on various examples of the German law being applied incorrectly, including the removal of a tweet of journalist Martin Eimermacher criticising the double standards of tabloid newspaper Bild Zeitung and the blocking of the Twitter account of German satirical magazine Titanic. The Association of German Journalists (DJV) has said the Twitter move amounted to censorship, adding it had warned of this danger when the German law was drawn up.

Index is also concerned about the continued calls for tools to distinguish between quality journalism and unreliable sources, most recently in the Cairncross Review. While we recognise that the ability to do this as individuals and through education is key to democracy, we are worried that a reliance on a labelling system could create false positives, and mean that smaller or newer journalism outfits would find themselves rejected by the system.

 

 

Driving the internet into dark corners...

The IWF warns the government to think about unintended consequences when creating a UK internet censor


Link Here 22nd February 2019
Full story: Online Harms White Paper...UK Government seeks to censor social media

The Internet Watch Foundation's (IWF) CEO, Susie Hargreaves OBE, puts forward a voice of reason by urging politicians and policy makers to take a balanced approach to internet regulation which avoids a heavy cost to the victims of child sexual abuse.

IWF has set out its views on internet regulation ahead of the publication of the Government's Online Harms White Paper. It suggests that traditional approaches to regulation cannot apply to the internet and that human rights should play a big role in any regulatory approach.

The IWF, as part of the UK Safer Internet Centre, supports the Government's ambition to make the UK the safest place in the world to go online, and the best place to start a digital business.

IWF has a world-leading reputation in identifying and removing child sexual abuse images and videos from the internet. It takes a co-regulatory approach to combating child sexual abuse images and videos by working in partnership with the internet industry, law enforcement and governments around the world. It offers a suite of tools and services to the online industry to keep their networks safer. In the past 22 years, the internet watchdog has assessed -- with human eyes -- more than 1 million reports.

Ms Hargreaves said:

Tackling criminal child sexual abuse material requires a global multi-stakeholder effort. We'll use our 22 years' experience in this area to help the government and policy makers to shape a regulatory framework which is sustainable and puts victims at its heart. In order to do this, any regulation in this area should be developed with industry and other key stakeholders rather than imposed on them.

We recommend an outcomes-based approach where the outcomes are clearly defined and the government should provide clarity over the results it seeks in dealing with any harm. There also needs to be a process to monitor this and for any results to be transparently communicated.

But, warns Ms Hargreaves, any solutions should be tested with users, including an understanding of the impacts on victims: "The UK already leads the world at tackling online child sexual abuse images and videos but there is definitely more that can be done, particularly in relation to tackling grooming and livestreaming, and of course, regulating harmful content is important.

"My worries, however, are about rushing into knee-jerk regulation which creates perverse incentives or unintended consequences for victims and could undo all the successful work accomplished to date. Ultimately, we must avoid a heavy cost to victims of online sexual abuse."

 

 

Duty of care: an empty concept...

The Open Rights Group comments on government moves to create a social media censor


Link Here 9th February 2019
Full story: Online Harms White Paper...UK Government seeks to censor social media

There is every reason to believe that the government and opposition are moving to a consensus on introducing a duty of care for social media companies to reduce harm and risk to their users. This may be backed by an Internet regulator, who might decide what kind of mitigating actions are appropriate to address the risks to users on different platforms.

This idea originated from a series of papers by Will Perrin and Lorna Woods and has been mentioned most recently in a recent Science and Technology committee report and by NGOs including children's charity 5Rights.

A duty of care has some obvious merits: it could be based on objective risks, based on evidence, and ensure that mitigations are proportionate to those risks. It could take some of the politicisation out of the current debate.

However, it also has obvious problems. For a start, it focuses on risk rather than process. It moves attention away from the fact that interventions are regulating social media users just as much as platforms. It does not by itself tell us that free expression impacts will be considered, tracked or mitigated.

Furthermore, the lack of focus that a duty of care model gives to process means that platform decisions that have nothing to do with risky content are not necessarily supported by better decision-making, independent appeals and so on. Rather, as has happened with German regulation, processes can remain unaffected when they are outside a duty of care.

In practice, a lot of content which is disturbing or offensive is already banned on online platforms. Much of this would not be in scope under a duty of care but it is precisely these kinds of material which users often complain about, when it is either not removed when they want it gone, or is removed incorrectly. Any model of social media regulation needs to improve these issues, but a duty of care is unlikely to touch these problems.

There are very many questions about the kinds of risk, whether to individuals in general, to vulnerable groups, or to society at large, and about the evidence required to trigger action. The truth is that a duty of care, if cast sensibly and narrowly, will not satisfy many of the people who are demanding action; equally, if the threshold to act is low, then it will quickly be seen to be a mechanism for wide-scale Internet censorship.

It is also a simple fact that many decisions that platforms make about legal content which is not risky are not the business of government to regulate. This includes decisions about what legal content is promoted and why. For this reason, we believe that a better approach might be to require independent self-regulation of major platforms across all of their content decisions. This requirement could be a legislative one, but the regulator would need to be independent of government and platforms.

Independent self-regulation has not been truly tried. Instead, voluntary agreements have filled its place. We should be cautious about moving straight to government regulation of social media and social media users. The government refuses to regulate the press in this way because it doesn't wish to be seen to be controlling print media. It is pretty curious that neither the media nor the government are spelling out the risks of state regulation of the speech of millions of British citizens.

That we are in this place is of course largely the fault of the social media platforms themselves, who have failed to understand the need and value of transparent and accountable systems to ensure they are acting properly. That, however, just demonstrates the problem: politically weak platforms who have created monopoly positions based on data silos are now being sliced and diced at the policy table for their wider errors. It's imperative that as these government proposals progress we keep focus on the simple fact that it is end users whose speech will ultimately be regulated.

 

 

Offsite Article: A Lord Chamberlain for the internet?...


Link Here 8th February 2019
Full story: Online Harms White Paper...UK Government seeks to censor social media
Thanks, but no thanks. By Graham Smith

See article from cyberleagle.com

 

 

Updated: As always increased red tape benefits the largest (ie US) companies...

Daily Mail reports on government discussion about a new internet censor, codenamed Ofweb


Link Here 6th February 2019
Full story: Online Harms White Paper...UK Government seeks to censor social media
Wrangling in Whitehall has held up plans to set up a social media censor dubbed Ofweb, The Mail on Sunday reveals.

The Government was due to publish a White Paper this winter on censorship of tech giants, but the Mail on Sunday has learnt that it is still far from ready. Culture Secretary Jeremy Wright said it would be published within a month, but a Cabinet source said that timeline was wholly unrealistic. Other senior Government sources went further and said the policy document is unlikely to surface before the spring.

Key details on how a new censor would work have yet to be decided while funding from the Treasury has not yet been secured. Another problem is that some Ministers believe the proposed clampdown is too draconian and are preparing to try to block or water down the plan.

There are also concerns that technically difficult requirements would benefit the largest US companies as smaller European companies and start ups would not be able to afford the technology and development required.

The Mail on Sunday understands Jeremy Wright has postponed a visit to Facebook HQ in California to discuss the measures, as key details are still up in the air.

Update: The Conservatives don't have a monopoly on internet censorship...Labour agrees

6th February 2019. See article from ft.com

Labour has called for a new entity capable of taking on the likes of Facebook and Google. Tom Watson, the shadow digital secretary, will on Wednesday say a regulator should also have responsibility for competition policy and be able to refer cases to the Competition and Markets Authority.

According to Watson, any duty of care would only be effective with penalties that seriously affect companies' bottom lines. He has referred to regulators' ability to fine companies up to 4% of global turnover, or euro 20m, whichever is higher, for worst-case breaches of the EU-wide General Data Protection Regulation.

 

 

Policing the wild west...

Status report on the government's plans to introduce an internet censor for social media


Link Here 30th January 2019
Full story: Online Harms White Paper...UK Government seeks to censor social media
The UK government is rushing to finalise a draft internet censorship law particularly targeting social media, but key details of the proposal have yet to be settled amid concerns about stifling innovation.

Government officials have been meeting with industry players, MPs, peers and other groups over the past month as they try to finalise their proposals.

People involved in those discussions said there is now broad agreement about the need to impose a new duty of care on big tech companies, as well as the need to back up their terms and conditions with the force of law.

A white paper is due to be published by the end of winter. But the Department for Digital, Culture, Media and Sport, which is partly responsible for writing up the new rules alongside the Home Office, is still deliberating over key aspects with just weeks to go until the government said it would unveil an outline of its proposals.

Among the sticking points are worries that regulation could stifle innovation in one of the U.K. economy's most thriving sectors and concerns over whether it can keep pace with rapid technological change. Another is ensuring sufficient political support to pass the law despite likely opposition from parts of the Conservative Party. A third is deciding what regulatory agency would ultimately be responsible for enforcing the so-called Internet Safety Law.

A major unresolved question is what censorship body will be in charge of enforcing laws that could expose big tech companies to greater liability for hosted content, a prospect that firms including Google and Facebook have fought at the European level.

Several people who spoke to POLITICO said the government does not appear to have settled on who would be the censor. The communications regulator Ofcom is very much in the mix, although there are concerns that Ofcom is already getting too big.




 
