Civil society organisations urge UK to protect global digital security and safeguard private communication

28th June 2023

See article from openrightsgroup.org
To: Chloe Smith, Secretary of State, Department for Science, Innovation and Technology
cc: Tom Tugendhat, Minister of State for Security, Home Office; Paul Scully, Minister for Tech and the Digital Economy; Lord Parkinson of Whitley Bay

Dear Ms Smith, We are over 80 national and international civil society organisations, academics and cyber experts. We represent a wide range of perspectives including digital human rights and technology.
We are writing to you to raise our concerns about the serious threat to the security of private and encrypted messaging posed by the UK's proposed Online Safety Bill (OSB). The Online Safety Bill is a
deeply troubling legislative proposal. If passed in its present form, the UK could become the first liberal democracy to require the routine scanning of people's private chat messages, including chats that are secured by end-to-end encryption. As over 40
million UK citizens and 2 billion people worldwide rely on these services, this poses a significant risk to the security of digital communication services not only in the UK, but also internationally. End-to-end encryption ensures
the security of communications for everyone on a network. It is designed so that no-one, including the platform provider, can read or alter the messages. The confidentiality between sender and recipient is completely preserved. That's why the United
Nations, several human rights groups, and anti-human trafficking organisations alike have emphasised that encryption is a vital human rights tool. In order to comply with the Online Safety Bill, platform providers would have to
break that protection either by removing it or by developing work-arounds. Any form of work-around risks compromising the security of the messaging platform, creating back-doors, and other dangerous ways and means for malicious actors and hostile states
to corrupt the system. This would put all users in danger. The UK government has indicated its intention for providers to use a technology that would scan chats on people's phones and devices -- known as client-side scanning. The UK government's assertion that client-side scanning will not compromise the privacy of messages contradicts the significant evidence of cyber-security experts around the world. This software intercepts chat messages before they are encrypted, as the user is composing or uploading their images or text, so the confidentiality of messages cannot be guaranteed. It would most likely breach human rights law in the UK and internationally. Serious concerns have also been raised about
similar provisions in the EU's proposed Child Sexual Abuse Regulation, which an independent expert study warns is in contradiction to human rights rules. French, Irish and Austrian parliamentarians have all also warned of severe threats to human rights
and of undermining encryption. Moreover, the scanning software would have to be pre-installed on people's phones, without their permission or full awareness of the severe privacy and security implications. The underlying databases
can be corrupted by hostile actors, meaning that individual phones would become vulnerable to attack. The breadth of the measures proposed in the Online Safety Bill -- which would infringe the privacy rights of the internet's overwhelmingly law-abiding majority of users to the same extent as those of potential criminals -- means that the measures cannot be considered either necessary or proportionate. The inconvenient truth is that it is not possible to scan messages for
bad things without infringing on the privacy of lawful messages. It is not possible to create a backdoor that only works for good people and that cannot be exploited by bad people. Privacy and free expression rights are vital for
all citizens everywhere, in every country, to do their jobs, raise their voices, and hold power to account without arbitrary intrusion, persecution or repression. End-to-end encryption provides vital security that allows them to do that without arbitrary
interference. People in conflict zones rely on secure encrypted communications to speak safely to friends and family, and for reasons of national security. Journalists around the world rely on the confidential channels of encrypted chat to communicate with sources and file their stories in safety. Children, too, need these rights, as emphasised by UNICEF based on the UN Convention on the Rights of the Child. Child safety and privacy are not mutually exclusive;
they are mutually reinforcing. Indeed, children are less safe without encrypted communications, as they equally rely on secure digital experiences free from their data being harvested or conversations intercepted. Online content scanning alone cannot
hope to fish out the serious cases of exploitation, which require a whole-of-society approach. The UK government must invest in education, judicial reform, social services, law enforcement and other critical resources to prevent abuse before it can reach
the point of online dissemination, thereby prioritising harm prevention over retrospective scanning. As an international community, we are deeply concerned that the UK will become the weak link in the global system. The security
risk will not be confined within UK borders. It is difficult to envisage how such a destructive step for the security of billions of users could be justified. The UK Prime Minister, Rishi Sunak, has said that the UK will maintain
freedom, peace and security around the world. With that in mind, we urge you to ensure that end-to-end encrypted services will be removed from the scope of the Bill and that the privacy of people's confidential communications will be upheld.
Signed, Access Now, ARTICLE 19: Global Campaign for Free Expression, Asociația pentru Tehnologie și Internet (ApTI), Associação Portuguesa para a Promoção da Segurança da Informação (AP2SI), Association for
Progressive Communications (APC), Big Brother Watch, Centre for Democracy and Technology, Chaos Computer Club (CCC), Citizen D / Državljan D, Collaboration on International ICT Policy for East and Southern Africa (CIPESA), Community NeHUBs Africa,
cyberstorm.mu, Defend Digital Me, CASM at Demos, Digitalcourage, Digitale Gesellschaft, DNS Africa Media and Communications, Electronic Frontier Finland, Electronic Frontier Foundation (EFF), Electronic Frontier Norway, Epicenter.works, European Center
for Not-for-Profit Law, European Digital Rights (EDRi), European Sex Workers Rights Association (ESWA), Fair Vote, Fight for the Future, Foundation for Information Policy Research, Fundación Cibervoluntarios, Global Partners Digital, Granitt, Hermes
Center for Transparency and Digital Human Rights, Homo Digitalis, Ikigai Innovation Initiative, Internet Society, Interpeer gUG, ISOC Brazil -- Brazilian Chapter of the Internet Society, ISOC Ghana, ISOC India Hyderabad Chapter, ISOC Venezuela, IT-Pol,
JCA-Net (Japan), Kijiji Yeetu, La Quadrature du Net, Liberty, McEvedys Solicitors and Attorneys Ltd, Open Rights Group, OpenMedia, OPTF, Privacy and Access Council of Canada, Privacy International, Ranking Digital Rights, Statewatch, SUPERRR Lab, Tech
for Good Asia, UBUNTEAM, Wikimedia Foundation, Wikimedia UK Professor Paul Bernal, Nicholas Bohm, Dr Duncan Campbell, Alan Cox, Ray Corrigan, Professor Angela Daly, Dr Erin Ferguson, Wendy M. Grossman, Dr Edina Harbinja, Dr Julian
Huppert, Steve Karmeinsky, Dr Konstantinos Komaitis, Professor Douwe Korff, Petr Kucera, Mark A. Lane, Christian de Larrinaga, Mark Lizar, Dr Brenda McPhail, Alec Muffett, Riana Pfefferkorn, Simon Phipps, Dr Birgit Schippers, Peter Wells, Professor Alan
Woodward
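The client-side scanning mechanism the letter objects to can be illustrated with a short toy sketch. Everything here is illustrative: the blocklist contents and function names are invented, and real deployments match perceptual hashes (such as PhotoDNA) against image content rather than exact SHA-256 digests. The point the sketch makes is structural: the check runs on the plaintext, on the user's device, before any encryption happens, which is why the letter argues confidentiality cannot be guaranteed.

```python
import hashlib

# Hypothetical hash blocklist. Real systems use perceptual hashes of known
# illegal images; exact-match SHA-256 is used here only to keep the toy runnable.
BLOCKLIST = {hashlib.sha256(b"known-bad-image-bytes").hexdigest()}

def client_side_scan(message: bytes) -> bool:
    """Return True if the plaintext message matches the blocklist.

    Note: this runs on the sender's device *before* encryption, so the
    plaintext is inspected even on an end-to-end encrypted service.
    """
    return hashlib.sha256(message).hexdigest() in BLOCKLIST

def send(message: bytes) -> str:
    if client_side_scan(message):
        return "flagged-before-encryption"  # plaintext was examined and reported
    # Only now would the client encrypt and hand ciphertext to the server.
    return "encrypted-and-sent"

print(send(b"hello"))                  # encrypted-and-sent
print(send(b"known-bad-image-bytes"))  # flagged-before-encryption
```

The letter's database-corruption concern also follows from this structure: whoever controls `BLOCKLIST` controls what every phone silently reports, and the user cannot audit its contents.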
Industry proposed guidelines rejected as they don't allow the state to snoop on messages (along with Russians, Chinese, hackers, scammers and thieves)

20th June 2023

See press release from esafety.gov.au
Australia's eSafety Commissioner has made the decision not to register two of eight online censorship codes drafted by the online industry as they fail to provide appropriate mechanisms to deal with illegal and harmful content online. New mandatory
codes will cover five sections of the online industry and operate under Australia's Online Safety Act 2021. The codes require industry to take adequate steps to reduce the availability of seriously harmful online content, such as child sexual abuse and
pro-terror material. eSafety's decision not to register the Designated Internet Services (DIS) code, covering apps, websites, and file and photo storage services like Apple iCloud and Microsoft OneDrive; and the Relevant Electronic Services (RES) code, covering dating sites, online games and instant messaging, is due to the failure of the codes to define appropriate snooping/surveillance mechanisms, which is a requirement for registration. eSafety will now move to develop mandatory and
enforceable industry standards for Relevant Electronic Services and Designated Internet Services. The eSafety Commissioner has reserved her decision on a third code, the draft Search Engines code covering online search, over concerns it is no longer fit for purpose following recently announced developments in the field of generative AI and its integration into search engine functions. eSafety has requested that a revised Search Engines code be submitted within four weeks to address specific concerns it has raised. eSafety Commissioner Julie Inman Grant said: While I commend industry for their significant amendments following our final feedback on these world-first codes in February, these two
codes still don't meet our minimum expectations. For example, the Designated Internet Services code still doesn't require file and photo storage services like iCloud, Google Drive, or OneDrive to detect and flag known child sexual
abuse material. We know that online storage services like these are used to store and share child sexual abuse material and pro-terror material between offenders. And the Relevant Electronic Services code
also doesn't require email services and some partially encrypted messaging services to detect and flag this material either, even though we know there are proactive steps they can take to stem the already rampant sharing of illegal content.
Industry codes will come into effect six months from the date of registration while eSafety will begin the process of drafting industry standards for Designated Internet Services and Relevant Electronic Services. Once a code or
standard is in place, eSafety will be able to receive complaints and investigate potential breaches. An industry code or standard will be backed up by powers to ensure compliance including injunctions, enforceable undertakings, and maximum financial
penalties of nearly $700,000 per day for continuing breaches. The draft industry censorship codes submitted to eSafety on 31 March can be found at onlinesafety.org.au/codes.

The USA takes the lead from the UK Online 'Safety' Bill with its own 1984 snooping bill

22nd April 2023

See Creative Commons article from eff.org by Joe Mullin
The EARN IT Bill Is Seeking To Scan Our Messages and Photos. In a free society, people should not have their private correspondence constantly examined. U.S. lawmakers, we would hope, understand that individuals have the
right to a private conversation without the government looking over their shoulder. So it's dismaying to see a group of U.S. Senators attempting for a third time to pass
the EARN IT Act (S. 1207)--a law that could lead to suspicionless scans of every online message, photo, and hosted file. In the
name of fighting crime, the EARN IT Act treats all internet users like we should be in a permanent criminal lineup, under suspicion for child abuse. What The New "EARN IT" Does The EARN IT
Act creates an unelected government commission, stacks it with law enforcement personnel, and then tasks it with creating "best practices" for running an internet website or app. The act then removes nearly 30-year-old legal protections for
users and website owners, allowing state legislatures to encourage civil lawsuits and prosecutions against those who don't follow the government's "best practices." As long as they somehow tie changes in law to child
sexual abuse, state lawmakers will be able to avoid longstanding legal protections, and pass new rules that allow for criminal prosecutions and civil lawsuits against websites that don't give police special access to user messages and photos. Websites
and apps that use end-to-end encryption to protect user privacy will be pressured to remove or compromise the security of their services, or they'll face prosecutions and lawsuits. If EARN IT passes, we're likely to see state
lawmakers step in and mandate scanning of messages and other files similar to the plan that Apple wisely walked away from
last year. There's no doubt the sponsors intend this bill to scan user messages, photos, and files, and they wrote it with that goal in mind. They even suggested specific scanning software that could be used on users in a
document published last year. The bill also makes specific allowance for a provider's use of encryption to be treated as evidence against it in court. Bill Language Purporting To Protect Encryption Doesn't Do The Job Under pressure, the bill sponsors did add language that purports to protect encryption. But once you take a closer
look, it's a shell game. The bill clearly leaves room to impose forms of "client-side scanning," which is a method of violating user privacy by sending data to law enforcement straight from user devices, before a message is encrypted. EFF has
long held that client-side scanning violates the privacy promise of end-to-end encryption, even though it allows the encryption process to proceed in a narrow, limited sense. A 2021 paper by 10 leading technologists held that client-side scanners are a danger to democracy, amounting to "bugs in our pockets." The Chat-Scanning Software Being Pushed By This Bill Doesn't Work But the available evidence
shows that scanning software that looks for Child Sexual Abuse Material, or CSAM, is far from perfect. Creators of scanning software say their tools can't be fully audited, for legal and ethical reasons. But here's the evidence so far:
Last year, a New York Times story showed how Google's CSAM scanners
falsely accused two fathers of sending child pornography. Even after the dads were explicitly cleared by
police, Google kept their accounts shut down. Data being sent to cops by the U.S. National Center for Missing and Exploited Children (NCMEC)--the government agency that will be tasked with analyzing vastly more user data if
EARN IT passes--is far from accurate. In 2020, the Irish police received 4,192 reports from NCMEC. Of those, only 852 (20.3%) were
confirmed as actual CSAM. Only 9.7% of the reports were deemed to be "actionable." A Facebook study found that
75% of the messages flagged by its scanning system to detect child abuse material were not
"malicious," and included messages like bad jokes and memes. LinkedIn reported 75 cases of suspected CSAM to EU authorities in 2021. After manual review,
only 31 of those cases --about 41%--involved confirmed CSAM.
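The accuracy figures quoted above can be checked with a few lines of arithmetic (a sketch using only the numbers reported in the article; the variable and function names are ours):

```python
# Figures quoted in the article; precision = confirmed genuine / total reported.
irish_reports_2020 = 4192   # NCMEC reports forwarded to Irish police in 2020
irish_confirmed = 852       # of those, confirmed as actual CSAM

linkedin_reports_2021 = 75  # suspected CSAM reports to EU authorities in 2021
linkedin_confirmed = 31     # confirmed after manual review

def precision_pct(confirmed: int, reported: int) -> float:
    """Share of reports that turned out to be genuine, as a percentage."""
    return round(100 * confirmed / reported, 1)

print(precision_pct(irish_confirmed, irish_reports_2020))        # 20.3
print(precision_pct(linkedin_confirmed, linkedin_reports_2021))  # 41.3
```

Even on the better of these two figures, the pipelines produced more false reports than genuine ones, which is the arithmetic behind the false-accusation concern in the next paragraph.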
The idea of subjecting millions of people to false accusations of child abuse is horrific. NCMEC will export those false accusations to vulnerable communities around the world, where they can be wielded by police forces that have even
less accountability than law enforcement in the United States. False accusations are a price that EARN IT supporters seem willing to pay. We need your support to stop the EARN IT Act one more time. Digital rights supporters sent
more than 200,000 messages to Congress to kill earlier versions of this bill. We've beaten it twice before, and we can do it again. There are currently dangerous proposals that could mandate client-side scanning schemes in the
U.K. and
European Union, as well. But we don't need to resign ourselves to a world of constant surveillance. In democratic
nations, supporters of a free, secure, and private internet can win--if we speak up now.
The police will continue to abuse their power by databasing 'wrong think' opinions as if they were hate crimes

17th April 2023

See article from reclaimthenet.org
The state has set up a databasing system to record criminal transgressions of people in Britain. But the police have taken it upon themselves to unilaterally use this system to record non-criminal incidents where people have been accused of transgressing against woke censorship rules, eg by criticising transgender dogma. Worse still, these transgressions are recorded merely on claims by the easily offended and are not necessarily investigated by the police to ensure veracity. Hence the claims can easily be used by people to settle scores or further grudges. Such unverified complaints turn up on official records checks when people are vetted for a job such as teaching. The courts have criticised the police recording of these non-crime hate incidents (NCHIs) as unlawful, and the government has also chipped in with guidelines for NCHIs to try and prevent abuse. However the police are having none of it, and are continuing with their use, more or less ignoring the court and government criticism. The College of Policing is a taxpayer-funded quango that provides national advice to forces. In response to the court and government criticism it has been required to update its own manual for officers on how to record NCHIs, in a document called authorised professional practice (APP). But it has been accused of deploying an "Orwellian" and "woke spin", and has decided to ignore government instructions in its new draft. In the Home Office code, there are 11
scenarios provided where officers should or should not record an NCHI, 63% of which advise explicitly not to record one. However, the college's new guidance has only eight scenarios, all different to the Home Office ones, just 12.5% of which advise
explicitly not to record. Seven out of eight of these were in the college's old guidance. This was found in 2021 to be unlawful in the Court of Appeal and to disproportionately interfere with free expression in its section on how police should record
incidents. It followed a High Court victory for Harry Miller, a former constable who successfully sued after Humberside Police officers visited his workplace and recorded an NCHI because of a "transphobic" limerick he shared on Twitter. Miller, founder of Fair Cop, a group scrutinising police political correctness, told The Telegraph:
The police will not be schooled in the Home Office guidance once the APP comes out, they will be schooled in the guidance given by the College of Policing, which will mean we are exactly where we were before.
The College of Policing has taken an overtly political stance. The Home Office's examples were all very sensible and corrected the previous mistakes, but they will once again be shelved for the approved ideology of the College of
Policing.
Sir John Hayes, chairman of the Common Sense Group of 60 Tory MPs, said the College of Policing has long been a cause for public concern and called to clear out some of the bad apples that are there before they affect the
whole of policing's reputation: Policing only works by consent and the nonsense we hear from the College of Policing risks that consent, so this is another example of politically correct nonsense, perpetuating the
appalling practice of arresting people for what they believe and think. To say it is Orwellian is an understatement; I think George Orwell will be spinning in his grave.