16th December 2023
Linking encryption so closely to the protection of children suggests that the plans to raise the minimum age at which users can access social networks are a response to companies' defiance over encrypted messages.
See article from theguardian.com
The Online 'Safety' Bill is not the only threat to British people's internet privacy and security
20th July 2023
See article from bbc.co.uk
Apple says it will remove services such as FaceTime and iMessage from the UK rather than weaken security if new UK government proposals are made law and acted upon. The government is seeking to update the Investigatory Powers Act (IPA) 2016. It wants
messaging services to clear security features with the Home Office before releasing them to customers. The act lets the Home Office demand security features are disabled, without telling the public. Under the update, this would have to be immediate.
Currently, there has to be a review; there can also be an independent oversight process, and a technology company can appeal before taking any action. WhatsApp and Signal are among the platforms to have opposed a clause in the Online Safety Bill
allowing the communications regulator to require companies to install technology to scan for child-abuse material in encrypted messaging apps and other services. The government has opened an eight-week consultation on the proposed amendments to
the IPA, which already enables the storage of internet browsing records for 12 months and authorises the bulk collection of personal data. Apple has made a nine-page submission to the current consultation opposing the snooping proposal:
It would not make changes to security features specifically for one country that would weaken a product for all users. Some changes would require issuing a software update, so could not be made secretly. The proposals constitute a serious and direct threat to data security and information privacy that would affect people outside the UK.
Civil society organisations urge UK to protect global digital security and safeguard private communication
28th June 2023
See article from openrightsgroup.org
To: Chloe Smith, Secretary of State, Department for Science, Innovation and Technology
cc: Tom Tugendhat, Minister of State for Security, Home Office; Paul Scully, Minister for Tech and the Digital Economy; Lord Parkinson of Whitley Bay

Dear Ms Smith,

We are over 80 national and international civil society organisations, academics and cyber experts. We represent a wide range of perspectives including digital human rights and technology.
We are writing to you to raise our concerns about the serious threat to the security of private and encrypted messaging posed by the UK's proposed Online Safety Bill (OSB). The Online Safety Bill is a
deeply troubling legislative proposal. If passed in its present form, the UK could become the first liberal democracy to require the routine scanning of people's private chat messages, including chats that are secured by end-to-end encryption. As over 40
million UK citizens and 2 billion people worldwide rely on these services, this poses a significant risk to the security of digital communication services not only in the UK, but also internationally. End-to-end encryption ensures
the security of communications for everyone on a network. It is designed so that no-one, including the platform provider, can read or alter the messages. The confidentiality between sender and recipient is completely preserved. That's why the United
Nations, several human rights groups, and anti-human trafficking organisations alike have emphasised that encryption is a vital human rights tool. In order to comply with the Online Safety Bill, platform providers would have to
break that protection either by removing it or by developing work-arounds. Any form of work-around risks compromising the security of the messaging platform, creating back-doors, and other dangerous ways and means for malicious actors and hostile states
to corrupt the system. This would put all users in danger. The UK government has indicated its intention for providers to use a technology that would scan chats on people's phones and devices -- known as client-side scanning. The
UK government's assertion that client-side scanning will not compromise the privacy of messages contradicts the significant evidence of cyber-security experts around the world. This software intercepts chat messages before they are encrypted, as the user is uploading their images or text, and therefore the confidentiality of messages cannot be guaranteed. It would most likely breach human rights law in the UK and internationally. Serious concerns have also been raised about
similar provisions in the EU's proposed Child Sexual Abuse Regulation, which an independent expert study warns is in contradiction to human rights rules. French, Irish and Austrian parliamentarians have all also warned of severe threats to human rights
and of undermining encryption. Moreover, the scanning software would have to be pre-installed on people's phones, without their permission or full awareness of the severe privacy and security implications. The underlying databases
can be corrupted by hostile actors, meaning that individual phones would become vulnerable to attack. The breadth of the measures proposed in the Online Safety Bill -- which would infringe the rights to privacy to the same extent for the internet's
majority of legitimate law-abiding users as it would for potential criminals -- means that the measures cannot be considered either necessary or proportionate. The inconvenient truth is that it is not possible to scan messages for
bad things without infringing on the privacy of lawful messages. It is not possible to create a backdoor that only works for good people and that cannot be exploited by bad people. Privacy and free expression rights are vital for
all citizens everywhere, in every country, to do their jobs, raise their voices, and hold power to account without arbitrary intrusion, persecution or repression. End-to-end encryption provides vital security that allows them to do that without arbitrary
interference. People in conflict zones rely on secure encrypted communications to speak safely with friends and family, as well as for national security. Journalists around the world rely on the confidential channels of encrypted chat to communicate with sources and upload their stories in safety. Children, too, need these rights, as emphasised by UNICEF based on the UN Convention on the Rights of the Child. Child safety and privacy are not mutually exclusive;
they are mutually reinforcing. Indeed, children are less safe without encrypted communications, as they equally rely on secure digital experiences free from their data being harvested or conversations intercepted. Online content scanning alone cannot
hope to fish out the serious cases of exploitation, which require a whole-of-society approach. The UK government must invest in education, judicial reform, social services, law enforcement and other critical resources to prevent abuse before it can reach
the point of online dissemination, thereby prioritising harm prevention over retrospective scanning. As an international community, we are deeply concerned that the UK will become the weak link in the global system. The security
risk will not be confined within UK borders. It is difficult to envisage how such a destructive step for the security of billions of users could be justified. The UK Prime Minister, Rishi Sunak, has said that the UK will maintain
freedom, peace and security around the world. With that in mind, we urge you to ensure that end-to-end encrypted services will be removed from the scope of the Bill and that the privacy of people's confidential communications will be upheld.
Signed, Access Now, ARTICLE 19: Global Campaign for Free Expression, Asociația pentru Tehnologie și Internet (ApTI), Associação Portuguesa para a Promoção da Segurança da Informação (AP2SI), Association for
Progressive Communications (APC), Big Brother Watch, Centre for Democracy and Technology, Chaos Computer Club (CCC), Citizen D / Drzavljan D, Collaboration on International ICT Policy for East and Southern Africa (CIPESA), Community NeHUBs Africa,
cyberstorm.mu, Defend Digital Me, CASM at Demos, Digitalcourage, Digitale Gesellschaft, DNS Africa Media and Communications, Electronic Frontier Finland, Electronic Frontier Foundation (EFF), Electronic Frontier Norway, Epicenter.works, European Center
for Not-for-Profit Law, European Digital Rights (EDRi), European Sex Workers Rights Association (ESWA), Fair Vote, Fight for the Future, Foundation for Information Policy Research, Fundación Cibervoluntarios, Global Partners Digital, Granitt, Hermes
Center for Transparency and Digital Human Rights, Homo Digitalis, Ikigai Innovation Initiative, Internet Society, Interpeer gUG, ISOC Brazil -- Brazilian Chapter of the Internet Society, ISOC Ghana, ISOC India Hyderabad Chapter, ISOC Venezuela, IT-Pol,
JCA-Net (Japan), Kijiji Yeetu, La Quadrature du Net, Liberty, McEvedys Solicitors and Attorneys Ltd, Open Rights Group, OpenMedia, OPTF, Privacy and Access Council of Canada, Privacy International, Ranking Digital Rights, Statewatch, SUPERRR Lab, Tech
for Good Asia, UBUNTEAM, Wikimedia Foundation, Wikimedia UK Professor Paul Bernal, Nicholas Bohm, Dr Duncan Campbell, Alan Cox, Ray Corrigan, Professor Angela Daly, Dr Erin Ferguson, Wendy M. Grossman, Dr Edina Harbinja, Dr Julian
Huppert, Steve Karmeinsky, Dr Konstantinos Komaitis, Professor Douwe Korff, Petr Kucera, Mark A. Lane, Christian de Larrinaga, Mark Lizar, Dr Brenda McPhail, Alec Muffett, Riana Pfefferkorn, Simon Phipps, Dr Birgit Schippers, Peter Wells, Professor Alan
Woodward
WhatsApp would rather be blocked in Britain than submit to UK demands for encryption backdoors
31st July 2022
See article from bbc.co.uk
The boss of WhatsApp says it will not lower the security of its messenger service. Will Cathcart told the BBC that, if asked by the government to weaken encryption, it would be very foolish to accept. We continue to work with the tech sector to support the development of innovative technologies that protect public safety without compromising on privacy. End-to-end encryption (E2EE) provides the most robust level of security because, by design, only the intended recipient holds the key to decrypt the message, which is essential for private communication. The technology underpins the online exchanges on apps including WhatsApp and Signal and, optionally, on Facebook Messenger and Telegram. Only the sender and receiver can read those messages, not law enforcement or the technology giants. The UK government wants software on people's phones to scan messages for banned material before they are encrypted. Cathcart explained:
Client-side scanning cannot work in practice. Because millions of people use WhatsApp to communicate across the world, it needs to maintain the same standards of privacy across every country. If
we had to lower security for the world, to accommodate the requirement in one country, that...would be very foolish for us to accept, making our product less desirable to 98% of our users because of the requirements from 2%. What's being proposed is that we - either directly or indirectly through software - read everyone's messages. I don't think people want that.
Ella Jakubowska, policy adviser at campaign group European Digital Rights, said: Client-side scanning is almost like putting spyware on every person's phone. It also creates a backdoor for malicious actors
to have a way in to be able to see your messages.
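The end-to-end principle described above -- only the two endpoints ever hold the key -- can be sketched with a toy Diffie-Hellman exchange in Python. This is purely illustrative: the tiny 32-bit prime and XOR "cipher" below stand in for the real thing, whereas WhatsApp's Signal-protocol encryption uses elliptic-curve keys and authenticated ciphers.

```python
import hashlib
import secrets

# Toy Diffie-Hellman parameters: a 32-bit prime for illustration only.
# Real systems use 2048-bit+ groups (RFC 3526) or elliptic curves.
P = 4294967291  # 2**32 - 5, prime
G = 5

def keypair():
    priv = secrets.randbelow(P - 2) + 1   # secret, never leaves the device
    return priv, pow(G, priv, P)          # public value, safe to relay

# Each endpoint generates a keypair; only the PUBLIC halves cross the
# network, so the relaying platform never sees either private key.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()

# Both endpoints derive the same shared secret; an observer holding
# only a_pub and b_pub cannot (that is the discrete-log problem).
a_shared = pow(b_pub, a_priv, P)
b_shared = pow(a_pub, b_priv, P)
assert a_shared == b_shared

key = hashlib.sha256(str(a_shared).encode()).digest()

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy stream 'cipher'; real E2EE uses authenticated ciphers (AES-GCM).
    stream = (key * (len(data) // len(key) + 1))[:len(data)]
    return bytes(x ^ y for x, y in zip(data, stream))

ciphertext = xor_cipher(b"meet at noon", key)
assert xor_cipher(ciphertext, key) == b"meet at noon"  # only key holders can read
```

Note that client-side scanning does not break this mathematics; it sidesteps it by reading the message on the device before the encryption step ever runs.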
22nd July 2022
GCHQ boss calls for snooping into people's phones as a backdoor to strong encryption
See article from theregister.com
The Home Office sets up a propaganda website opposing end-to-end encryption while presenting it as a grass-roots campaign group
18th January 2022
See article from rollingstone.com
See also noplacetohide.org.uk
The UK Government believes that British people should sacrifice protection against internet scammers, spammers and thieves in the name of being able to scan people's messages looking for child porn. It is a little like asking people not to use door locks so that the police can always drop in to people's homes to check for child abuse. Now it seems that the government is going to extremes to advance its beliefs by setting up a fake campaign website to pretend that people are calling for the removal of the basic internet security of end-to-end encryption used in several messaging apps. Rolling Stone magazine has revealed: The Home Office has hired the M&C Saatchi advertising
agency -- a spin-off of Saatchi and Saatchi, which made the Labour Isn't Working election posters, among the most famous in UK political history -- to plan the campaign, using public funds. A Home Office spokesperson said in a
statement: We have engaged M&C Saatchi to bring together the many organisations who share our concerns about the impact end-to-end encryption would have on our ability to keep children safe.
In response to a Freedom of Information request about an upcoming ad campaign directed at Facebook's end-to-end encryption proposal, the Home Office disclosed that, under current plans, £534,000 is allocated for this campaign.
Offsite Comment: Why we need End-to-End Encryption ... and why it's essential for our safety, our children's safety, and for everyone's future. 18th January 2022. See article from alecmuffett.com by Alec Muffett
Open Rights Group explains how the Online 'Safety' Bill will endanger internet users
11th December 2021
See Creative Commons article from openrightsgroup.org by Robert Sharp
Of the many worrying provisions contained within the draft Online Safety Bill, perhaps the most consequential is contained within Chapter 4, at clauses 63-69. This section of the Bill hands OFCOM the power to issue "Use of Technology Notices"
to search engines and social media companies. As worded, the powers will lead to the introduction of routine and perpetual surveillance of our online communications. They also threaten to fatally undermine the use of end-to-end encryption, one of the
fundamental building blocks of digital technology and commerce. Use of Technology Notices purport to tackle terrorist propaganda and Child Sexual Exploitation and Abuse (CSEA) content. OFCOM will issue a Notice based on the
"prevalence" and "persistent presence" of such illegal content on a service. The phrases "perpetual" and "persistent" recur throughout the Bill but remain undefined, so the threshold for interference could be quite
low. Any company that receives a Notice will be forced to use certain "accredited technologies" to identify terrorist and CSEA content on the platform. The phrase "accredited
technologies" is wide-ranging. The Online Safety Bill defines it as technology that meets a "minimum standard" for successfully identifying illegal content, although it is currently unclear what that minimum standard may be.
The definition is silent on what techniques an accredited technology might deploy to achieve the minimum standard. So it could take the form of an AI that classifies images and text. Or it may be a system that compares all the content
uploaded to the hashes of known CSEA images logged on the Home Office's Child Abuse Image Database (CAID) and other such collections. Whatever the precise technique used, identifying terrorist or CSEA content must involve
scanning each user's content as it is posted, or soon after. Content that a bot decides is related to terrorism or child abuse will be flagged and removed immediately. Social media services are public platforms, and so it cannot
be said that scanning the content we post to our timelines amounts to an invasion of privacy -- even when we post to a locked account or a closed group, we are still "publishing" to someone. Indeed, search engines have been scanning our content
(albeit at their own pace) for many years, and YouTube users will be familiar with the way the platform recognises and monetises any copyrighted content. It is nevertheless disconcerting to know that an automated pre-publication
censor will examine everything we publish. It will chill freedom of expression in itself, and also lead to unnecessary automated takedowns when the system makes a mistake. Social media users routinely experience the problem of over-zealous bots causing
the removal of public domain content, which impinges on free speech and damages livelihoods. However, the greater worry is that these measures will not be limited to content posted only to public (or semi-public) feeds. The
Interpretation section of the Bill (clause 137) defines "content" as "anything communicated by means of an internet service, whether publicly or privately ..."(emphasis added). So the Use of Technology Notices will apply to
direct messaging services too. This power presents two significant threats to civil liberties and digital rights. The first is that once an "accredited technology" is deployed on a platform,
it need not be limited to checking only for terrorism or child porn. Other criminal activity may eventually be added to the list through a simple amendment to the relevant section of the Act, ratcheting up the extent of the surveillance.
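The hash-matching technique discussed above -- comparing uploaded content against hashes of known material such as the CAID collection -- can be sketched in a few lines of Python. This is a hypothetical illustration using exact SHA-256 matching; deployed systems use perceptual hashes (e.g. PhotoDNA) that tolerate resizing and re-encoding, which brings its own false-match risks.

```python
import hashlib

# Hypothetical blocklist: a real system would hold millions of hashes
# of known illegal images (e.g. from the Home Office's CAID database);
# here a single stand-in entry serves for illustration.
BLOCKLIST = {hashlib.sha256(b"known-bad-image-bytes").hexdigest()}

def scan_before_send(content: bytes) -> bool:
    """Client-side scan: hash the content on the user's own device and
    check it against the blocklist BEFORE it is encrypted and sent.
    Returns True if the content is flagged for removal/reporting."""
    return hashlib.sha256(content).hexdigest() in BLOCKLIST

assert scan_before_send(b"known-bad-image-bytes") is True
assert scan_before_send(b"holiday photo") is False
# A single changed byte defeats an exact hash -- one reason deployed
# systems use fuzzy perceptual hashing instead, at the cost of mistakes.
assert scan_before_send(b"known-bad-image-bytes!") is False
```

The sketch also makes the scope problem concrete: nothing in `scan_before_send` cares what kind of material the blocklist contains, so extending the list to other categories of content is a one-line change.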
Meanwhile, other Governments around the world will take inspiration from OFCOM's powers to implement their own scanning regime, perhaps demanding that the social media companies scan for blasphemous, seditious, immoral or dissident
content instead. The second major threat is that the "accredited technologies" will necessarily undermine end-to-end encryption. If the tech companies are to scan all our content, then they have to be able to see it first. This demand, which the government overtly states as its goal, is incompatible with the concept of end-to-end encryption. Either such encryption will be disabled, or the technology companies will create some kind of "back door" that will leave users vulnerable to fraud, scams, and invasions of privacy.
Predictable examples include identity theft, credit card theft, mortgage deposit theft and theft of private messages and images. As victims of these crimes tell us, such thefts can lead to severe emotional distress and even
contemplation of suicide -- precisely the 'harm' that the Online Safety Bill purports to prevent. The trade-off, therefore, is not between privacy (or free speech) and security. Instead, it is a tension between two different types
of online security: the 'negative' security to not experience harmful content online; and the 'positive' security of ensuring that our sensitive personal and corporate data is not exposed to those who would abuse it (and us). As
Ciaran Martin, the former head of the National Cyber Security Centre, said in November 2021, "cyber security is a public good ... it is increasingly hard to think of instances where the benefit of weakening digital security outweighs the benefits of
keeping the broad majority of the population as safe as possible online as often as possible. There is nothing to be gained in doing anything that will undermine user trust in their own privacy and security." A fundamental
principle of human rights law is that any encroachment on our rights must be necessary and proportionate. And as ORG's challenge to GCHQ's surveillance practices in Big Brother Watch v UK demonstrated, treating the entire population as a suspect
whose communications must be scanned is neither a necessary nor proportionate way to tackle the problem. Nor is it proportionate to dispense with a general right to data security, only to achieve a marginal gain in the fight against illegal content.
While terrorism and CSEA are genuine threats, they cannot be dealt with by permanently dispensing with everyone's privacy. Open Rights Group recommends:
- Removing the provisions for Use of Technology Notices from the draft Online Safety Bill.
- If these provisions remain, Use of Technology Notices should only apply to public messages. The wording of clauses 64(4)(a) and (b) should be amended accordingly.
UK government funds development of methods to snoop on photos on your device
16th November 2021
See press release from gov.uk
The UK government has announced that it is funding five projects to snoop on your device content supposedly in a quest to seek out child porn. But surely these technologies will have wider usage. The five projects are the winners of the Safety Tech
Challenge Fund, which aims to encourage the tech industry to find practical solutions to combat child sexual exploitation and abuse online, without impacting people's rights to privacy and data protection in their communications. The winners will
each receive an initial £85,000 from the Fund, which is administered by the Department for Digital, Culture, Media and Sport (DCMS) and the Home Office, to help them bring their technical proposals for new digital tools and applications to combat online
child abuse to the market. Based across the UK and Europe, and in partnership with leading UK universities, the winners of the Safety Tech Challenge Fund are:
- Edinburgh-based Cyan Forensics and Crisp Thinking, in partnership with the University of Edinburgh and the Internet Watch Foundation, will develop a plug-in to be integrated within encrypted social platforms. It will detect child sexual abuse material (CSAM) by matching content against known illegal material.
- SafeToNet and Anglia Ruskin University will develop a suite of live video-moderation AI technologies that can run on any smart device to prevent the filming of nudity, violence,
pornography and CSAM in real-time, as it is being produced.
- GalaxKey, based in St Albans, will work with Poole-based Image Analyser and Yoti, an age-assurance company, to develop software focusing on user privacy, detection and prevention of
CSAM and predatory behavior, and age verification to detect child sexual abuse before it reaches an E2EE environment, preventing it from being uploaded and shared.
- DragonflAI, based in Edinburgh, will also work with Yoti to combine their
on-device nudity AI detection technology with age assurance technologies to spot new indecent images within E2EE environments.
- T3K-Forensics, based in Austria, will work to implement their AI-based child sexual abuse detection technology
on smartphones to detect newly created material, providing a toolkit that social platforms can integrate with their E2EE services.
10th July 2021
Will Cathcart likens governments' stance to insisting a 1984 telescreen be installed in every living room
See article from theguardian.com
24th April 2021
The disgraceful NSPCC is lobbying government to deny internet users their basic security against hackers, scammers, blackmailers and thieves
See article from bazzacollins.medium.com
4th April 2021
Wired has reported that the Home Office is actively exploring legal and technical mechanisms to compel Facebook and WhatsApp to break end-to-end encrypted messaging
See article from openrightsgroup.org
The Chinese will be probing your backdoors as soon as they are introduced
8th March 2020
See article from telegraph.co.uk
Haha, he thought he was protected by a level 5 lock spell, but every bobby on the street has a level 6 unlock spell, and the bad guys have level 10.
The Government is playing silly games trying to suggest ways that snooping backdoors on people's encrypted messaging could be unlocked by the authorities whilst being magically safe from bad guys especially those armed with banks of Chinese super
computers. The government wants to make backdoors mandatory for messaging and offers a worthless 'promise' that authority figures would need to agree before the police are allowed to use their key to unlock messages. Andersen Cheng, chief
executive of Post-Quantum, a specialist encryption firm working with Nato and Government agencies, said a virtual key split into five parts - or more - could unlock messages when all five parties agreed and the five key fragments were joined together.
Those five parties could include the tech firm, such as Facebook; the police, security service or GCHQ; an independent privacy advocate or specialist similar to the independent reviewer of terror legislation; and the judge authorising the warrant. Cheng's first company, TRL, helped set up the secure communications system used by 10 Downing Street to talk with GCHQ, embassies and world leaders, but I bet that system did not include backdoor keys.
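A five-part key of the kind Cheng describes can be sketched as a simple XOR split: the master key is recoverable only when every fragment is combined, and any smaller subset of fragments reveals nothing about it. This is a toy illustration of the arithmetic only, not Post-Quantum's actual scheme (which would also need thresholds, authentication and audit), and it does not answer the article's real objection: whoever holds the fragments becomes the target.

```python
import secrets
from functools import reduce

def split_key(key: bytes, n: int = 5) -> list[bytes]:
    # n-1 purely random shares, plus a final share chosen so that
    # XOR-ing all n shares together reproduces the original key.
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    last = bytes(reduce(lambda a, b: a ^ b, col) for col in zip(key, *shares))
    return shares + [last]

def combine(shares: list[bytes]) -> bytes:
    # XOR every share byte-by-byte to recover the key.
    return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*shares))

master_key = secrets.token_bytes(32)
fragments = split_key(master_key)        # e.g. platform, police, GCHQ, advocate, judge

assert combine(fragments) == master_key  # all five parties agree: key recovered
```

Each individual fragment is indistinguishable from random bytes, so no four parties can collude without the fifth; the weakness lies entirely in how the fragments are stored, requested and combined in practice.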
The government claims that official access would only be granted where, for example, the police or security service were seeking to investigate communications between suspect parties at a specific time, and where a court ruled it was in the public or
nation's interest. However the government does not address the obvious issue of bad guys getting hold of the keys and letting anyone unlock the messages for a suitable fee. And sometimes those bad guys are armed with the best brute force code cracking
powers in the world.
The government initiates a data sharing agreement with the US and takes the opportunity to complain about internet encryption that keeps us safe from snoops, crooks, scammers and thieves
5th October 2019
4th October 2019. See government press release from gov.uk
Home Secretary Priti Patel has signed an historic agreement that will enable British law enforcement agencies to directly demand electronic data relating to terrorists, child sexual abusers and other serious criminals from US tech firms.
The world-first UK-US Bilateral Data Access Agreement will dramatically speed up investigations and prosecutions by enabling law enforcement, with appropriate authorisation, to go directly to the tech companies to access data, rather
than through governments, which can take years. The Agreement was signed with US Attorney General William P. Barr in Washington DC, where the Home Secretary also met security partners to discuss the two countries' ever deeper
cooperation and global leadership on security. The current process, which sees requests for communications data from law enforcement agencies submitted and approved by central governments via Mutual Legal Assistance (MLA), can
often take anywhere from six months to two years. Once in place, the Agreement will see the process reduced to a matter of weeks or even days. The US will have reciprocal access, under a US court order, to data from UK
communication service providers. The UK has obtained assurances which are in line with the government's continued opposition to the death penalty in all circumstances. Any request for data must be made under an authorisation in
accordance with the legislation of the country making the request and will be subject to independent oversight or review by a court, judge, magistrate or other independent authority. The Agreement does not change anything about
the way companies can use encryption and does not stop companies from encrypting data. It gives effect to the Crime (Overseas Production Orders) Act 2019, which received Royal Assent in February this year and was facilitated by
the CLOUD Act in America, passed last year.

Letter to Mark Zuckerberg asking him not to keep his internet users safe through encrypted messages

The Home Secretary has also published an open letter to
Facebook, co-signed with US Attorney General William P. Barr, Acting US Homeland Security Secretary Kevin McAleenan and Australia's Minister for Home Affairs Peter Dutton, outlining serious concerns with the company's plans to implement end-to-end
encryption across its messaging services. The letter reads: Dear Mr. Zuckerberg,
We are writing to request that Facebook does not proceed with its plan to implement end-to-end encryption across its messaging services without ensuring that there is no reduction to user safety and without including a means for lawful access to the
content of communications to protect our citizens. In your post of 6 March 2019, 'A Privacy-Focused Vision for Social Networking', you acknowledged that "there are real safety concerns to address before we can implement
end-to-end encryption across all our messaging services." You stated that "we have a responsibility to work with law enforcement and to help prevent" the use of Facebook for things like child sexual exploitation, terrorism, and extortion.
We welcome this commitment to consultation. As you know, our governments have engaged with Facebook on this issue, and some of us have written to you to express our views. Unfortunately, Facebook has not committed to address our serious concerns about
the impact its proposals could have on protecting our most vulnerable citizens. We support strong encryption, which is used by billions of people every day for services such as banking, commerce, and communications. We also
respect promises made by technology companies to protect users' data. Law abiding citizens have a legitimate expectation that their privacy will be protected. However, as your March blog post recognized, we must ensure that technology companies protect
their users and others affected by their users' online activities. Security enhancements to the virtual world should not make us more vulnerable in the physical world. We must find a way to balance the need to secure data with public safety and the need
for law enforcement to access the information they need to safeguard the public, investigate crimes, and prevent future criminal activity. Not doing so hinders our law enforcement agencies' ability to stop criminals and abusers in their tracks.
Companies should not deliberately design their systems to preclude any form of access to content, even for preventing or investigating the most serious crimes. This puts our citizens and societies at risk by severely eroding a
company's ability to detect and respond to illegal content and activity, such as child sexual exploitation and abuse, terrorism, and foreign adversaries' attempts to undermine democratic values and institutions, preventing the prosecution of offenders
and safeguarding of victims. It also impedes law enforcement's ability to investigate these and other serious crimes. Risks to public safety from Facebook's proposals are exacerbated in the context of a single platform that would
combine inaccessible messaging services with open profiles, providing unique routes for prospective offenders to identify and groom our children. Facebook currently undertakes significant work to identify and tackle the most
serious illegal content and activity by enforcing your community standards. In 2018, Facebook made 16.8 million reports to the US National Center for Missing & Exploited Children (NCMEC) -- more than 90% of the 18.4 million total reports that year.
As well as child abuse imagery, these referrals include more than 8,000 reports related to attempts by offenders to meet children online and groom or entice them into sharing indecent imagery or meeting in real life. The UK National Crime Agency (NCA)
estimates that, last year, NCMEC reporting from Facebook will have resulted in more than 2,500 arrests by UK law enforcement and almost 3,000 children safeguarded in the UK. Your transparency reports show that Facebook also acted against 26 million
pieces of terrorist content between October 2017 and March 2019. More than 99% of the content Facebook takes action against -- both for child sexual exploitation and terrorism -- is identified by your safety systems, rather than by reports from users.
While these statistics are remarkable, mere numbers cannot capture the significance of the harm to children. To take one example, Facebook sent a priority report to NCMEC, having identified a child who had sent self-produced child
sexual abuse material to an adult male. Facebook located multiple chats between the two that indicated historical and ongoing sexual abuse. When investigators were able to locate and interview the child, she reported that the adult had sexually abused
her hundreds of times over the course of four years, starting when she was 11. He also regularly demanded that she send him sexually explicit imagery of herself. The offender, who had held a position of trust with the child, was sentenced to 18 years in
prison. Without the information from Facebook, abuse of this girl might be continuing to this day. Our understanding is that much of this activity, which is critical to protecting children and fighting terrorism, will no longer be
possible if Facebook implements its proposals as planned. NCMEC estimates that 70% of Facebook's reporting -- 12 million reports globally -- would be lost. This would significantly increase the risk of child sexual exploitation or other serious harms.
You have said yourself that "we face an inherent tradeoff because we will never find all of the potential harm we do today when our security systems can see the messages themselves". While this trade-off has not been quantified, we are very
concerned that the right balance is not being struck, which would make your platform an unsafe space, including for children. Equally important to Facebook's own work to act against illegal activity, law enforcement rely on
obtaining the content of communications, under appropriate legal authorisation, to save lives, enable criminals to be brought to justice, and exonerate the innocent. We therefore call on Facebook and other companies to take the
following steps:
- embed the safety of the public in system designs, thereby enabling you to continue to act against illegal content effectively with no reduction to safety, and facilitating the prosecution of offenders and safeguarding of victims
- enable law enforcement to obtain lawful access to content in a readable and usable format
- engage in consultation with governments to facilitate this in a way that is substantive and genuinely influences your design decisions
- not implement the proposed changes until you can ensure that the systems you would apply to maintain the safety of your users are fully tested and operational
We are committed to working with you to focus on reasonable proposals that will allow Facebook and our governments to protect your users and the public, while protecting their privacy. Our technical experts are confident that we can
do so while defending cyber security and supporting technological innovation. We will take an open and balanced approach in line with the joint statement of principles signed by the governments of the US, UK, Australia, New Zealand, and Canada in August
2018 and the subsequent communiqué agreed in July this year. As you have recognised, it is critical to get this right for the future of the internet. Children's safety and law enforcement's ability to bring criminals to justice
must not be the ultimate cost of Facebook taking forward these proposals.

Yours sincerely,

Rt Hon Priti Patel MP, United Kingdom Secretary of State for the Home Department
William P. Barr, United States Attorney General
Kevin K. McAleenan, United States Secretary of Homeland Security (Acting)
Hon Peter Dutton MP, Australian Minister for Home Affairs
Update: An All-Out Attack on Encryption

5th October 2019. See article from eff.org
Top law enforcement officials in the United States, United Kingdom, and Australia told Facebook today that they want backdoor access to all encrypted messages sent on all its platforms. In an
open letter, these governments called on Mark Zuckerberg to stop Facebook's
plan to introduce end-to-end encryption on all of the company's messaging products and instead promise that it
will "enable law enforcement to obtain lawful access to content in a readable and usable format." This is a staggering attempt to undermine the security and privacy of communications tools used by billions of people.
Facebook should not comply. The letter comes in concert with the signing of a new agreement between the US and UK that allows law enforcement in one jurisdiction to more easily obtain electronic data stored in the other jurisdiction. But
the letter to Facebook goes much further: law enforcement and national security agencies in these three countries are asking for nothing less than access to every conversation that crosses every digital device. The letter focuses
on the challenges of investigating the most serious crimes committed using digital tools, including child exploitation, but it ignores the severe risks that introducing encryption backdoors would create. Many people--including journalists, human rights
activists, and those at risk of abuse by intimate partners--use encryption to stay safe in the physical world as well as the online one. And encryption is central to preventing criminals and even corporations from spying on our private conversations, and
to ensuring that the communications infrastructure we rely on is truly working as intended. What's more, the backdoors
into encrypted communications sought by these governments would be available not just to governments with a supposedly functional rule of law. Facebook and others would face immense pressure to also provide them to authoritarian regimes, who might seek
to spy on dissidents in the name of combatting terrorism or civil unrest, for example. The Department of Justice and its partners in the UK and Australia claim to support "strong encryption," but the unfettered access to
encrypted data described in this letter is incompatible with how encryption actually works.
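The incompatibility EFF points to can be seen even in a toy model. The sketch below (standard library only, not production cryptography and not the actual protocols Facebook's products use, which are based on the Signal protocol) simulates two endpoints sharing a key and a platform that only ever relays ciphertext: the relay cannot produce a "readable and usable format" without being given the key, so any such mandate amounts to key escrow or a weakened scheme for every user.

```python
# Toy end-to-end encryption sketch: the relaying platform holds only
# ciphertext, so "lawful access in readable format" requires the key.
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # SHA-256 in counter mode as a toy stream cipher (illustrative only)
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR the plaintext with the keystream
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR stream ciphers are symmetric

# Key known only to the two endpoints, never to the platform in the middle
key = hashlib.sha256(b"shared secret known only to the two endpoints").digest()
ciphertext = encrypt(key, b"meet at noon")

# The platform relays `ciphertext` and cannot read it; only a key holder can.
assert decrypt(key, ciphertext) == b"meet at noon"
assert ciphertext != b"meet at noon"
```

The design point the governments' letter elides is in the last two lines: readable access for a third party means either copying `key` to that party (escrow) or replacing the scheme with something weaker, and either change applies to every user of the system, not just to investigative targets.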
1st December 2018

GCHQ pushes for the ability to silently join and snoop on encrypted messaging conversations. See article from theregister.co.uk
'Five Eyes' governments get heavy with internet companies demanding that they get backdoors to encryption
4th September 2018

See article from alphr.com
The Five Eyes governments of the UK, US, Canada, Australia and New Zealand have threatened the tech industry to voluntarily create backdoor access to their systems, or be compelled to by law if they don't. The move is a final warning to platform
holders such as WhatsApp, Apple and Google who deploy encryption to guarantee user privacy on their services. A statement by the Five Eyes governments says: Encryption is vital to the digital economy and a secure
cyberspace, and to the protection of personal, commercial and government information ...HOWEVER.. . the increasing use and sophistication of certain encryption designs present challenges for nations in combating serious crimes and threats
to national and global security. Many of the same means of encryption that are being used to protect personal, commercial and government information are also being used by criminals, including child sex offenders, terrorists and
organized crime groups to frustrate investigations and avoid detection and prosecution. If the industry does not voluntarily establish lawful access solutions to their products, the statement continued, we may pursue technological, enforcement, legislative or other measures to guarantee entry.
The authorities find a wheeze to avoid giving defendants the rights and defences provided by the law intended to deal with getting people's encryption keys

11th May 2016

See article from bbc.com. See also article from jackofkent.com
An unjust bid by the National Crime Agency to use civil law to force alleged cyber hacker Lauri Love to hand over encrypted computer passwords has been thrown out by a judge. The agency (NCA) seized the computers during a raid at Love's home in October 2013. The authorities tried to shortcut the legal safeguards in such cases by using an obscure civil law to extract the keys, but this shortcut has now been rejected by a district judge. Delivering her judgment at London's Westminster Magistrates' Court, District Judge Nina Tempia said the NCA should have used normal police powers rather than a civil action to obtain the information:

I'm not granting the application because, to obtain the information sought, the correct procedure to use - as the NCA did two-and-a-half years ago - is RIPA (Regulation of Investigatory Powers Act) and the inherent safeguards incorporated thereafter, she said.

She added that the powers of the court should not be used to circumnavigate existing laws and their safeguards. The US is attempting to extradite Lauri Love on charges of hacking into the US Army, Nasa and US Federal Reserve networks.
In the wake of the TalkTalk hack demonstrating the consequences of not keeping people's data safe and encrypted, the government seems to partially backtrack on new legislation to restrict encryption

29th October 2015

See article from theregister.co.uk
The recent TalkTalk hacking seems to have taught David Cameron a lesson on how important it is to keep data safe and encrypted. The topic came up this week in the House of Lords when Joanna Shields, minister for internet safety and security, confirmed that the government will not pass laws to ban encryption, and that the government has no intention of introducing legislation to weaken encryption or to require back doors. The debate was brought by Liberal Democrat Paul
Strasburger, who claimed Cameron does not seem to get the need for strong encryption standards online, with no back door access. Strasburger said: [Cameron] three times said that he intends to ban any
communication 'we cannot read', which can only mean weakening encryption. Will the Minister [Shields] bring the Prime Minister up to speed with the realities of the digital world?
Liberal Democrat peer Lord Clement-Jones asked if she
could absolutely confirm that there is no intention in forthcoming legislation either to weaken encryption or provide back doors. Shields denied Cameron intended to introduce laws to weaken encryption and said:
The Prime Minister did not advocate banning encryption; he expressed concern that many companies are building end-to-end encrypted applications and services and not retaining the keys. She then seemingly contradicted herself by adding that companies that provide end-to-end encrypted applications, such as WhatsApp, which is apparently used by the terror group calling itself Islamic State, must be subject to decryption and that information handed over to law enforcement in extremis.