Apple has announced it will provide fully encrypted iCloud backups, meeting a longstanding demand by EFF and other privacy-focused organizations.
We applaud Apple for listening to experts, child advocates, and users who want to
protect their most sensitive data. Encryption is one of the most important tools we have for maintaining privacy and security online. That's why the demand that Apple let users encrypt iCloud backups was part of the Fix It Already campaign that we
launched in 2019.
Apple's on-device encryption is strong, but some especially sensitive iCloud data, such as photos and backups, has continued to be vulnerable to government demands and hackers. Users who opt in to Apple's new
feature, which the company calls Advanced Data Protection for iCloud, will be protected even if there is a data breach in the cloud, a government demand, or a breach from within Apple (such as a rogue employee). Apple said today that the
feature will be available to U.S. users by the end of the year, and will roll out to the rest of the world in "early 2023."
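To see why this protects data even from the provider itself, consider a minimal conceptual sketch (in Python, using the cryptography library; this illustrates the general idea of end-to-end encryption, not Apple's actual key-management design): the key is generated and kept on the user's device, so the cloud only ever stores ciphertext it cannot decrypt.

```python
# Conceptual sketch of an end-to-end encrypted backup (NOT Apple's actual
# implementation): the key never leaves the device, so the cloud provider
# stores only ciphertext it cannot read.
from cryptography.fernet import Fernet

# The key is generated and held on the user's device only.
device_key = Fernet.generate_key()
cipher = Fernet(device_key)

backup = b"private photos and messages"
ciphertext = cipher.encrypt(backup)  # this is all that gets uploaded

# The provider (or anyone who breaches it, or compels it) sees only ciphertext.
assert backup not in ciphertext

# Only the device, which holds the key, can restore the backup.
assert cipher.decrypt(ciphertext) == backup
```

Because the provider never holds the key, a server-side breach, a rogue employee, or a legal demand yields nothing readable.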
We're also pleased to hear that Apple has officially dropped its plans to install photo-scanning
software on its devices, which would have inspected users' private photos in iCloud and iMessage. This software, a version of what's called "client-side scanning," was intended to locate child abuse imagery and report it to authorities. When a
user's information is end-to-end encrypted and there is no device scanning, the user has true control over who has access to that data.
Apple's image-scanning plans were announced in 2021, but delayed after EFF supporters
protested and delivered a petition containing more than 60,000 signatures to Apple executives. While Apple quietly postponed these scanning plans later that year, today's announcement makes it official.
In a statement distributed
to Wired and other journalists, Apple said:
We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through
personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.
The company has said it will focus instead on "opt-in tools for parents" and "privacy-preserving solutions to combat Child Sexual Abuse Material and protect children, while addressing the unique privacy needs of
personal communications and data storage."
Constant scanning for child abuse images can lead to unwarranted investigations and false positives. Earlier this year, the New York Times reported on how faulty scans at Google led
to false accusations of child abuse against fathers in Texas and California. The men were exonerated by police but were subjected to permanent account deletion by Google.
Companies should stop trying to square the circle by
putting bugs in our pockets at the request of governments, and focus on protecting their users and human rights. Today Apple took a big step forward on both fronts. There are a number of implementation choices that can affect the overall security of the
new feature, and we'll be pushing Apple to make sure the encryption is as strong as possible. Finally, we'd like Apple to go a step further. Turning on these privacy-protective features by default would mean that all users can have their rights
protected.