Apple can scan your photos to fight paedophiles while protecting your privacy
The proliferation of child pornography on the Internet is terrible and sobering. Technology companies send tens of millions of reports of these images each year to the US non-profit National Center for Missing and Exploited Children.
The way the companies that store your images in the cloud typically detect these images can leave you vulnerable to privacy breaches – and to hackers breaking into their computers. On 5 August 2021, Apple announced a new method of detecting such content that promises to better protect your privacy.
As a computer scientist who studies cryptography, I can explain how Apple’s system works, why it is an improvement, and why Apple needs to do more.
Who holds the keys?
Digital files can be protected in a kind of virtual safe through encryption, which scrambles a file so that it can only be revealed, or decrypted, by someone holding a secret key. Encryption is one of the best tools for protecting personal information as it travels over the Internet.
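As a rough illustration of that “virtual safe” (not Apple’s actual implementation), the short Python sketch below locks and unlocks a placeholder photo with a single secret key, using the third-party cryptography package; the photo bytes are invented for the example.

```python
# A minimal sketch of symmetric encryption with a secret key.
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

# The secret key: whoever holds this can open the "virtual safe".
key = Fernet.generate_key()
safe = Fernet(key)

photo_bytes = b"...raw bytes of a photo (placeholder)..."
ciphertext = safe.encrypt(photo_bytes)   # scrambled file, safe to store remotely

# Only someone holding `key` can recover the original bytes.
assert Fernet(key).decrypt(ciphertext) == photo_bytes
```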
Can a cloud service provider detect child pornography if the photos are scrambled by encryption? That depends on who holds the secret key.
Many cloud providers, including Apple, keep a copy of the secret key so that they can help you recover data if you forget your password. With this key, the provider can also compare photos stored on the cloud with known child abuse images held by the National Center for Missing and Exploited Children.
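To make that concrete, here is a hypothetical sketch of what a provider holding a copy of your key can do: decrypt each uploaded photo and compare a hash of the plaintext against a list of fingerprints of known images. Real systems use perceptual rather than exact hashes, and the image bytes below are placeholders.

```python
import hashlib

# Stand-in for a database of fingerprints of known abuse images
# (in reality supplied by organisations such as NCMEC).
known_image = b"...bytes of a known image (placeholder)..."
known_bad_digests = {hashlib.sha256(known_image).hexdigest()}

def provider_side_check(decrypted_photo: bytes) -> bool:
    """A provider that holds the secret key can decrypt your photo and
    run an exact-match check like this on the plaintext."""
    return hashlib.sha256(decrypted_photo).hexdigest() in known_bad_digests

print(provider_side_check(known_image))          # True: exact copy matches
print(provider_side_check(b"any other photo"))   # False
```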
But this convenience comes at a significant cost. An online service provider that stores secret keys can abuse its access to your data or fall prey to a data breach.
A better approach to online security is end-to-end encryption, in which the secret key is stored only on your own computer, phone or tablet. In that case, the provider cannot decrypt your photos. Apple’s answer for detecting child pornography in photos protected by end-to-end encryption is a new procedure in which the cloud provider, in this case Apple, and your device perform the image comparison together.
Spotting evidence without looking at it
While it may seem like magic, modern cryptography allows you to work with data you can’t see. I’ve contributed to projects that use cryptography to measure the gender pay gap without knowing anyone’s salary, and to detect repeat sexual assault offenders without reading the victim’s report. And there are many other examples of companies and governments using cryptographically protected computing to provide services while protecting the underlying data.
Apple’s iCloud Photos image matching uses encryption-protected computing to analyse photos without seeing them. It relies on a tool called private set intersection, studied by cryptographers since the 1980s. This tool allows two people to discover the files they have in common while hiding the rest.
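One classic way to build private set intersection, sketched below with deliberately toy-sized parameters, is commutative blinding: each side raises hashed items to its own secret exponent, so matching items show up as equal doubly blinded values while the raw lists stay hidden. This illustrates the general idea only; it is not Apple’s protocol, and the modulus is far too small to be secure.

```python
import hashlib
import secrets

# Toy parameter: a small Mersenne prime modulus, for illustration only.
P = 2**127 - 1

def h(item: bytes) -> int:
    """Hash an item into the multiplicative group mod P."""
    return int.from_bytes(hashlib.sha256(item).digest(), "big") % P

def blind(items, secret_exp):
    """Raise each hashed item to a party's secret exponent (mod P)."""
    return {pow(h(x), secret_exp, P) for x in items}

def reblind(blinded_values, secret_exp):
    """Apply the other party's exponent; exponentiation commutes."""
    return {pow(v, secret_exp, P) for v in blinded_values}

# Each party picks a secret exponent and only ever shares blinded values.
a = secrets.randbelow(P - 2) + 1
b = secrets.randbelow(P - 2) + 1
alice_items = [b"cat.jpg", b"dog.jpg", b"beach.jpg"]
bob_items   = [b"dog.jpg", b"car.jpg"]

# Doubly blinded values are equal exactly when the underlying items match.
alice_double = reblind(blind(alice_items, a), b)
bob_double   = reblind(blind(bob_items, b), a)
print(len(alice_double & bob_double))  # 1: one common item found, raw lists never exchanged
```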
Here’s how the image matching works. Apple distributes to every iPhone, iPad and Mac a database containing undecipherable encodings of known child abuse images. For each photo you upload to iCloud, your device computes a digital fingerprint, called a NeuralHash. The fingerprint is designed to stay the same even if someone makes small changes to the photo. Your device then creates a credential for your photo that the device itself cannot interpret, but which tells the server whether the uploaded photo matches any of the child pornography images in the database.
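Apple has not published NeuralHash in full, so the sketch below substitutes a much simpler “average hash” over a tiny grayscale grid, purely to show why a perceptual fingerprint survives small edits, unlike an exact cryptographic hash.

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set when the pixel is
    brighter than the image's mean. Small edits flip few or no bits."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(h1, h2):
    """Count how many fingerprint bits differ."""
    return sum(a != b for a, b in zip(h1, h2))

# A 4x4 grayscale "photo" and a slightly brightened copy of it.
photo = [[ 10,  20, 200, 210],
         [ 15,  25, 190, 205],
         [ 30, 220,  40, 215],
         [ 35, 230,  45, 225]]
edited = [[min(p + 5, 255) for p in row] for row in photo]

print(hamming(average_hash(photo), average_hash(edited)))  # 0: the fingerprint survives the edit
```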

If a sufficient number of credentials from a device indicate matches with known child abuse images, the server learns the secret keys to decrypt all matching photos, but not the keys to other photos. Otherwise, the server cannot view any of your photos.
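This kind of threshold behaviour is typically built from threshold secret sharing. As a rough illustration (not Apple’s exact construction), the sketch below uses Shamir’s classic scheme: a decryption key is split into shares so that any three of them reconstruct it, while two reveal essentially nothing.

```python
import secrets

# Toy Shamir secret sharing over a prime field (illustration only).
PRIME = 2**127 - 1  # far too small for real use

def make_shares(secret: int, threshold: int, n: int):
    """Split `secret` into n shares; any `threshold` of them reconstruct it."""
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]
    def f(x):  # evaluate the random polynomial at x
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total

key = secrets.randbelow(PRIME)           # stands in for a photo decryption key
shares = make_shares(key, threshold=3, n=5)
assert reconstruct(shares[:3]) == key    # three matching vouchers suffice
assert reconstruct(shares[:2]) != key    # two are not enough to learn the key
```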
The fact that this matching procedure takes place on your device may be more privacy-friendly than previous methods, in which the matching takes place on a server – provided it is properly deployed. This is an important caveat.
Imagining what could go wrong
There is a line in the film Apollo 13 in which Gene Kranz, played by Ed Harris, proclaims: “I don’t care what a thing was designed to do. I care what it can do!” Apple’s phone scanning technology is designed to protect privacy. Computer security and technology policy experts are trained to discover the ways in which a technology can be used, misused and abused, regardless of its creator’s intent. However, Apple’s announcement lacks the information needed to analyse key components, so it is not yet possible to assess the security of its new system.
Security researchers need to see Apple’s code to validate that the device-assisted matching software is true to design and does not introduce errors. Researchers also need to check whether it is possible to fool Apple’s NeuralHash algorithm by making imperceptible changes to a photo.
It is also important that Apple puts in place an auditing policy so that the company is held accountable for matching only child abuse images. The threat of mission creep was a risk even with server-based matching. The good news is that on-device matching offers new opportunities to audit Apple’s actions, because the encrypted database commits Apple to a specific set of images. Apple should allow everyone to verify that they have received the same encrypted database, and let third-party auditors validate the images contained in that set. These public accountability goals can be achieved with cryptography.
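One simple building block for that kind of public accountability, sketched hypothetically below, is publishing a single digest of the encrypted database so that any user can check that their device received exactly the same copy as everyone else; the file path and the published digest here are placeholders, not real Apple artefacts.

```python
import hashlib

def database_digest(path: str) -> str:
    """Hash the on-device encrypted match database so copies can be compared."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical: a digest Apple would publish and independent auditors would mirror.
published_digest = "<digest published by Apple and mirrored by auditors>"
local_digest = database_digest("/path/to/encrypted_match_database")  # placeholder path

print("same database as everyone else" if local_digest == published_digest
      else "mismatch: my device received a different database")
```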
Apple’s proposed image matching technology has the potential to improve digital privacy and child safety, especially if Apple follows through with end-to-end encryption in iCloud. But no single technology can fully address complex social issues. All the options for using encryption and image scanning have delicate and nuanced effects on society.
These sensitive issues require taking the time, before deployment, to reason about the potential consequences of even well-intentioned actions, in dialogue with affected groups and with researchers from a wide range of backgrounds. I invite Apple to join this dialogue so that the research community can collectively improve the safety and accountability of this new technology.