Revised Sept 3rd: Apple announced that "we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features".
Apple recently announced its intention to add a photo scanning feature to its iCloud and iMessage services. The goal is to identify child sexual abuse material (CSAM) stored in iCloud and to prevent children from sharing sexually explicit photos via iMessage.
Privacy and security have consistently been major marketing points for Apple, and iMessage has end-to-end encryption. The announcement of the new photo scanning functionality highlights the possible limitations of end-to-end encryption solutions.
How Apple’s New Photo Scanning Will Work
Apple’s new photo scanning solution aims to restrict the spread of CSAM, which is a worthy goal. However, it creates significant privacy and security concerns. The scanning capability will be built into the iCloud and iMessage applications themselves: before a photo is uploaded to iCloud Photos, it will be checked on the device against the National Center for Missing & Exploited Children’s (NCMEC) database of known CSAM content.
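To make the check-before-upload flow concrete, here is a minimal sketch. Apple’s actual system uses a perceptual hash (NeuralHash) and a private set intersection protocol so neither side learns the other’s full list; the plain SHA-256 set lookup, the `KNOWN_HASHES` set, and the `should_block_upload` function below are hypothetical stand-ins used only to illustrate where the check happens.

```python
import hashlib

# Hypothetical stand-in for NCMEC's database of known-CSAM image hashes.
# The entry below is simply the SHA-256 digest of the bytes b"test".
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def should_block_upload(photo_bytes: bytes) -> bool:
    """Return True if the photo matches a known hash and should be flagged."""
    digest = hashlib.sha256(photo_bytes).hexdigest()
    return digest in KNOWN_HASHES

# The check runs locally, before the photo ever leaves the phone.
assert should_block_upload(b"test") is True
assert should_block_upload(b"holiday photo") is False
```

The key design point the sketch preserves is that the matching happens on the device itself, not on Apple’s servers.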
Additionally, accounts belonging to children (i.e. listed as owned by a minor) will have their iMessages scanned for sexually explicit content. If anything is found, the photo will be blurred out on the child’s phone and the parent will receive a notification (which can be disabled).
While this is a big deal, it is really only an extension of Apple’s existing CSAM scanning activities. Apple’s iCloud Mail is unencrypted, and the company has reportedly been scanning email attachments for CSAM since 2019. As The Telegraph reported in 2020, Apple stated that it is “utilising some technologies to help screen for child sexual abuse material” and that “any accounts we find with this material will be disabled”, although no specifics were given. The new features simply extend these efforts to iCloud Photos and iMessage.
Photo Scanning and the Limits of End-to-End Encryption
Apple’s plan to inspect the content of iMessages for sexually explicit material seems at odds with the fact that iMessages are end-to-end encrypted. However, Apple doesn’t need to break its end-to-end encryption to accomplish this, because its approach exploits an inherent limitation of end-to-end encryption.
End-to-end encryption is designed to protect data in transit. Before being transmitted, the message is encrypted by the sender, and it is decrypted on the recipient’s device, enabling them to read its contents.
Even with end-to-end encryption, the messaging application itself has access to the unencrypted message before transmission and after receipt. Apple’s photo scanning solution takes advantage of this by inspecting iMessages and iCloud uploads within the app itself.
This approach to content monitoring doesn’t break end-to-end encryption because end-to-end encryption is not designed to protect data on the device itself. However, the workaround allows Apple to correctly state that iMessage is end-to-end encrypted, which users may incorrectly interpret to mean that Apple has no access to their messages’ contents.
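The scan-then-encrypt window described above can be sketched in a few lines. Everything here is hypothetical: the XOR one-time pad stands in for iMessage’s real cryptography purely so the flow is visible, and `scan_for_flagged_content` is an invented placeholder for an on-device scanner. The point the sketch makes is that the app sees the plaintext before encryption, while the server only ever sees ciphertext.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy one-time pad: XOR is its own inverse, so the same function
    # encrypts and decrypts. Real iMessage uses vetted cryptography.
    return bytes(a ^ b for a, b in zip(data, key))

def scan_for_flagged_content(plaintext: bytes) -> bool:
    # Hypothetical on-device scanner: it runs on the plaintext *before*
    # encryption, which is exactly the window client-side scanning uses.
    return b"flagged" in plaintext

def send_message(plaintext: bytes, key: bytes) -> bytes:
    if scan_for_flagged_content(plaintext):
        print("content flagged on device")   # scanning happens here...
    return xor_cipher(plaintext, key)        # ...then end-to-end encryption

key = secrets.token_bytes(32)                # shared only by the two endpoints
ciphertext = send_message(b"hello", key)
assert ciphertext != b"hello"                # the server only ever sees this
assert xor_cipher(ciphertext, key) == b"hello"  # recipient decrypts locally
```

Nothing in this flow weakens the encryption in transit; the monitoring simply happens at an endpoint, where end-to-end encryption was never designed to apply.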
Achieving Truly Secure Messaging
Apple is adding a photo scanning feature to its iMessage and iCloud apps for a noble purpose, but it still poses security risks. While Apple plans to limit the scanner to known CSAM content, that scope could expand through a change in company policy or through exploitation of Apple’s systems or applications.
Secure messaging applications offering end-to-end encryption should not try to undermine their security with backdoors or workarounds. Ciphr’s apps do not have any built-in content scanning and encrypt all data at rest, providing a truly secure messaging service.