Apple and the Four Horsemen

Way back in 1994, Tim May identified what he called the Four Horsemen of the Infocalypse:

How will [internet] privacy and anonymity be attacked?
– like so many other “computer hacker” items, [by saying it’s being used] as a tool for the “Four Horsemen”: drug-dealers, money-launderers, terrorists, and pedophiles.

The Cyphernomicon

A few weeks ago, the Four Horsemen found a new ally: Apple.

The next update to Apple’s iPhone operating system, iOS 15, is due for release in September. In early August, the company announced it would include a new CSAM scanning system. CSAM stands for Child Sexual Abuse Material.

In order to understand the implications of this, I’m going to have to throw another acronym at you: NCMEC (pronounced “nic-mic”). The National Center for Missing & Exploited Children is a US organisation established by Congress in 1984. It’s not part of the American government, although it is almost entirely funded by the Department of Justice. NCMEC is unique in that it can, without liability, store CSAM and share it with law enforcement. You might think of it as a warehouse of kiddie porn. (Yuk!)

Obviously, sharing that sort of material is problematic, so what they do instead is create hashes of the pictures and videos in their database. (A hash is a short string of letters and numbers calculated from the contents of a file; identical files produce identical hashes.) Instead of manually clicking through and inspecting every picture on a suspect’s device, law enforcement can simply feed the pictures into a hash calculator and check the output against NCMEC’s hashes. Which is where Apple comes in.
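To make the mechanics concrete, here is a minimal sketch of that kind of hash matching in Python. It uses an ordinary cryptographic hash (SHA-256) purely for illustration; the real systems use perceptual hashes such as PhotoDNA or Apple’s NeuralHash, which also catch visually similar images, and the folder and hash value below are invented for the example.

import hashlib
from pathlib import Path

# Hypothetical hash list standing in for the NCMEC database; the value is invented.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan(folder: Path) -> list:
    """Hash every file in the folder and report those that match a known hash."""
    return [p for p in folder.iterdir() if p.is_file() and file_hash(p) in KNOWN_HASHES]

for match in scan(Path("/tmp/photos")):   # hypothetical folder of seized pictures
    print("match:", match)

The point of the scheme is that investigators only ever handle hashes; the original NCMEC images never need to leave the warehouse.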

Many tech companies, like Google and Facebook, already use the NCMEC hashes to scan their cloud servers for CSAM, but iOS 15 will include an encrypted system that runs scans on users’ personal phones and laptops. That’s not a problem in itself. System scans are a routine part of modern computing. Think of indexing software, or malware and antivirus scans. But those indexes are built and kept locally, and malware scanners isolate the offending app and warn the user. This is different:

… Apple says that they will scan your Apple device for CSAM material. If they find something that they think matches, then they will send it to Apple. The problem is that you don’t know which pictures will be sent to Apple. You could have corporate confidential information and Apple may quietly take a copy of it. You could be working with the legal authority to investigate a child exploitation case, and Apple will quietly take a copy of the evidence.

To reiterate: scanning your device is not a privacy risk, but copying files from your device without any notice is definitely a privacy issue.

Think of it this way: Your landlord owns your property, but … he cannot enter any time he wants. In order to enter, the landlord must have permission, give prior notice, or have cause. Any other reason is trespassing. Moreover, if the landlord takes anything, then it’s theft. Apple’s license agreement says that they own the operating system, but that doesn’t give them permission to search whenever they want or to take content.

One Bad Apple, The Hacker Factor

In response to the outcry the announcement created, Apple released a FAQ about the proposed system. It raised even more questions, and more ire. As Neal Krawetz on The Hacker Factor noted:

This FAQ contradicts their original announcement, contradicts itself, contains doublespeak, and omits important details. For example:

• The FAQ says that they don’t access Messages, but also says that they filter Messages and blur images. (How can they know what to filter without accessing the content?)

• The FAQ says that they won’t scan all photos for CSAM; only the photos for iCloud. However, Apple does not mention that the default configuration uses iCloud for all photo backups.

• The FAQ says that there will be no falsely identified reports to NCMEC because Apple will have people conduct manual reviews. As if people never make mistakes.

This is far from the complete list of issues with their FAQ.

One Bad Apple, The Hacker Factor

As security consultant Bruce Schneier points out, Apple is also changing the definition of “end-to-end encryption.”

No longer is the message a private communication between sender and receiver. A third party is alerted if the message meets a certain criteria.

Apple Adds a Backdoor to iMessage and iCloud Storage

And this has wide-ranging implications:

It opens the door for all sorts of other surveillance, since now that the system is built it can be used for all sorts of other messages.

Apple Adds a Backdoor to iMessage and iCloud Storage
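To see why this breaks the end-to-end model, here is a hypothetical sketch in Python (not Apple’s actual protocol) of a “scan before encrypt” pipeline. The device checks each image against whatever hash list it has been handed, reports any match to a third party, and only then encrypts for the recipient. The parameter names are made up for illustration.

import hashlib
from typing import Callable

def send_image(
    image_bytes: bytes,
    blocklist: set,                              # opaque hash list pushed to the device
    report_to: Callable[[str], None],            # third party notified on a match
    encrypt_and_send: Callable[[bytes], None],   # the supposedly end-to-end channel
) -> None:
    # Hypothetical "scan before encrypt" pipeline, for illustration only.
    digest = hashlib.sha256(image_bytes).hexdigest()
    if digest in blocklist:
        # A third party learns about the image before it is ever encrypted,
        # so the message is no longer private to sender and receiver.
        report_to(digest)
    encrypt_and_send(image_bytes)

The matching step neither knows nor cares what the hashes describe, and the blocklist is just a set of opaque strings the device owner cannot inspect; swap in a different list and the same machinery flags different content.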

How long, for example, before a repressive government like China’s adds hashes of the Tank Man photograph to that database?
