I'm not sure if I'm missing something here, but from what I understand, what Apple is proposing is an enormous privacy boon that seems to be completely misunderstood.
All cloud providers are required to, and do, make these scans.
The proposal gives them a way to comply with that requirement while, at the same time, locking themselves out of being able to decrypt your data in the cloud, except in this specific case, which is secured and controlled by your device.
This means they couldn't comply with a nation state's demand to secretly decrypt your data, even if they were legally compelled to, unless they change how the cryptography at play here works.
- I don't want to be treated like a pedophile for a device that I should, in theory, own because I paid for it.
- I don't want the rules to change in the future for what will be scanned.
- I don't want to support Apple scanning for pictures of Tank Man for the CCP.
- I don't want an untested proprietary algorithm making decisions about me that could alter my life. It's already been shown that you can craft innocent pictures that collide with CSAM hashes. This screams of future abuse.
I see these points. But I guess personally, as someone who benefits financially from technology and the internet, I feel some level of responsibility to ensure that it isn't used to exploit the most vulnerable in society.
There is no requirement to scan, only to report when found.
There is NO end-to-end encryption being rolled out at the same time (so they can still decrypt and hand over your data in the cloud).
> This means they couldn't comply with a nation state's demand to secretly decrypt your data, even if they were legally compelled to, unless they change how the cryptography at play here works.
Not only is this completely wrong (they can hand over your decrypted cloud data right now), it's wrong even under the assumption you made above about real E2EE (which we don't have). All they have to do is add a few hashes from that nation state to their catalog, and your data is whisked away to Apple for them to do with as they please. Right now that's meaningless, since they have access to it anyway, but it makes your point about E2EE a lot less compelling.
I didn't realise that there's no requirement to scan. So I'll yield on that.
But it's still possible that Apple is preparing for a scenario where it does become a requirement in the future.
As for the E2EE part, I can't imagine that this wouldn't be launched with E2EE alongside; otherwise there's literally no point whatsoever, they could have just done the scan in iCloud.
As for why this combats your data 'being whisked away', check the technical documentation. What they're doing with Private Set Intersection and Threshold Secret Sharing are clear steps to make this system unexploitable and anonymous, and to ensure it doesn't leak any metadata whatsoever.
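To make the Private Set Intersection part concrete, here's a toy sketch of Diffie-Hellman-style PSI: each party blinds its hashed items with its own secret exponent, and because modular exponentiation commutes, items in both sets collide after double blinding while everything else stays hidden. This is NOT Apple's actual construction (they use a more elaborate scheme over elliptic curves); all names and parameters here are illustrative.

```python
# Toy DH-style Private Set Intersection sketch. Illustrative only:
# real deployments use elliptic curves and more careful hashing.
import hashlib
import random

P = 2**127 - 1  # Mersenne prime modulus for the toy group


def h(item: str) -> int:
    """Hash an item into the multiplicative group mod P."""
    return int.from_bytes(hashlib.sha256(item.encode()).digest(), "big") % P


def psi_matches(client_items: set, server_items: set) -> int:
    """Count items both sides share, without either side seeing the other's raw set."""
    a = random.randrange(2, P - 1)  # client's secret exponent
    b = random.randrange(2, P - 1)  # server's secret exponent
    # Client blinds its hashes and sends them over; server blinds them again.
    double_client = {pow(pow(h(x), a, P), b, P) for x in client_items}
    # Server blinds its hashes and sends them over; client blinds them again.
    double_server = {pow(pow(h(x), b, P), a, P) for x in server_items}
    # h(x)^(a*b) is identical for shared items, so the intersection size leaks
    # nothing beyond the count of matches.
    return len(double_client & double_server)


if __name__ == "__main__":
    print(psi_matches({"img1", "img2", "img3"}, {"img2", "flagged"}))  # 1
```

The key property being illustrated: the server only ever sees blinded values, so it learns which blinded items match its catalog, not what the client's non-matching photos are.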
The concern is that Apple is handing governments a new tool: "give me a list of all users that have this photo". It could be used to track down dissidents, and combined with metadata it is probably sufficient to pinpoint who took a particular picture.
Think you shared that picture of police brutality anonymously? Think again.
But what I'm getting at is that this is exactly what Apple is trying to fight.
A government could already coerce Apple into handing over iCloud data.
The cryptography at play here, combining Private Set Intersection and Threshold Secret Sharing, is a clear step to make it as hard as possible for any institution to abuse this, for exactly that reason.
Y'all are starting to make me more sympathetic to Apple. Their biggest misstep was making these announcements simultaneously without foreseeing how many people would (willingly or otherwise) conflate them.
This is not true. There were multiple features announced. iMessage data is not being scanned by most phones. The ONLY CSAM detection that happens is on photos being uploaded to iCloud Photos.
The only phones where iMessage photo scanning happens are those of children under a certain age (maybe 13?) whose parents have opted into child protection, where the phone scans for nude photos and notifies the parents.
People are conflating these two _different_ but _related_ features and their goals and limits.
> The proposal provides a way for them to comply with that requirement, but at the same time, allows them to completely lock themselves out of being able to decrypt your data in the cloud, except in this specific case, secured and controlled by your device.
If the CSAM detection were released with encrypted iCloud backups (or generally E2EE iCloud), then I think I'd probably be entirely less outraged. This narrative would make sense then.
Apple claims the on-device CSAM scanning only occurs on device if the photo in question is uploaded to iCloud. Apple is surely already scanning iCloud photos on their cloud, so without E2EE to iCloud, what's the point?
My ability to control a device I own, with Apple, was already suspect. Apple's use of on-device scanning is a slippery slope: it's not a question of if it will be used for nefarious means, but when. The claim that the scanning is only done for cloud-destination images is suspect and could be changed by gagged government coercion and a minor update.
The feature itself has little going for it in terms of efficacy either, much like most of the US's punishment bureaucracy / legal system / policing: real child predators likely already avoid the cloud. These kinds of slippery-slope arguments against crime are often used by law enforcement to erode privacy rights: they don't actually catch more bad guys, but they do spy on more law-abiding citizens.
Who's to say that in the future, law enforcement won't use this kind of technology for the war on drugs or other failed law enforcement initiatives? Law enforcement often uses violence or terrorism, for example, as justification for its own expansion and more privacy-invading initiatives, but terrorism is a very low threat, and only 4% of crimes in the US are violent (as defined by the FBI). CSAM has been used for decades as justification for the state to nose more in private citizens' business too. It's certainly an issue, but there have to be better ways to reduce child harm.
You wouldn't need 30 of them or to upload them to iCloud.
If a nation state can demand that Apple use its existing function to scan your phone for political images, then it can likewise demand that one hit be enough, and that there be no requirement for a photo to be uploaded to iCloud before the scanning starts.
They're combining Private Set Intersection and Threshold Secret Sharing in a way that means one hit isn't enough. They can't even tell how many red flags you have.
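The threshold property being claimed here can be illustrated with textbook Shamir secret sharing: a secret split into shares with threshold t is information-theoretically hidden until t shares are collected, and then reconstructs exactly. This is a minimal sketch of the threshold idea, not Apple's actual Threshold Secret Sharing construction; the prime and parameters are illustrative.

```python
# Minimal Shamir secret sharing over a prime field, to illustrate the
# "fewer than t matches reveal nothing" threshold property. Not Apple's
# actual scheme; parameters are illustrative.
import random

P = 2**61 - 1  # Mersenne prime; all arithmetic is mod P


def make_shares(secret: int, threshold: int, n: int) -> list:
    """Split `secret` into n shares; any `threshold` of them reconstruct it."""
    # Random polynomial of degree threshold-1 with the secret as constant term.
    coeffs = [secret] + [random.randrange(P) for _ in range(threshold - 1)]

    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P

    return [(x, f(x)) for x in range(1, n + 1)]


def reconstruct(shares: list) -> int:
    """Lagrange interpolation at x=0 recovers the constant term (the secret)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret


if __name__ == "__main__":
    shares = make_shares(123456789, threshold=5, n=10)
    print(reconstruct(shares[:5]) == 123456789)  # 5 shares: recovered
    print(reconstruct(shares[:4]) == 123456789)  # 4 shares: garbage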
That's not how it works. They would need to have Apple update iOS with the new hash, and then have the photo uploaded to iCloud.
As for them changing the system to just start scanning local photos globally: there's really nothing to stop them from pushing out an iOS update today that does the same thing.
> All cloud providers are required to, and do, make these scans.
No scanning is required by law. In fact, the law explicitly states that providers don't have to go out of their way to search for it. So yes, you and tons of other people are missing a big piece of this.
All cloud providers are required to, and do, make these scans.
The proposal gives them a way to comply with that requirement while, at the same time, locking themselves out of being able to decrypt your data in the cloud, except in this specific case, which is secured and controlled by your device.
This means they couldn't comply with a nation state's demand to secretly decrypt your data, even if they were legally compelled to, unless they change how the cryptography at play here works.
Which, I expect, would not go unnoticed.
Am I missing something?