Hacker News

They could comply with a nation state's demands to scan a billion iPhones for a politically dangerous photo.


You'd need 30 of them. And upload them to iCloud.

I guess the neural part is a bit of a black box; I'm not sure if there's a way for external parties to verify its legitimacy.

But again, any change to it would get noticed.


You wouldn't need 30 of them or to upload them to iCloud.

If a nation state can demand Apple uses their existing function to scan your phone for political images, then they can likewise demand that 1 hit would be enough, and that there's no requirement for it to be uploaded to iCloud before the scanning can start.


https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...

Read the technical documentation.

They're combining Private Set Intersection and Threshold Secret Sharing in a way that means 1 hit isn't enough. They can't even tell how many red flags you have.
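The threshold property being described can be illustrated with plain Shamir secret sharing over a prime field. This is a toy stand-in, not Apple's actual construction (which combines it with PSI and per-image cryptographic vouchers): the decryption key is split so that any t shares reconstruct it, while fewer than t shares reveal nothing.

```python
# Toy Shamir threshold secret sharing: any t of n shares reconstruct the
# secret; t-1 shares carry no information about it. Sketch only, not
# Apple's implementation.
import random

P = 2**127 - 1  # prime modulus defining the field the shares live in

def make_shares(secret, t, n):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total
```

With a threshold of 30, any 30 matching images would together reveal the key, but 29 reconstruct only field garbage.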


A nation state would have to require Apple to change the algorithm on the device, requiring an OS level update.

Just by forcing Apple to add a bunch of image hashes they can't do anything, it'd still require multiple matches before Apple can see the images.

Oh and the OS sends fake encrypted "matches" at random to Apple's servers so that you can't use even that to fingerprint the user.
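A toy illustration of why those fake matches defeat traffic analysis (my own sketch, not Apple's code or protocol): if every voucher is a fixed-size opaque blob and the client mixes in synthetic "matches" at random, the server can't infer the true match count from what it receives.

```python
# Toy sketch of synthetic vouchers. Every voucher reaching the server is
# an indistinguishable fixed-size blob; dummies are mixed in at random,
# so upload traffic doesn't reveal how many real matches exist.
# Illustration only, not Apple's implementation.
import os
import random

def voucher():
    """Stand-in for an encrypted safety voucher: to the server, match,
    non-match, and synthetic vouchers are all same-sized opaque bytes."""
    return os.urandom(32)

def upload_stream(n_photos, dummy_rate=0.05):
    out = []
    for _ in range(n_photos):
        out.append(voucher())               # one voucher per photo
        if random.random() < dummy_rate:
            out.append(voucher())           # synthetic "match"
    return out
```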


That's not how it works. They would need to have Apple update iOS with the new hash, and then the photo would have to be uploaded to iCloud.

If they could change the system to just start scanning local photos globally, there really is nothing to stop them from pushing out an iOS update today that does the same thing.



