Matthew Green, professor of cryptography at Johns Hopkins University, published a tweet claiming that Apple may soon introduce a feature to identify child abuse photos on users’ iPhones.
A hash is a cryptographic function that, using a mathematical algorithm, transforms any block of data (for example, an image) into a series of fixed-length characters. This hash is, in a way, a fingerprint that identifies the image.
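As a minimal sketch of that fingerprint idea, the snippet below hashes two byte strings with SHA-256 (chosen only for illustration; Apple has not said which hash function its system uses). Any input, large or small, produces a digest of the same fixed length, and the same input always produces the same digest.

```python
import hashlib

# Stand-ins for the raw bytes of two image files.
small_input = b"tiny"
large_input = b"a much longer block of image data" * 1000

small_digest = hashlib.sha256(small_input).hexdigest()
large_digest = hashlib.sha256(large_input).hexdigest()

# Both digests are 64 hex characters, regardless of input size.
print(len(small_digest), len(large_digest))  # 64 64

# Hashing is deterministic: the same bytes give the same fingerprint.
print(hashlib.sha256(small_input).hexdigest() == small_digest)  # True
```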
Apparently, Apple will generate the hashes of camera roll images on the iPhone itself. Instead of uploading your photos and comparing them to a database, Apple will download a set of hashes of known child abuse photos to your device and compare them against the photos in your camera roll.
If no match is found, nothing happens; but if a photo on your device matches one of the downloaded hashes, a manual review process will likely be triggered. Apple hasn’t officially announced anything yet, so it’s unclear exactly how it will work.
Green points out some potential complications, such as who controls and verifies these “fingerprints.” Beyond seeking out images of child abuse, governments could use such a system to suppress activism and political opponents.
On the other hand, it is worth remembering that, as 9to5Mac points out, photos uploaded and stored in iCloud Photos are not encrypted end-to-end. Although they are stored encrypted on Apple’s servers, the company holds the decryption keys, which it would have to hand over to authorities if it received a subpoena.