Apple to scan iPhones to fight child pornography, company accused of spying

Apple plans to scan iPhones, iPads and iCloud for child pornography. Specifically, the California giant will use algorithms to detect signs of sexual abuse involving children. Many privacy advocates fear that this new feature could be diverted from its original purpose.

This Thursday, August 5, 2021, Apple announced a series of new features in iOS 15 aimed at curbing the spread of child pornography. The Cupertino company will now identify sexual images involving children on iPhones, iPads and its iCloud servers in the United States.

Photos stored on iCloud or exchanged via iMessage will be analyzed by algorithms that compare each image’s unique digital signature with the digital signatures of known child abuse photos held in a database. To power the system, Apple will rely on photos provided by the National Center for Missing and Exploited Children (NCMEC).
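
To give a rough idea of how this kind of signature matching works, here is a minimal, purely illustrative sketch in Swift. It uses an ordinary SHA-256 hash and an in-memory set of known signatures; Apple’s actual system relies on a perceptual hash (NeuralHash) and cryptographic private set intersection, and the names below are hypothetical.

```swift
import Foundation
import CryptoKit

// Purely illustrative: Apple's real pipeline uses a perceptual hash
// (NeuralHash) and private set intersection, not a plain SHA-256 lookup.

// Hypothetical set of known signatures, which in practice would be derived
// from the NCMEC-provided material.
let knownSignatures: Set<String> = []

/// Compute a stand-in "digital signature" for an image's raw bytes.
func signature(of imageData: Data) -> String {
    let digest = SHA256.hash(data: imageData)
    return digest.map { String(format: "%02x", $0) }.joined()
}

/// Returns true when the image's signature matches an entry in the database.
func matchesKnownMaterial(_ imageData: Data) -> Bool {
    knownSignatures.contains(signature(of: imageData))
}
```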

Edward Snowden criticizes iOS 15’s new measures

If the signature matches, the photo will be tagged. Once an account exceeds a certain number of tagged photos, the flagged images will be checked for objectionable content and an Apple employee can then review them. If the content proves illegal, the account will be reported to the authorities.
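
The threshold logic described above can be pictured with a small sketch like the one below. The threshold value and the type are assumptions made purely for illustration; Apple did not publish an exact figure in this announcement.

```swift
// Illustrative only: the threshold value and this structure are assumptions,
// not Apple's published implementation.
struct AccountScanState {
    private(set) var taggedPhotoCount = 0
    let reviewThreshold: Int

    init(reviewThreshold: Int = 30) {   // assumed value for illustration
        self.reviewThreshold = reviewThreshold
    }

    /// Record one photo whose signature matched the database.
    mutating func recordMatch() {
        taggedPhotoCount += 1
    }

    /// Human review (and a possible report to the authorities) is only
    /// considered once the number of tagged photos passes the threshold.
    var requiresHumanReview: Bool {
        taggedPhotoCount >= reviewThreshold
    }
}
```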

Along the same lines, Apple will keep an eye on images exchanged on children’s accounts linked to a family subscription. If a child sends a sexually explicit image, they will receive a warning from iOS. Apple also reserves the right to notify the parents. “We want to help protect children from predators who use communication tools to recruit and exploit them and limit the spread of child pornography,” Apple explains in a press release.


These measures generated a wave of criticism. For many privacy advocates, Apple is going too far, despite its laudable intentions. “Apple is replacing its end-to-end encrypted messaging system with a surveillance and censorship infrastructure that will be vulnerable to abuse and scope creep, not just in the United States but around the world,” deplores the Center for Democracy and Technology (CDT).

For the CDT, Apple is knowingly building a back door into its iPhones. “The mechanism that will allow Apple to scan images in iMessage is not an alternative to a back door – it is a back door. Client-side scanning at one end of the communication breaks the security of the transmission, and informing a third party (the parent) of the content of the communication is an invasion of privacy,” the organization asserts.

For his part, whistleblower Edward Snowden fears that the system implemented by Apple will end up being exploited for other purposes. “No matter how good the intentions, Apple is rolling out mass surveillance around the world. Make no mistake: if they can scan for child pornography today, they can scan for anything tomorrow,” warns Snowden.
