Apple: System for Identifying Child Abuse Photos Does Not Create a Back Door

Apple has responded to concerns about its next steps to limit the spread of child abuse material.

The company claims that its tool to locate child abuse images on a user’s device does not create a back door or reduce privacy.

The company reiterated that it does not scan the device owner’s entire photo library for abusive images. Instead, it uses encryption to compare images against a known database provided by the National Center for Missing & Exploited Children.
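The matching step can be pictured as a lookup of an image’s digest against a list of known digests. The Swift sketch below is only an illustration of that idea: the `loadKnownDigests()` helper is hypothetical, and it uses a plain SHA-256 hash, whereas Apple’s actual system relies on a perceptual image hash and an encrypted matching protocol rather than a simple lookup.

```swift
import CryptoKit
import Foundation

// Illustration only: Apple's real system compares perceptual image hashes
// through an encrypted matching protocol, not a plain SHA-256 lookup.

// Hypothetical stand-in for the database of known-image digests.
func loadKnownDigests() -> Set<String> {
    // In a real deployment the list would ship in a blinded, encrypted form.
    return []
}

let knownDigests = loadKnownDigests()

/// Returns true if the photo's digest appears in the known database.
func matchesKnownDatabase(_ photoData: Data) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownDigests.contains(hex)
}
```

A plain cryptographic hash like this would miss any re-encoded or resized copy of an image, which is why a perceptual hash is used in practice.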

Some privacy advocates were concerned after Apple announced that the company would scan users’ photos. However, the company uses an algorithm on the device itself, not on its servers, to detect sexually explicit images.

Apple said it would manually review flagged photos from a user’s device only if the algorithm finds a certain number of matches. The company also said it can adjust the algorithm over time.
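The threshold behavior described above can be sketched as a simple counter: nothing is surfaced for human review until the number of matches crosses a limit. The snippet below is a simplified stand-in with a hypothetical `MatchCounter` type and an example threshold value; Apple’s actual design keeps match data cryptographically unreadable until the threshold is reached, which a plain counter does not capture.

```swift
// Simplified stand-in for threshold-gated review.
struct MatchCounter {
    let reviewThreshold: Int   // hypothetical parameter, adjustable over time
    var matchCount = 0

    // True only once enough matches have accumulated for manual review.
    var readyForReview: Bool { matchCount >= reviewThreshold }
}

var counter = MatchCounter(reviewThreshold: 30)   // example value only
counter.matchCount += 1
print(counter.readyForReview)                     // false until the threshold is reached
```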

Apple also said that a new feature in the Messages app, which scans photos sent to or from a child’s iPhone for explicit material, does not break end-to-end encryption, nor will the company have access to the user’s messages.

On Thursday, the Electronic Frontier Foundation said that Apple is opening a back door with the new tools.

Apple said the system has been in development for years and wasn’t built for governments to monitor citizens. The system is only available in the US, Apple said, and it only works if the user has iCloud Photos enabled.

Dan Boneh, a cryptography researcher who worked with Apple on the project, defended the new tools.
