
Apple removes controversial child porn feature

Known as CSAM detection, this was one of the features announced by Apple in late 2021 to combat the spread of child sexual abuse material. However, the American company's approach raised serious concerns that have now led to its abandonment.

The news was given by an Apple representative to WIRED. After several months spent trying to develop the best possible tool, the technology giant has ended up abandoning this preventive measure.

Apple drops controversial CSAM feature

"After extensive consultation with experts to gather feedback on the child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021. We have decided not to move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all."

Apple

These were the statements with which the American company announced the end of development of this controversial tool. Protecting minors will remain a priority for the company, but it will now be pursued with a different approach.

As a reminder, CSAM detection was an automatic tool for detecting content related to child sexual abuse. It examined users' photos stored in iCloud in search of illegal material.

The results would be cross-referenced with a predefined database. If a positive match was detected, the image would automatically be blurred to prevent the spread of this content.

In addition, someone at Apple would manually verify each flagged photo. If the presence of illegal content related to child abuse was confirmed, it would then be reported to the competent authorities.
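The pipeline described above — hash the photo, check it against a database of known material, and queue any match for human review — can be sketched in a few lines. This is a simplified illustration only: Apple's real design used perceptual "NeuralHash" values matched via cryptographic private set intersection, not the plain SHA-256 lookup and the hypothetical names shown here.

```python
import hashlib

# Hypothetical database of hashes of known illegal images.
# (Apple's actual system matched perceptual NeuralHash values on-device,
# so visually similar images would match; SHA-256 only matches exact bytes.)
KNOWN_CONTENT_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def image_hash(image_bytes: bytes) -> str:
    """Return a hex digest identifying the image content."""
    return hashlib.sha256(image_bytes).hexdigest()


def flag_for_review(image_bytes: bytes) -> bool:
    """Return True if the image matches the known-content database,
    meaning it would be queued for manual human verification."""
    return image_hash(image_bytes) in KNOWN_CONTENT_HASHES
```

An ordinary photo whose hash is not in the database simply passes through: `flag_for_review(b"holiday photo")` returns `False`, and nothing is blurred or reported.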

Despite Apple's good intentions with this tool, its operation raised serious fears about user privacy. The automatic scanning was seen as an outright attack on users' privacy, effectively opening a back door on every device.

After this tool's various advances and setbacks, Apple has now decided to end its development. In its place, the company announced the availability of end-to-end encryption for all content on iCloud.
