Apple Announces Measures to Limit the Spread of Child Sexual Abuse Material

The company is introducing child safety features in three areas, developed in collaboration with child safety experts.

First, the Messages app will use on-device machine learning to warn children and parents about sensitive content, while keeping private communications unreadable to Apple.

Next, iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online while preserving user privacy. Detection of this material in iCloud Photos will help Apple provide law enforcement with valuable information about collections of CSAM.

Lastly, Siri and Search will provide parents and children with expanded information and assistance if they encounter unsafe situations. Siri and Search will also intervene when users try to search for CSAM-related topics.

These features will arrive later this year in updates to iOS 15, iPadOS 15, watchOS 8 and macOS Monterey.

The Messages app will add new tools to warn children and their parents when they receive or send sexually explicit photos.

Upon receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured that it is okay if they do not want to view the photo. As an added precaution, the child can also be told that, to make sure they are safe, their parents will receive a message if they do view it.

There are similar protections if a child tries to send sexually explicit photos. The child will be notified before sending the photo, and parents can receive a message if the child decides to send it.

