Apple to scan iPhones for child sex abuse images

Apple has announced details of a new safety tool designed to identify child sexual abuse material (CSAM) on users’ phones. The new technology will allow Apple to detect known CSAM images stored in iCloud Photos and report them to law enforcement agencies.
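Matching "known" images is typically done not by inspecting photo content directly but by comparing perceptual hashes against a database of hashes of previously identified material. The toy Python sketch below illustrates the general idea only; it is an assumption-laden simplification, not Apple's actual NeuralHash system, which uses a learned hash function and cryptographic protocols to keep the matching private.

```python
# Illustrative sketch only: a toy "average hash" comparison, NOT Apple's
# NeuralHash. Real systems use learned perceptual hashes plus
# cryptographic techniques (e.g. private set intersection) so the
# provider cannot see non-matching photos.

def average_hash(pixels):
    """Each bit records whether a pixel is brighter than the image's
    mean intensity, giving a hash that tolerates small changes."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

def matches_known(image_hash, known_hashes, threshold=1):
    """An image 'matches' if its hash is within a small Hamming
    distance of any hash in the known-content database."""
    return any(hamming(image_hash, k) <= threshold for k in known_hashes)

# Hypothetical example data: a known image and a near-duplicate of it.
known = {average_hash([[10, 200], [30, 220]])}
near_duplicate = average_hash([[12, 198], [28, 225]])
unrelated = average_hash([[250, 5], [240, 10]])

print(matches_known(near_duplicate, known))  # True: hashes nearly identical
print(matches_known(unrelated, known))       # False: hashes differ
```

Because the hash is derived from coarse image structure, slight edits (recompression, small crops) still match, while unrelated photos do not; this is why such systems can detect only previously catalogued images, not new material.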

Apple will also launch a new feature in the Messages app which will warn children and their parents using linked family accounts when sexually explicit photos are sent or received.

The tool will warn a child when they are sent a sexually explicit photo, blurring the image, reassuring them that it is OK not to view it, and pointing them to helpful resources.

The iPhone maker said the new detection tools have been designed to protect user privacy and do not allow the tech giant to see or scan a user’s photo album.

Read the full story from the Evening Standard below:

Source: Apple to scan iPhones for child sex abuse images
