Apple has announced details of a new safety tool designed to identify child sexual abuse material (CSAM) on users’ phones. The new technology will allow Apple to detect known CSAM images stored in iCloud Photos and report them to law enforcement agencies.
Apple will also launch a new feature in the Messages app which will warn children and their parents using linked family accounts when sexually explicit photos are sent or received.
When a child is sent a sexually explicit photo, the tool will blur the image, show a warning, reassure them that it is OK not to view the image, and present them with helpful resources.
The iPhone maker said the new detection tools have been designed to protect user privacy and do not allow the company to see or scan a user's photo album.
Read the full story from the Evening Standard below: