Apple to scan iPhones for child sex abuse images

Apple has announced details of a new safety tool designed to identify child sexual abuse material (CSAM) on users’ phones. The new technology will allow Apple to detect known CSAM images stored in iCloud Photos and report them to law enforcement agencies.
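Apple's published technical summary describes this detection as matching image hashes against a database of hashes of known CSAM, rather than analysing photo content directly. As a loose illustration only (Apple's real system uses a perceptual hash called NeuralHash plus cryptographic protocols, not an exact digest), a known-hash lookup might be sketched like this:

```python
import hashlib

# Hypothetical set of known-image digests. This is an illustrative
# stand-in: Apple's actual database holds perceptual (NeuralHash)
# hashes, which tolerate resizing and recompression, unlike the exact
# SHA-256 digest used here.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def matches_known(image_bytes: bytes) -> bool:
    """Return True if this image's digest appears in the known-hash set."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

print(matches_known(b"known-image-bytes"))  # True
print(matches_known(b"new-photo-bytes"))    # False
```

The key point the sketch captures is that only membership in a pre-existing list of known material is tested; no judgement is made about new or unknown images.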

Apple will also launch a new feature in the Messages app which will warn children and their parents using linked family accounts when sexually explicit photos are sent or received.

When a child is sent a sexually explicit photo, the tool will blur the image, show a warning, reassure them that it is OK not to view it, and point them to helpful resources.

The iPhone maker said the new detection tools have been designed to protect user privacy and do not allow the tech giant to see or scan a user’s photo album.

Read the full story from the Evening Standard below:

Source: Apple to scan iPhones for child sex abuse images

