
6th Aug 2021

Apple to start scanning people’s messages and photos to find child abuse

Charlie Herbert

The company has insisted checks will be done with ‘user privacy in mind’ – but not everyone is convinced.

Apple is launching a new feature that will allow its devices to scan through people’s photos and messages to check for signs of abuse in a move that has concerned privacy experts.

Three new measures will be introduced later this year: one that scans messages, one that scans photos, and additional safety features added to Siri. They will arrive in the next updates across all Apple devices – iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.

To begin with, the rollout will be limited to the US.

Apple has said that the feature will be introduced in a way that keeps communications hidden from the company.

The first of the features will use on-device machine learning to check the content of children’s messages for photos that look like they may be sexually explicit. According to Apple, this analysis will be done entirely on the phone itself, and the company will not be able to see the messages.

If a child receives such a photo, it will be blurred and the child will be warned that it could be sensitive, shown information about this kind of material, and given the option to block the contact.

If the child then decides to view the image anyway, they will be told that their parents will be alerted, and a notification will then be sent to the parents.

Similar protections are in place when children send messages, Apple said. Children will be warned before the photo is sent and parents can set up notifications when their child sends a photo that triggers the feature.
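To make that flow concrete, here is a minimal sketch in Swift of how an on-device screening step of this kind could be structured. Every name in it – the ImageScreener type, the classify closure, the threshold – is a hypothetical stand-in, not Apple’s actual API; the point it illustrates is that the decision to blur and warn is made locally, so the image never has to leave the device.

```swift
import Foundation

// Hypothetical on-device flow; all names are illustrative.
enum ScreeningResult {
    case allow
    case blurAndWarn   // image looks sensitive: blur it and warn the child
}

struct ImageScreener {
    // Stand-in for an on-device ML model (e.g. a Core ML classifier).
    // Returns a score in [0, 1]; higher means more likely explicit.
    let classify: (Data) -> Double
    let threshold: Double

    // The classification runs entirely on the device; neither the
    // image nor the score is sent to a server.
    func screen(_ incomingImage: Data) -> ScreeningResult {
        classify(incomingImage) >= threshold ? .blurAndWarn : .allow
    }
}

// Usage sketch with a dummy model standing in for the real one.
let screener = ImageScreener(classify: { _ in 0.92 }, threshold: 0.8)
switch screener.screen(Data()) {
case .allow:       print("Display the image normally")
case .blurAndWarn: print("Blur the image and show the warning sheet")
}
```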

The second feature may prove to be more controversial. It looks through a device’s photos for known Child Sexual Abuse Material (CSAM). Technology in iOS and iPadOS will scan through people’s iCloud photo libraries looking for such images – but in a way that the company claims will be done “with user privacy in mind”.

The scanning will not take place in the cloud but on the device itself.
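Apple’s technical summary describes this as hash matching rather than image viewing: each photo is reduced to a perceptual hash and compared against a database of hashes of known abuse imagery. The Swift sketch below illustrates that basic idea; the CSAMMatcher type and its dummy hash function are hypothetical, and the real design layers on cryptography (a blinded hash database, and a threshold of matches before anything can be revealed) that is omitted here.

```swift
import Foundation

// Simplified sketch of on-device hash matching; not the real protocol.
struct CSAMMatcher {
    // Hashes of known abuse images (e.g. curated by NCMEC). In the
    // real design these ship to the device in a blinded form.
    let knownHashes: Set<String>

    // Stand-in for a perceptual hash: similar images should yield
    // the same digest even after resizing or recompression.
    let perceptualHash: (Data) -> String

    // True if the photo's hash matches a known hash. In Apple's
    // design a single match reveals nothing on its own; only once a
    // threshold of matches is crossed can the results be reviewed.
    func matches(_ photo: Data) -> Bool {
        knownHashes.contains(perceptualHash(photo))
    }
}

// Usage sketch with a dummy hash function.
let matcher = CSAMMatcher(
    knownHashes: ["a1b2c3"],
    perceptualHash: { _ in "a1b2c3" }
)
print(matcher.matches(Data()))  // true in this toy example
```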

Professor Matthew Green of Johns Hopkins University said that although this scanning technology is better than more traditional methods, such systems are still “mass surveillance tools.”

“The theory is that you will trust Apple to only include really bad images,” he wrote on Twitter.

“Say, images curated by the National Center for Missing and Exploited Children (NCMEC).

“You’d better trust them, because trust is all you have.”
