The Guardian

Apple Plans To Scan US iPhones For Child Sexual Abuse Images

Apple will scan photo libraries stored on iPhones in the US for known images of child sexual abuse, the company says, drawing praise from child protection groups but crossing a line that privacy campaigners warn could have dangerous ramifications.


The company will also examine the contents of end-to-end encrypted messages for the first time.

As part of new safeguards involving children, the company also announced a feature that will analyze photos sent and received in the Messages app to or from children to see if they are explicit.

Apple also is adding features in its Siri digital voice assistant to intervene when users search for related abusive material. The Cupertino, California-based technology giant previewed the three new features on Thursday and said they would be put into use later in 2021.

If Apple detects a threshold of sexually explicit photos of children in a user’s account, the instances will be manually reviewed by the company and reported to the National Center for Missing and Exploited Children, or NCMEC, which works with law enforcement agencies.

Apple said images are analyzed on a user’s iPhone and iPad in the US before they are uploaded to the cloud.

Apple said it will detect abusive images by comparing photos with a database of known Child Sexual Abuse Material, or CSAM, provided by the NCMEC.

The company is using a technology called NeuralHash, which analyzes images and converts them into a hash key, a unique set of numbers. That key is then compared against the database using cryptography. Apple said the process ensures it cannot learn anything about images that do not match the database.
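The matching flow described above can be sketched in a few lines. This is an illustrative approximation only: NeuralHash is a proprietary perceptual hash, and Apple’s real system uses cryptographic techniques (private set intersection) so the server learns nothing about non-matching images. Here a stand-in hash function and a plain set lookup model the threshold-based flagging the article describes; the function names and the threshold value are hypothetical.

```python
import hashlib

THRESHOLD = 3  # hypothetical number of matches before manual review


def image_hash(image_bytes: bytes) -> str:
    # Stand-in for NeuralHash: derives a fixed-length key from image data.
    # (The real NeuralHash is perceptual, so near-identical images map to
    # the same key; an ordinary cryptographic hash does not do that.)
    return hashlib.sha256(image_bytes).hexdigest()


def flag_account(photos, known_csam_hashes, threshold=THRESHOLD) -> bool:
    # Count photos whose hash appears in the known-image database and
    # flag the account only once the match count reaches the threshold,
    # at which point (per the article) manual review would occur.
    matches = sum(1 for p in photos if image_hash(p) in known_csam_hashes)
    return matches >= threshold
```

In this sketch, an account with fewer matches than the threshold is never flagged, mirroring the article’s point that detection is triggered only past a threshold of matching images.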

Any user who feels their account has been flagged by mistake can file an appeal, the company said.

To address privacy concerns about the feature, Apple published a white paper detailing the technology, along with third-party analyses of the protocol from multiple researchers, Al Jazeera reported.

The feature applies to Apple’s iMessage service as well as other protocols such as the Multimedia Messaging Service (MMS).
