Apple Will Report Child Exploitation Images on iCloud to Law Enforcement

Sexual predators better watch out because Apple isn’t messing around.

The tech giant announced on Thursday that it will report images of child sex abuse detected on iCloud to law enforcement.

Apple’s new system will detect known Child Sexual Abuse Material (CSAM) using a process called hashing.

Hashing transforms each image into a unique number, or hash, that corresponds to that image.
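
For readers who want a concrete picture, the sketch below shows the image-to-number step in Swift, using an ordinary cryptographic hash (SHA-256, via Apple’s CryptoKit) as a stand-in. Apple’s actual system uses a perceptual hashing function it calls NeuralHash, designed so that resized or recompressed copies of a photo still map to the same number; the cryptographic hash here only illustrates the concept.

```swift
import Foundation
import CryptoKit

// Reduce the raw bytes of an image file to a fixed-size 256-bit digest.
// The same file always produces the same digest, and any change to the
// file produces a completely different one.
func imageHash(at url: URL) throws -> String {
    let imageData = try Data(contentsOf: url)
    let digest = SHA256.hash(data: imageData)
    // Render the digest as a lowercase hex string, e.g. "9f86d081..."
    return digest.map { String(format: "%02x", $0) }.joined()
}

// Usage in a Swift script (error-handling omitted for brevity):
print(try! imageHash(at: URL(fileURLWithPath: "photo.jpg")))
```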

According to the company, the system is more private for users than previous approaches to eliminating illegal images of child sexual abuse, because it uses sophisticated cryptography on both Apple’s servers and user devices.

Though the company has already begun testing the service, most U.S. iPhone users won’t be part of it until an iOS 15 update later this year.

So how does the system work?

Before an image is stored in Apple’s iCloud, the company matches the image’s hash against a database of hashes provided by the National Center for Missing and Exploited Children (NCMEC).

The company has explained that the database will be distributed in the code of iOS beginning with an update to iOS 15. The matching process is done on the user’s iPhone, not in the cloud.
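
A rough sketch of that on-device membership check might look like the following. The hard-coded hash set is a hypothetical stand-in for the NCMEC-derived database that would ship with iOS, and SHA-256 again substitutes for Apple’s perceptual hash.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for the known-CSAM hash database that the iOS
// update would ship with. Real entries would come from NCMEC, not be
// hard-coded like this.
let knownHashes: Set<String> = [
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
]

// On-device check: hash the image bytes and test membership in the set
// before the photo leaves the phone. No image data is sent anywhere.
func matchesKnownDatabase(_ imageData: Data) -> Bool {
    let hex = SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
    return knownHashes.contains(hex)
}

// Usage: check a photo's bytes before upload (error-handling omitted).
let photo = try! Data(contentsOf: URL(fileURLWithPath: "photo.jpg"))
print(matchesKnownDatabase(photo)) // false unless its hash is in the set
```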

If Apple detects a certain number of violating files in an iCloud account, the system will upload a file that allows Apple to decrypt and see the images on that account. A person will then manually review the images to confirm whether or not there’s a match.
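
Conceptually, the threshold step works like the counter sketched below. The threshold value is invented for illustration, and the comparison is simplified: Apple has described enforcing the threshold cryptographically, so its servers cannot decrypt anything until enough matches exist, rather than with a plain counter like this.

```swift
// Illustration only: matches are tallied per account, and human review
// is triggered once the tally crosses a threshold. The threshold value
// used below is hypothetical.
struct AccountScanState {
    var matchCount = 0
    let reviewThreshold: Int

    // Record one match; returns true when the account should be
    // escalated for manual review.
    mutating func recordMatch() -> Bool {
        matchCount += 1
        return matchCount >= reviewThreshold
    }
}

// Usage: escalate after a hypothetical threshold of 30 matched images.
var state = AccountScanState(reviewThreshold: 30)
var needsReview = false
for _ in 1...30 {
    needsReview = state.recordMatch()
}
print(needsReview) // true once 30 matches have accumulated
```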

Only images that match content already known and reported to the database can be reviewed by the company.

If the person doing the manual review confirms that the system did not make an error, Apple will disable the user’s iCloud account and send a report to NCMEC or notify law enforcement if necessary.

Users will be able to file an appeal with Apple if they believe their account was flagged by mistake, according to the company.

The system only works on images uploaded to iCloud, and users can turn those uploads off. This means that photos or other images on a device that haven’t been uploaded to Apple’s servers won’t be part of the system.

Disclaimer: We have no position in any of the companies mentioned and have not been compensated for this article.
