Sexual predators better watch out because Apple isn’t messing around. The tech giant announced on Thursday that it will report images of child sexual abuse detected on iCloud to law enforcement. Apple’s new system will detect known Child Sexual Abuse Material (CSAM) using a process called hashing: images are transformed into unique numbers that correspond to their content, and those numbers are then matched against a database of hashes of known CSAM.
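To make the hashing idea concrete, here is a minimal sketch in Python. It is not Apple’s implementation: Apple’s system reportedly uses a perceptual hash (NeuralHash) that tolerates resizing and recompression, whereas this sketch uses a plain SHA-256 digest, which only matches byte-identical files. The `KNOWN_BAD_HASHES` set and the `uploads` folder are hypothetical stand-ins for a real hash database, such as the one maintained by NCMEC.

```python
import hashlib
from pathlib import Path

# Hypothetical database of hashes of known abuse images. In practice this
# list comes from a child-safety organization, not from the device itself.
KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def hash_image(path: Path) -> str:
    """Reduce an image file to a fixed-length number (a SHA-256 hex digest)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matches_known_material(path: Path) -> bool:
    """Flag an image if its hash appears in the database of known hashes."""
    return hash_image(path) in KNOWN_BAD_HASHES

if __name__ == "__main__":
    # Hypothetical upload directory; only matching files would be flagged.
    for image in Path("uploads").glob("*.jpg"):
        if matches_known_material(image):
            print(f"{image} matches a known hash -- would be flagged for review")
```

The appeal of this design is that the scanner never needs to look at the content of non-matching photos: only images whose hashes match the database are flagged, and Apple has said flagged matches go through human review before any report is made.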