Aug. 5 (UPI) -- Apple on Thursday announced it will implement a system to identify images of child exploitation uploaded to its iCloud storage service and report them to law enforcement.
The system will detect content featuring Child Sexual Abuse Material, or CSAM, already known to the National Center for Missing and Exploited Children as it is uploaded to the cloud storage service in the United States. It relies on a method known as hashing, which transforms each image into a unique set of corresponding numbers, Apple said in a statement.
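Apple's system uses a perceptual hash, NeuralHash, designed so that visually identical images produce the same value even after resizing or recompression. As a rough illustration of hashing in general, the sketch below uses an ordinary cryptographic hash (SHA-256 from Apple's CryptoKit framework) to reduce hypothetical image bytes to a fixed digest; it is not NeuralHash.

```swift
import CryptoKit
import Foundation

// Reduce raw image bytes to a fixed-length digest. Illustration only:
// Apple's system uses a perceptual hash (NeuralHash), not SHA-256, so that
// visually similar images map to the same value.
func imageHash(_ imageData: Data) -> SHA256Digest {
    SHA256.hash(data: imageData)
}

let photoBytes = Data([0x89, 0x50, 0x4E, 0x47])  // hypothetical image bytes
let digest = imageHash(photoBytes)
print(digest.map { String(format: "%02x", $0) }.joined())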
The new technology in Apple's iOS and iPadOS will match an image's hash against a database of CSAM hashes provided by the NCMEC before the image is uploaded to iCloud. If a certain number of violating files are found in an iCloud account, Apple will manually review the images to determine whether there is a match.
If a match is detected, Apple will disable the user's iCloud account and send a report to the NCMEC or notify law enforcement.
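Stripped of its cryptography, the pipeline amounts to a membership check plus a counter. The Swift sketch below is a plaintext stand-in for that logic only; every name in it is hypothetical, and in Apple's actual design the device never sees the database or the match results in the clear, as described next.

```swift
import Foundation

// Simplified logical flow: compare each upload's hash to the known set and
// escalate for human review only past a threshold. Purely illustrative.
struct UploadScanner {
    let knownHashes: Set<Data>   // hypothetical plaintext stand-in for the NCMEC hash database
    let threshold: Int
    private(set) var matchCount = 0

    mutating func willUpload(_ imageHash: Data) -> Bool {
        if knownHashes.contains(imageHash) { matchCount += 1 }
        return matchCount >= threshold   // true = account flagged for manual review
    }
}
```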
Apple said the program maintains user privacy, as the database is stored on users' devices as an unreadable set of hashes and a cryptographic technique known as private set intersection is used to determine a match without revealing what is in the image.
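Private set intersection lets two parties discover which items they share without either side revealing its full set. The toy below sketches the general idea using commutative blinding (x^(ab) = x^(ba) mod p) over a deliberately tiny prime; it is a classroom illustration of the technique, not Apple's protocol, and all names and values in it are hypothetical.

```swift
// Toy Diffie-Hellman-style private set intersection over Z_p*
// (tiny prime, no real security; a sketch of the idea only).
let p: UInt64 = 65537   // p - 1 = 2^16, so any odd exponent is invertible

func modPow(_ base: UInt64, _ exp: UInt64) -> UInt64 {
    var r: UInt64 = 1, b = base % p, e = exp
    while e > 0 {
        if e & 1 == 1 { r = r * b % p }
        b = b * b % p
        e >>= 1
    }
    return r
}

// Map an identifier into the group via FNV-1a (stand-in for a real hash-to-group).
func toGroup(_ s: String) -> UInt64 {
    var h: UInt64 = 0xcbf29ce484222325
    for byte in s.utf8 { h = (h ^ UInt64(byte)) &* 0x100000001b3 }
    return 2 + h % (p - 2)
}

// Odd secret exponents, so blinding is a bijection on the group.
let deviceKey = 2 * UInt64.random(in: 0..<((p - 1) / 2)) + 1
let serverKey = 2 * UInt64.random(in: 0..<((p - 1) / 2)) + 1

// The device blinds its photo's hash; the server blinds it a second time.
let blindedPhoto = modPow(toGroup("photo-123"), deviceKey)
let doublyBlinded = modPow(blindedPhoto, serverKey)

// The server publishes its database blinded once; the device blinds each entry
// again. Because the exponents commute, equal items collide, yet neither side
// ever sees the other's raw hashes.
let database = ["bad-1", "bad-2", "photo-123"].map { modPow(toGroup($0), serverKey) }
let inSet = database.contains { modPow($0, deviceKey) == doublyBlinded }
print(inSet)   // true: "photo-123" is in both sets
```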
If the threshold of CSAM matches is reached, the system uploads a file that allows Apple to decrypt the flagged images so that a person can conduct the review.
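That unlock-only-past-a-threshold behavior is characteristic of threshold secret sharing, in which a decryption secret can be reconstructed from any t shares but t - 1 shares reveal nothing. The toy Shamir implementation below, over a tiny prime field with made-up values, illustrates the general scheme rather than Apple's exact construction.

```swift
// Toy Shamir secret sharing over GF(q), q = 65521 (a prime; illustration only).
let q: UInt64 = 65521

func power(_ base: UInt64, _ exp: UInt64) -> UInt64 {
    var r: UInt64 = 1, b = base % q, e = exp
    while e > 0 {
        if e & 1 == 1 { r = r * b % q }
        b = b * b % q
        e >>= 1
    }
    return r
}
func inverse(_ a: UInt64) -> UInt64 { power(a, q - 2) }   // Fermat's little theorem

// Split a secret into n shares; any t of them reconstruct it.
func makeShares(secret: UInt64, t: Int, n: Int) -> [(x: UInt64, y: UInt64)] {
    var coeffs = [secret % q]                              // constant term is the secret
    for _ in 1..<t { coeffs.append(UInt64.random(in: 0..<q)) }
    return (1...n).map { i in
        let x = UInt64(i)
        var y: UInt64 = 0, xp: UInt64 = 1
        for c in coeffs { y = (y + c * xp) % q; xp = xp * x % q }
        return (x: x, y: y)                                // polynomial evaluated at x = i
    }
}

// Lagrange interpolation at x = 0 recovers the constant term (the secret).
func recover(_ shares: [(x: UInt64, y: UInt64)]) -> UInt64 {
    var secret: UInt64 = 0
    for (i, s) in shares.enumerated() {
        var num: UInt64 = 1, den: UInt64 = 1
        for (j, o) in shares.enumerated() where j != i {
            num = num * (q - o.x % q) % q                  // (0 - x_j)
            den = den * ((s.x + q - o.x) % q) % q          // (x_i - x_j)
        }
        secret = (secret + s.y * num % q * inverse(den)) % q
    }
    return secret
}

let shares = makeShares(secret: 4242, t: 3, n: 5)
print(recover(Array(shares.prefix(3))))   // 4242: three shares suffice
print(recover(Array(shares.prefix(2))))   // garbage: below the threshold
```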
The system only works on images uploaded to iCloud, which users can disable, and Apple will only be able to review content already included in the NCMEC database. The company said the threshold will provide an "extremely high level of accuracy" to ensure less than a one in 1 trillion chance per year of incorrectly flagging a given account.
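Apple has not published its threshold or per-image error rate, but the shape of such a claim is simple arithmetic: given a per-image false-match probability and a match threshold t, the chance that an account is wrongly flagged is a binomial tail that shrinks rapidly as t grows. The sketch below uses made-up placeholder numbers, not Apple's parameters.

```swift
import Foundation

// Chance that an account with n photos accrues at least t false matches,
// given a per-image false-match probability fp: a binomial tail.
// All parameter values below are hypothetical; Apple has not published its own.
func falseFlagProbability(n: Int, t: Int, fp: Double) -> Double {
    var tail = 0.0
    for k in t...n {
        // log C(n, k) via lgamma keeps the huge/tiny factors numerically stable
        let logChoose = lgamma(Double(n + 1)) - lgamma(Double(k + 1)) - lgamma(Double(n - k + 1))
        tail += exp(logChoose + Double(k) * log(fp) + Double(n - k) * log1p(-fp))
    }
    return tail
}

// e.g. 10,000 photos, a one-in-a-million per-image false match, threshold 5:
print(falseFlagProbability(n: 10_000, t: 5, fp: 1e-6))   // ~8e-13, below one in a trillion
```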
Apple began testing the system on Thursday, but it will be widely distributed among devices along with an update to iOS 15, CNBC reported.
The update will include other features aimed at preventing child sexual exploitation, including machine learning that determines whether a child under 13 is receiving or sending sexually explicit content in iMessage and warns them and their parents, as well as updates to Siri that provide information on how to report child exploitation.