Sept. 3 (UPI) -- Tech giant and iPhone maker Apple announced Friday that it's delaying a controversial technology that would scan users' smartphones for images of child sexual abuse, following backlash from privacy advocates.
The company announced the system last month and said it was designed to increase child safety. Apple said the system checks images on users' phones as they are uploaded to iCloud and can detect material already known to the National Center for Missing and Exploited Children.
On Friday, Apple said it will take more time to develop the technology after pushback from privacy advocates who fear the system could set a dangerous precedent.
"Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material," Apple said in an update posted to its website.
"Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."
Apple's technology would give parents tools to help their children navigate online communication, inform law enforcement about illegal sexual material and allow Siri and Search to block related topics.
Apple said previously that the program does not compromise users' privacy because the scans see the images as sets of numbers, analogous to digital fingerprints, through a process called hashing.
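The matching idea described above can be sketched in a few lines of code. Apple's actual system uses a proprietary perceptual hash it calls NeuralHash; the sketch below is a simplified, hypothetical illustration that substitutes a standard SHA-256 digest to show the general principle of comparing fingerprints rather than images.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Reduce the image to a fixed-length string of numbers (a "digital
    # fingerprint"). The scan compares these digests, not the pictures.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of fingerprints of known flagged images,
# standing in for the hash list supplied by child-safety organizations.
known_hashes = {fingerprint(b"example-flagged-image-bytes")}

def matches_known(image_bytes: bytes) -> bool:
    # A match means the fingerprint equals one in the database;
    # the content of non-matching images is never revealed.
    return fingerprint(image_bytes) in known_hashes

print(matches_known(b"example-flagged-image-bytes"))  # True
print(matches_known(b"an-ordinary-photo"))            # False
```

Note that a cryptographic hash like SHA-256 only matches byte-identical files; a perceptual hash such as NeuralHash is designed to also match visually similar versions of an image, which is why Apple chose that approach.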