The plan centered on a new system that would have checked iOS devices and iCloud photos for child abuse imagery. It also included an opt-in feature that would warn minors and their parents about sexually explicit image attachments sent or received in iMessage, and blur them.
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” the company said. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Only after a certain number of hashes matched the NCMEC’s photos would Apple’s review team be alerted, allowing it to decrypt the information, disable the user’s account and alert NCMEC, which could then inform law enforcement about the existence of potentially abusive images.
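The basic idea of a match threshold can be illustrated with a simplified sketch. The Swift code below is only a hypothetical model of the concept described above: Apple’s actual system relied on its NeuralHash perceptual hashing and cryptographic threshold secret sharing, and the types, names and values here (`PhotoHash`, `MatchingService`, the sample hashes and threshold) are invented for illustration.

```swift
import Foundation

// Illustrative sketch only: a simplified, hypothetical model of threshold-based
// hash matching. The real system used NeuralHash and cryptographic threshold
// secret sharing; everything named here is made up for clarity.

struct PhotoHash: Hashable {
    let value: String   // stand-in for a perceptual hash digest
}

struct MatchingService {
    /// Hashes of known abuse imagery supplied by NCMEC (placeholder values here).
    let knownHashes: Set<PhotoHash>
    /// Number of matches required before a human review is triggered.
    let reviewThreshold: Int

    /// Counts how many of a user's photo hashes match the known set and
    /// reports whether the threshold for manual review has been crossed.
    func evaluate(userPhotoHashes: [PhotoHash]) -> (matches: Int, needsReview: Bool) {
        let matches = userPhotoHashes.filter { knownHashes.contains($0) }.count
        return (matches, matches >= reviewThreshold)
    }
}

// Example usage with made-up data.
let service = MatchingService(
    knownHashes: [PhotoHash(value: "a1"), PhotoHash(value: "b2"), PhotoHash(value: "c3")],
    reviewThreshold: 2
)
let result = service.evaluate(userPhotoHashes: [
    PhotoHash(value: "a1"),
    PhotoHash(value: "zz"),
    PhotoHash(value: "c3")
])
print("Matches: \(result.matches), review triggered: \(result.needsReview)")
// Prints: Matches: 2, review triggered: true
```

The point of the threshold in this sketch, as in Apple’s description of its plan, is that a single stray match would not be enough to expose an account; only an accumulation of matches would trigger human review.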
Many child safety and security experts praised the intent of the plan, recognizing the ethical responsibilities and obligations a company has over the products and services it creates. But they also said the efforts raised potential privacy concerns.
“When people hear that Apple is ‘searching’ for child sexual abuse materials (CSAM) on end user phones they immediately jump to thoughts of Big Brother and ‘1984,’” Ryan O’Leary, research manager of privacy and legal technology at market research firm IDC, told CNN Business last month. “This is a very nuanced issue and one that on its face can seem quite scary or intrusive.”