Controversial Child Protection Features Delayed by Apple after Privacy Outcry

An article on The Verge today (September 3, 2021) confirms that Apple has delayed its controversial child protection features after a privacy outcry. Initially, these features were supposed to roll out later this year.


However, the company announced today that it will be delaying the child protection features it unveiled last month (August), including a controversial feature that would scan users' photos for child sexual abuse material (CSAM), following intense criticism that the changes could diminish users' privacy.

In light of this, Apple said in a statement to The Verge: "Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and limit the spread of Child Sexual Abuse Material."


Apple says that, based on feedback from customers, advocacy groups, researchers, and others, it has decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.

The company’s initial press release about the changes, which were intended to reduce the proliferation of child sexual abuse material (CSAM), has a similar statement at the top of the page. The release detailed three major changes in the works. One change would have Search and Siri point to resources for preventing CSAM if a user searched for information related to it.

The other two changes, however, came under more significant scrutiny. One would alert parents when their kids were receiving or sending sexually explicit photos and would blur those images for the kids. The last would have scanned images stored in a user’s iCloud Photos for CSAM and reported them to Apple moderators, who could then refer the reports to the National Center for Missing & Exploited Children (NCMEC).

Apple’s Defense: The System Does Not Weaken User Privacy

Apple detailed the iCloud photo scanning system at length to make the case that it did not weaken users’ privacy. In brief, the system scans photos on your iOS device as they are uploaded to iCloud Photos and matches them against a database of known CSAM image hashes supplied by NCMEC and other child safety organizations.
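To illustrate the general idea of matching images against a database of known hashes, here is a minimal Python sketch. It is not Apple’s implementation: Apple’s system uses a proprietary perceptual hash (NeuralHash) and cryptographic techniques so that neither the device nor Apple learns about non-matches, while this sketch uses a plain SHA-256 file digest and an in-memory set. All names and the `photos` directory are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for the database of known CSAM image hashes that
# NCMEC and other child safety organizations would supply. Apple's real
# design uses perceptual hashes (NeuralHash), not cryptographic digests,
# and compares them via cryptographic protocols rather than a plain lookup.
KNOWN_HASHES: set[str] = {
    # "ab12cd34...",  # entries would be hex digests of known images
}

def image_digest(path: Path) -> str:
    """Return a SHA-256 digest of the raw file bytes (illustration only)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_for_review(photo_dir: Path) -> list[Path]:
    """Return photos whose digest appears in the known-hash database."""
    return [
        photo
        for photo in photo_dir.glob("*.jpg")
        if image_digest(photo) in KNOWN_HASHES
    ]

if __name__ == "__main__":
    matches = flag_for_review(Path("photos"))
    print(f"{len(matches)} photo(s) matched the known-hash database")
```

The key design difference is the hash itself: a cryptographic digest like SHA-256 changes completely if a single pixel changes, whereas a perceptual hash is designed so that resized or slightly edited copies of a known image still match.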
