Apple’s Plans to Scan Phones for Child Abuse Material Have Been Put on Hold.

According to The Verge, Apple has officially postponed the implementation of its contentious plans to scan iPhones for child sexual abuse material (CSAM).

“We revealed plans last month for capabilities to help safeguard children from predators who use communication tools to recruit and exploit them, as well as limit the distribution of Child Sexual Abuse Material,” Apple said in a statement.

It’s a noteworthy, and unusual, instance of self-reflection. Apple is paying attention to its detractors. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” the statement continues.

Apple revealed last month that it would scan photos uploaded to its iCloud Photos service by comparing hashes, unreadable digital representations of each image, against an existing database of known CSAM. If it found positive matches, the company would notify the National Center for Missing and Exploited Children (NCMEC). Apple also announced a separate feature that would notify parents if their child received or sent sexually explicit photos, and would automatically blur those images.
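
To make the hash-matching idea above concrete, here is a minimal Python sketch that flags files whose digests appear in a set of known hashes. It is an assumption-laden toy, not Apple's design: the announced system used perceptual hashes computed on-device and a privacy-preserving matching protocol rather than the byte-level SHA-256 comparison shown here, and every name in the snippet (KNOWN_HASHES, flag_matches, the pending_uploads folder) is hypothetical.

```python
# Illustrative sketch only. Apple's announced system relies on perceptual
# hashing and on-device matching before upload, not the plain cryptographic
# hashing shown here. All names below are hypothetical.
import hashlib
from pathlib import Path

# Hypothetical stand-in for a database of known-image hashes (hex digests).
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def image_hash(path: Path) -> str:
    """Return a hex digest of the file's bytes.

    A real system would use a perceptual hash so that visually identical
    images still match after resizing or re-encoding; SHA-256 only matches
    byte-for-byte identical files.
    """
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_matches(upload_dir: Path) -> list[Path]:
    """Return the files whose hashes appear in the known-hash set."""
    return [
        p for p in upload_dir.glob("*.jpg")
        if image_hash(p) in KNOWN_HASHES
    ]

if __name__ == "__main__":
    for match in flag_matches(Path("pending_uploads")):
        print(f"match found: {match}")  # a real pipeline would report, not print
```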

Many internet privacy advocates were alarmed by the announcement, with some arguing that it could set a dangerous precedent for Apple scanning for other kinds of data in the future. That would be a particularly concerning development in countries such as India and China, where governments exert greater influence over what is disseminated online.

“It’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children,” the Electronic Frontier Foundation said at the time. “As a consequence, even a well-intentioned attempt to build such a system will break key promises of the messenger’s encryption and open the system to broader abuses.”

Apple has put its plans to scan iPhones for CSAM on hold for the time being. It remains unclear what the company’s reworked child safety features will look like, but whatever form they take, privacy advocates will undoubtedly scrutinize them.

Source: https://www.theverge.com/2021/9/3/22655644/apple-delays-controversial-child-protection-features-csam-privacy