In a surprise announcement on Friday, Apple said it would take more time to improve its controversial child safety tools before introducing them.
More feedback requested
The company says it plans to gather more feedback and improve the system, which had three key components: scanning iCloud Photos for child sexual abuse material (CSAM), on-device scanning of Messages to protect children, and Search guidance designed to keep children safe.
Since Apple announced the tools, it has faced a deluge of criticism from concerned individuals and rights groups around the world. The argument the company struggled most to counter was the potential for repressive governments to force Apple to monitor for more than CSAM.
Who watches the watchers?
Edward Snowden, the former US intelligence contractor turned privacy advocate, warned on Twitter: “Make no mistake: if they can scan for child porn today, they can scan for anything tomorrow.”
Critics said these tools could be exploited or extended to support censorship of ideas or otherwise threaten free thought. Apple’s response, a promise not to expand the system, struck many as naive.
“We have already faced requests to create and deploy government-mandated changes that degrade user privacy, and we have firmly refused those requests. We will continue to refuse them in the future. Let’s be clear, this technology is limited to detecting CSAM stored in iCloud, and we will not accede to any government request to extend it,” the company said.
“All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content,” retorted the Electronic Frontier Foundation.
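The EFF’s point is easier to see in code: the matching step in a hash-based scanner is content-agnostic. Here is a minimal sketch, using a cryptographic hash as a simple stand-in for the perceptual NeuralHash Apple actually described (the database entries and function names are illustrative, not Apple’s implementation):

```swift
import CryptoKit
import Foundation

// Illustrative sketch only. Apple's system uses NeuralHash, a perceptual
// hash robust to resizing and re-encoding; SHA-256 is a simple stand-in.
// The database below is hypothetical; in a real deployment it would be
// supplied by child-safety organizations.
let knownHashes: Set<String> = ["placeholder-hash-1", "placeholder-hash-2"]

func matchesKnownDatabase(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    // The check is content-agnostic: it only asks whether the hash
    // appears in the supplied set, whatever that set contains.
    return knownHashes.contains(hex)
}
```

Nothing in that check knows what the hashes represent; swap in a different database and the same code flags different content. That, in essence, is the expansion critics feared.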
Apple listens to its users (in a good way)
In a statement about the suspension, widely circulated to media on the Friday before a US public holiday (a slot in which bad news sometimes breaks), Apple said:
“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take more time over the coming months to gather feedback and make improvements before releasing these critically important child safety features.”
It’s a decision the company had to make. In mid-August, more than 90 NGOs contacted the company in an open letter asking it to reconsider its plans. The letter was signed by Liberty, Big Brother Watch, the ACLU, the Center for Democracy & Technology, the Center for Free Expression, the EFF, ISOC, Privacy International, and many more.
The devil in the details
The organizations warned of several weaknesses in the company’s proposals. One is particularly important: the system itself could be misused by abusive adults.
“LGBTQ+ youths on family accounts with unsympathetic parents are particularly at risk,” they wrote. “As a result of this change, iMessages will no longer provide confidentiality and privacy to those users.”
Fears that the system Apple proposed could be extended also remain. Sharon Bradford Franklin, co-director of the CDT Security & Surveillance Project, warned that governments “will demand that Apple scan for and block images of human rights abuses, political protests, and other content that should be protected as free expression, which forms the backbone of a free and democratic society.”
Apple’s defenders argued that the company was trying to maintain overall user data privacy while building a system that could surface only illegal content. They also pointed to the various safeguards the company built into the system.
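One safeguard Apple described in its technical summary was a match threshold: no account would be flagged for human review until roughly 30 images matched, a figure Apple said kept the chance of falsely flagging an account to about one in a trillion per year. A minimal sketch of that policy shape follows (a plain counter; Apple’s actual design used cryptographic threshold secret sharing so its servers learned nothing below the threshold):

```swift
// Sketch of a threshold gate, not Apple's implementation.
struct MatchThresholdGate {
    let threshold: Int               // Apple's published figure was about 30
    private(set) var matchCount = 0

    // Returns true only once enough matches have accumulated to make
    // the account eligible for human review; single matches never do.
    mutating func record(matched: Bool) -> Bool {
        if matched { matchCount += 1 }
        return matchCount >= threshold
    }
}
```

Such a threshold limits false positives rather than scope: it does nothing to prevent the hash database itself from being broadened, which is why critics found the safeguards insufficient.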
Those arguments didn’t land, and Apple executives surely saw the same sort of social media comments I did, which reflected deep mistrust of the proposals.
What happens next?
Apple’s statement doesn’t say. But given that the company has spent the weeks since the announcement meeting with media and relevant groups across its markets, it seems logical that the second iteration of its child protection tools could address at least some of those concerns.