Apple may have suspended some of its plans to scan devices for CSAM, but the European Commission has put the issue back in the spotlight by moving to require messaging services to monitor for such material.
CSAM is becoming a privacy test
In terms of child protection, this is a good thing. Child Sexual Abuse Material (CSAM) is a much bigger problem than many people realise; the victims of this appalling trade end up with shattered lives.
What is happening, according to EURACTIV, is that the European Commission plans to introduce measures requiring messaging services to scan for CSAM. However, Europe seems to understand some of the arguments privacy advocates raised against Apple’s initial proposals, and insists on certain restrictions, including:
- Scanning technology must be “efficient”.
- It must be “suitably reliable”.
- And it must avoid collecting “any other information from the relevant communications than the information strictly necessary for detection”.
Of course, ensuring the system is “reliable” is a challenge.
What is reliable?
When Apple announced its own approach to CSAM scanning on its platforms, researchers at Imperial College London soon warned that the technology behind the system was easy to fool, calling it “not ready for deployment”.
Apple then backed off its plans, and later introduced a system to monitor this content in its Messages app. It has yet to extend this to on-device scanning of people’s photo libraries, as it originally intended. It is still entirely possible that it scans photos stored in iCloud, as other image storage services do.
As far as Europe’s proposals are concerned, there is hope that the requirement to create “suitably reliable” systems will eventually face a certain burden of proof. Although these restrictions will not completely reassure people, since the threat of these technologies being abused by repressive or authoritarian governments remains, they at least set in motion steps toward an understanding of what people’s online privacy rights should be.
At the same time, the EC proposals appear to threaten the use of end-to-end encryption, which Apple continues to defend.
Towards a digital charter of privacy rights
The lack of a clear and agreed set of rights to protect online privacy is becoming increasingly critical as the world becomes more connected. At the same time, Europe is also insisting on regulations – such as mandatory sideloading – that can erode device privacy and security. These two strands seem philosophically opposed, but it’s possible that as regulators and legislators consider the complexity of these issues, they’ll begin to see a glimmer of light.
I think that’s what Apple is trying to encourage, as it seems more and more vital (even the World Economic Forum agrees) that an international standard defining digital rights be developed. And the need for this standard is growing.
Europe understands this; it published a declaration on digital rights and principles for EU residents in early 2022.
At the time, Executive Vice-President for a Europe Fit for the Digital Age, Margrethe Vestager, said in a statement: “We want secure technologies that work for people and respect our rights and values. Also when we are online. And we want everyone to be empowered to take an active part in our increasingly digitized societies. This declaration gives us a clear point of reference on the rights and principles of the online world.”
What should those rights be?
Apple executives have been actively pushing for frameworks around these rights for some time. Ever since Apple CEO Tim Cook’s hard-hitting speech on digital surveillance in 2018, the company has consistently pushed for a privacy agreement. Cook’s company continues to strive to provide these rights on a unilateral basis, but also calls for this protection to be universal. Apple has championed the following four pillars:
- Users should have the right to have their personal data minimized.
- Users should have the right to know what data is collected about them.
- Users should have the right to access this data.
- Users should have the right to ensure that this data is secure.
While we are all aware that certain business models would be forced to change following such a set of rights, the introduction of some digital certainty would at least help to promote a level playing field in technology.
And the need to strike a thoughtful balance between individual rights and collective responsibility seems stronger today than ever.
Copyright © 2022 IDG Communications, Inc.