CSAM scanning would be abused, says Apple – using argument it originally rejected

When Apple announced its own approach to CSAM scanning, many of us warned that the process used to check for child sexual abuse materials would ultimately be abused by repressive governments to scan for things like political protest plans.

The Cupertino company rejected that reasoning at the time, but in an ironic twist is now using precisely this argument in response to the Australian government …

Apple’s original CSAM scanning plans

Apple originally planned to carry out on-device scanning for CSAM, using a digital fingerprinting technique.

These fingerprints are a way to match particular images without anyone having to view them, and are designed to be sufficiently fuzzy to continue to match images which have been cropped or otherwise edited, while generating very few false positives.

To be clear, Apple’s proposal was a privacy-respecting approach, as scanning would be performed by our own devices, and nobody would ever look at any of our photos unless multiple matches were flagged.
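
To make the fuzzy-matching idea concrete, here is a minimal sketch of how fingerprint comparison and a multiple-match threshold could work, assuming fingerprints are fixed-length bit strings compared by Hamming distance. The `Fingerprint` type, hash width, and distance threshold are illustrative assumptions, not Apple's actual NeuralHash design, though Apple did publicly describe a review threshold of around 30 matches.

```swift
// A minimal sketch of fuzzy fingerprint matching with a review threshold.
// The Fingerprint type, hash width, and both thresholds are illustrative
// assumptions, not Apple's actual NeuralHash design.

struct Fingerprint {
    let bits: UInt64  // real perceptual hashes are typically longer

    /// Hamming distance: the number of bits that differ between two hashes.
    func distance(to other: Fingerprint) -> Int {
        (bits ^ other.bits).nonzeroBitCount
    }
}

/// A lightly edited image (cropped, re-encoded) flips only a few hash
/// bits, so it still matches; an unrelated image differs in many bits.
func matches(_ a: Fingerprint, _ b: Fingerprint, maxBitsApart: Int = 4) -> Bool {
    a.distance(to: b) <= maxBitsApart
}

/// Flag an account for human review only once the number of database
/// matches crosses a threshold, so a handful of false positives never
/// exposes anyone's photos to review.
func shouldFlag(userPhotos: [Fingerprint],
                database: [Fingerprint],
                reviewThreshold: Int = 30) -> Bool {
    let matchCount = userPhotos.filter { photo in
        database.contains { matches(photo, $0) }
    }.count
    return matchCount >= reviewThreshold
}
```

Comparing by Hamming distance rather than exact equality is what lets cropped or re-encoded copies of an image still match, while the review threshold is what keeps occasional false positives from ever reaching human eyes.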

The repressive government problem

The problem, as many of us observed, was the potential for abuse by repressive governments.

A digital fingerprint can be created for any type of material, not just CSAM. There’s nothing to stop an authoritarian government adding to the database images of political campaign posters or similar.

A tool designed to target serious criminals could be trivially adapted to detect those who oppose a government or one or more of its policies. Apple – which would receive the fingerprint databases from governments – would find itself unwittingly aiding the repression, or worse, of political activists.

Apple claimed that it would never have allowed this, but the promise was predicated on Apple having the legal freedom to refuse, which would simply not be the case. In China, for example, Apple has been legally required to remove VPN, news, and other apps, and to store the iCloud data of Chinese citizens on a server owned by a government-controlled company.

There was no realistic way for Apple to promise that it would not comply with future requirements to process government-supplied databases of “CSAM images” that also included matches for materials used by critics and protestors. As the company has often said when defending its actions in countries like China, Apple complies with the law in each of the countries in which it operates.

Apple’s three-stage U-turn

Apple initially rejected this argument, but said that, in response to widespread concern about it, it had decided to abandon its plans anyway.

The company subsequently shifted its stance, admitting that the problem existed.

Erik Neuenschwander, Apple’s director of user privacy and child safety, wrote: “It would […] inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types.”

We’ve now reached stage three: Apple itself using the argument it initially rejected.

Apple uses argument against Australian government

The Australian government is proposing to force tech companies to scan for CSAM, and The Guardian reports that Apple is now using the slippery slope argument to fight the plan.

Apple has warned an Australian proposal to force tech companies to scan cloud and messaging services for child-abuse material risks “undermining fundamental privacy and security protections” and could lead to mass surveillance with global repercussions […]

“Scanning for particular content opens the door for bulk surveillance of communications and storage systems that hold data pertaining to the most private affairs of many Australians,” Apple said.

“Such capabilities, history shows, will inevitably expand to other content types (such as images, videos, text, or audio) and content categories.”

Apple said such surveillance tools could be reconfigured to search for other content, such as a person’s political, religious, health, sexual or reproductive activities.

The government says that it has listened to “a lot of good feedback” from Apple and others, and will “incorporate what we can” into a revised version of the plan.
