More than 90 global privacy groups urge Apple to abandon CSAM surveillance

Groups warn the tech could be exploited by “abusive adults”

Apple is being urged to abandon plans to scan images and iMessages for child sexual abuse material (CSAM) over fears that the tech could threaten citizens' privacy and wellbeing, as well as inadvertently flag ‘innocent’ content.

This is according to an open letter signed by more than 90 civil society organisations, including the UK’s Big Brother Watch and Liberty.

Although the signatories “support efforts to protect children and stand firmly against the proliferation of CSAM”, they argue that the “algorithms designed to detect sexually explicit material are notoriously unreliable” and are known to “mistakenly flag art, health information, educational resources, advocacy messages, and other imagery”.

Moreover, the letter criticises Apple for assuming that the users of its iMessage surveillance technology, which aims to protect children from explicit content, will “actually belong to an adult who is the parent of a child, and that those individuals have a healthy relationship”.

According to the signatories, the tech could be exploited by “abusive adults”, providing them with even more power to control their victims. It could also lead to non-heteronormative children being outed against their will:

“LGBTQ+ youths on family accounts with unsympathetic parents are particularly at risk,” the letter reads. “As a result of this change, iMessages will no longer provide confidentiality and privacy to those users through an end-to-end encrypted messaging system in which only the sender and intended recipients have access to the information sent.”

The letter, which is addressed to Apple CEO Tim Cook and is signed by privacy groups from across the US, Africa, Europe, South America, and East Asia, also echoes previous concerns about government interference in the surveillance technology. The signatories warn that Apple could be pressured to “extend notification to other accounts, and to detect images that are objectionable for reasons other than being sexually explicit”, such as “human rights abuses, political protests, images companies have tagged as ‘terrorist’ or violent extremist content, or even unflattering images of the very politicians who will pressure the company to scan for them”.

“And that pressure could extend to all images stored on the device, not just those uploaded to iCloud. Thus, Apple will have laid the foundation for censorship, surveillance and persecution on a global basis,” the letter states.

Apple had previously addressed these fears, maintaining that the technology would not scan users' iCloud uploads for anything other than CSAM, and that it would reject governmental requests to "add non-CSAM images to the hash list".

However, earlier this week, the tech giant appeared to bow to some of these demands, announcing that it would only flag images supplied by clearinghouses in multiple countries, rather than solely by the US National Center for Missing and Exploited Children (NCMEC), as originally planned.

The open letter comes as security researchers found a flaw in Apple’s NeuralHash hashing algorithm, which is used to scan for known CSAM imagery.

GitHub user Asuhariet Ygvar warned that NeuralHash “can tolerate image resizing and compression, but not cropping or rotations”, potentially reducing the success rate of the tech.
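Why would a hashing scheme survive resizing and compression but break under cropping or rotation? Perceptual hashes derive their bits from the overall arrangement of brightness in an image, so edits that preserve that arrangement leave the hash (nearly) intact, while edits that rearrange pixels do not. The toy “average hash” below illustrates the principle; it is not Apple’s NeuralHash, just a minimal stand-in to show how hash matching behaves:

```python
# Toy perceptual "average hash": NOT Apple's NeuralHash, just a minimal
# illustration of why hash matching tolerates some edits but not others.

def average_hash(pixels):
    """Hash a grayscale image (2D list of 0-255 values) to a bit string:
    each bit records whether that pixel is above the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(a, b):
    """Number of differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

# A 4x4 "image": bright top half, dark bottom half.
original = [[200, 210, 190, 205],
            [220, 215, 200, 210],
            [ 30,  25,  40,  35],
            [ 20,  30,  25,  15]]

# Mild compression noise: every pixel brightened slightly. Each pixel stays
# on the same side of the (shifted) mean, so the hash is unchanged.
compressed = [[p + 5 for p in row] for row in original]

# 90-degree rotation: the same pixel values, but rearranged, so the
# brightness pattern -- and therefore the hash -- changes substantially.
rotated = [list(row) for row in zip(*original[::-1])]

h0 = average_hash(original)
print(hamming(h0, average_hash(compressed)))  # 0  (match survives compression)
print(hamming(h0, average_hash(rotated)))     # 8  (rotation breaks the match)
```

A real system would compare hashes against a database of known CSAM hashes and treat anything within a small Hamming distance as a match, which is why transformations that shift many bits let an image evade detection.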

Apple has said the flaw exists only in an earlier build of the technology and will not be present in the final product.
