Apple defends new device-scanning tech as criticisms grow

Apple has launched a defence of its controversial new system to scan users' devices for child sexual abuse material after over 5,000 people and organizations signed an open letter against it.

August 10, 2021


By Pádraig Belton

The critics argue the scanning system, which hashes images on a user’s device and compares the hash with known images of sexual abuse material, also creates a backdoor authoritarian governments can use to spy on their people, crack down on political dissent, or enforce anti-LGBT policies.
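The matching step described above can be sketched in a few lines. This is a deliberately simplified illustration: Apple's actual system uses a perceptual "NeuralHash" combined with cryptographic threshold and private set intersection techniques, not the plain cryptographic hash and hypothetical database used here.

```python
import hashlib

# Hypothetical database of fingerprints of known abuse images.
# In the real system this would be supplied by child-safety organizations.
KNOWN_IMAGE_HASHES: set[str] = set()

def image_fingerprint(image_bytes: bytes) -> str:
    # Compute a fingerprint of an on-device image. A real perceptual
    # hash tolerates resizing and recompression; SHA-256 (used here
    # only for illustration) matches exact bytes only.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes, database: set[str]) -> bool:
    # The device never sees the images in the database, only whether
    # the local image's fingerprint appears in it.
    return image_fingerprint(image_bytes) in database
```

The critics' point is visible in this sketch: nothing in the matching logic is specific to abuse imagery; swapping the contents of the database repurposes the whole system.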

Tim Cook’s company has responded by publishing a lengthy question-and-answer document and pledging it “will not accede to any government’s request to expand” the system. Critics, though, point out Apple has made concessions in the past to continue operating in countries around the world. It removed 39,000 apps from the Chinese App Store at the end of last year, after Beijing embarked on a crackdown on unlicensed games. The government has also made Apple remove VPN, news, and other apps, and store the iCloud data of Chinese citizens on a server owned by a government-controlled company.

Adapting the system from searching for child sexual abuse material (CSAM) to, for example, images of the 1989 Tank Man in Tiananmen Square or Russian dissident Alexei Navalny would only take “expansion of the machine learning parameters to look for additional types of content”, argues the Electronic Frontier Foundation, which adds, “that’s not a slippery slope; that’s a fully-built system just waiting for external pressure to make the slightest change”.

There are three different measures at work in Apple’s plans: as well as scanning for digital fingerprints, iMessage will now enable explicit photo warnings for children’s accounts, and Siri and search will respond to requests for CSAM with a warning and links to help. The measures use different technologies.

But at the end of the day, says the Electronic Frontier Foundation, “even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor”.
