Will Apple child safety gift governments the keys to comms?


Apple has caused a stir with its new child safety measures, which other tech firms and privacy advocates say could unwittingly open the door to let governments spy on their citizens.

Later this year, updates to iOS and iPadOS for US users will scan images for matches with known child sexual abuse material (CSAM) before they are uploaded to Apple's iCloud Photos platform.

The system works by using an algorithm on the user's device to turn an image into a lengthy numeric hash; before upload, the iPhone compares this hash against a database of hashes of known child sexual abuse images.
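The matching step can be sketched roughly as follows. This is a toy illustration only, not Apple's implementation: Apple's system reportedly uses a perceptual hash (designed to survive resizing and re-compression) and cryptographic blinding of the database, whereas here a plain cryptographic hash stands in, and all names are hypothetical.

```python
import hashlib

# Hypothetical stand-in for a known flagged image supplied by the
# database provider (in reality, only hashes are distributed, and
# they are perceptual rather than cryptographic).
known_image = b"placeholder: bytes of a known flagged image"

# The on-device database of flagged-image hashes.
KNOWN_HASHES = {hashlib.sha256(known_image).hexdigest()}

def image_hash(image_bytes: bytes) -> str:
    """Reduce an image to a fixed-length numeric fingerprint."""
    return hashlib.sha256(image_bytes).hexdigest()

def flag_before_upload(image_bytes: bytes) -> bool:
    """On-device check run before the photo is sent to cloud storage."""
    return image_hash(image_bytes) in KNOWN_HASHES

print(flag_before_upload(known_image))      # a known image is flagged
print(flag_before_upload(b"holiday snap"))  # anything else passes through
```

A cryptographic hash like SHA-256 only matches byte-identical files, which is exactly why real systems use perceptual hashing instead; that design choice is also the source of the false-positive worry raised by critics below.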

“This is a really bad idea,” says cryptography professor Matthew Green in a Twitter thread. Essentially, the technique adds scanning to end-to-end encrypted messaging systems, so “imagine what it could do in the hands of an authoritarian government”, he warns.

Others argue that hashing algorithms are not foolproof and may produce false positives, a worrying prospect for parents with photographs of their children in the bathtub or at the beach. Other critics add that child sexual abusers will simply buy a device without the feature.

“I think this is the wrong approach and a setback for people’s privacy all over the world,” says Will Cathcart, head of WhatsApp, which since 2014 has been part of Facebook. Countries where iPhones are sold will have “different definitions on what is acceptable,” and the system could very easily be used to scan private content for anything “a government decides it wants to control,” argues Cathcart.

Fuelling the dispute is a longstanding rivalry between Facebook and Apple, which offer competing messaging platforms. Apple CEO Tim Cook has accused the social media platform of selling users’ data to advertisers. Facebook’s Mark Zuckerberg, for his part, has warned investors that new privacy settings in Apple’s iOS 14.5 (called “App Tracking Transparency”), which make apps ask users’ permission to track their internet activity, could imperil the company’s future revenues. So for Zuckerberg, this is a highly welcome chance to argue that Apple may not be as keen on privacy as it lets on.


One comment

  1. Keith E Gould 10/08/2021 @ 8:33 am

    Whilst I applaud the detail Apple has put into this technology, and understand that the intention is for initial releases to work only against a US Federal dataset and apply only to iOS devices in that region, this is, as has been noted by folks more able than I, a technology with great potential for misuse.

    The premise for its development is also laudable but the details lead me to believe that this mechanism could be repurposed as a means of surveillance for any content held on the central authority’s database.

    Apple gets to play Pontius Pilate. It remains true to its overall privacy stance whilst opening a surveillance door.

    With massive market risk in territories such as China, this can also be interpreted as a way of complying with Chinese laws concerning what citizens can use iOS devices for, without revenue-damaging friction.

    A cynic could well conclude that this has less to do with protection of minors and more to do with protecting revenue.
