Apple puts plans to scan its users’ photos on hold due to backlash
September 6, 2021
Last month Apple announced plans to introduce a number of measures to protect children from abuse, which included scanning images stored in iCloud Photos.
“Another important concern is the spread of Child Sexual Abuse Material (CSAM) online,” said the press release. “CSAM refers to content that depicts sexually explicit activities involving a child. To help address this, new technology in iOS and iPadOS will allow Apple to detect known CSAM images stored in iCloud Photos. This will enable Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC).”
Obviously, the desire to protect children from abuse is laudable and, as a principle, cannot be objected to. But the precedent set by the decision to scan people’s private photos, however good the reason, led to far more alarm and pushback than Apple presumably anticipated. As a result, Apple has now decided to put the whole initiative on hold.
“Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material,” said a recent update to the press release. “Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Apple should be commended for listening to the market, but it’s somewhat concerning that it took such an outcry for the issues raised by setting this precedent to be acknowledged. “The features Apple announced a month ago, intending to help protect children, would create an infrastructure that is all too easy to redirect to greater surveillance and censorship,” wrote the Electronic Frontier Foundation, a digital civil liberties group, in response to the development.
The core objection is that, if you create a mechanism for spying on people’s activities for a good reason, you also open up the possibility of it being used for bad ones. For all Apple’s reassurances that such a thing would never happen, it’s hard to see how the company could guarantee that once the genie is out of the bottle. Any such security backdoor must also be viewed in the context of governments around the world using antitrust as a weapon to make tech companies do their bidding, which makes you wonder why Apple attempted such a controversial move in the first place.