Last Updated: August 20, 2021
Apple plans to implement a new feature that will monitor iCloud accounts for images of child abuse. While many have applauded the move, others are concerned about its longer-term privacy implications.
The company says the privacy of non-offending users will not be impacted.
For the time being, this new feature will be limited to the United States.
With it, Apple devices—iPhones, iPads, and Macs—will integrate a new feature that scans iCloud storage accounts for known images of child abuse. These scans will cross-reference child sexual abuse materials (CSAM) databases to find flagged images.
The feature uses a cryptographic process that takes place partly on the device and partly on Apple’s servers.
Essentially, it compares the “hashes” of images in a user’s cloud storage to the hashes of known CSAM images. A hash is like a fingerprint: it is unique to an image, but it does not reveal the image’s actual content.
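The matching idea can be sketched in a few lines of Python. This is a simplified illustration, not Apple’s implementation: Apple’s system uses NeuralHash, a perceptual hash designed to match visually similar images, combined with a cryptographic matching protocol, whereas the SHA-256 hash below only matches byte-identical files. The sample image bytes and the database contents here are invented for illustration.

```python
import hashlib

# Hypothetical database of flagged fingerprints. In the real system these
# would be derived from hashes supplied by child-safety organizations.
KNOWN_HASHES = {
    hashlib.sha256(b"known-flagged-image-bytes").hexdigest(),
}

def image_hash(image_bytes: bytes) -> str:
    """Return a fingerprint of the image contents.

    SHA-256 stands in conceptually for Apple's perceptual NeuralHash:
    the fingerprint identifies the image without revealing its content.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def is_flagged(image_bytes: bytes) -> bool:
    # Compare the fingerprint against the database; the image itself
    # is never inspected, only its hash.
    return image_hash(image_bytes) in KNOWN_HASHES

print(is_flagged(b"known-flagged-image-bytes"))  # True: fingerprint matches
print(is_flagged(b"an ordinary holiday photo"))  # False: no match
```

Note that with a cryptographic hash, changing even one byte of the image produces a completely different fingerprint, which is precisely why Apple uses a perceptual hash instead.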
The initiative to fight child exploitation is undoubtedly positive.
Critics of the move, however, are worried about the potential for “surveillance creep”: surveillance technology implemented to do good today could be expanded later to serve other purposes.
The current program will detect flagged CSAM images and forward reports to the National Center for Missing and Exploited Children, which can then escalate cases to law enforcement. In theory, the same technology could be applied to detect other materials and forward information to other authorities.
Critics point out that Apple has previously submitted to governmental pressure, for example by removing apps used by pro-democracy protesters in Hong Kong.
The concern is that this technology could be used to sniff out political dissidents and activists in politically tense regions. It could leave parties that make use of backup and storage services in these regions vulnerable.
The general populace is increasingly privacy-conscious. One need only consider the rising mainstream popularity of VPNs to get an idea of the scale of this concern. So it’s not hard to understand the paranoia.
For now, Apple’s new program is a wholly good thing.
It just remains to be seen how it plays out in the future. It’s important to note that for its part, Apple has been increasingly pro-privacy this past year.