How Apple will scan your iPhone for child sex abuse photos

  • Apple has run into a controversy over its intention to scan iPhones for child sex abuse photos.

By: PRAKHAR KHANNA
| Updated on: Aug 17 2021, 12:32 IST
“This feature does not work on your private iPhone photo library on the device,” says Apple. (REUTERS)

Apple recently announced that it will roll out an update later this year (in the US) intended to help curb child sexual abuse. However, the feature hasn't been well received, since Apple said it would scan everyone's iPhone for Child Sexual Abuse Material (CSAM). After the backlash, Apple revealed details about how the new iPhone child sex abuse detection system works. According to the company, a user's photos will be scanned when they are uploaded from an iPhone or iPad to iCloud. Here's how Apple will scan iPhones for child sex abuse photos.

Apple will assign each image a “hash code,” a digital fingerprint derived from the image's contents rather than a label of what it depicts. These hashes will then be compared against an encrypted database of hashes of known CSAM, supplied by the National Center for Missing and Exploited Children (NCMEC), to flag matches. If Apple finds that a user has 30 images (its threshold) whose hashes match entries on that list, the flagged photos will be decrypted on Apple's servers.
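In simplified terms, the matching logic works like the sketch below. This is an illustration only: Apple's actual system relies on its proprietary NeuralHash, a perceptual hash that survives minor edits, plus cryptographic protocols that keep the device from learning which photos matched. The SHA-256 stand-in and all function names here are assumptions for readability.

```python
# Minimal sketch of threshold-based hash matching. SHA-256 stands in for
# Apple's proprietary NeuralHash (a perceptual hash that tolerates small
# edits, unlike a cryptographic digest); all names here are illustrative.
import hashlib

MATCH_THRESHOLD = 30  # Apple's stated threshold before human review

def image_hash(image_bytes: bytes) -> str:
    """Fingerprint an image's content; stand-in for NeuralHash."""
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(uploads: list[bytes], known_csam_hashes: set[str]) -> int:
    """Count uploaded images whose hash appears in the known-CSAM list."""
    return sum(1 for img in uploads if image_hash(img) in known_csam_hashes)

def should_flag_account(uploads: list[bytes], known_csam_hashes: set[str]) -> bool:
    """Flag an account for human review only once the threshold is crossed."""
    return count_matches(uploads, known_csam_hashes) >= MATCH_THRESHOLD
```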

These images will then be examined by human reviewers, and if the material is confirmed, the authorities will be alerted. The company is also required to report to NCMEC, a nonprofit that works alongside law enforcement. Apple says there is less than a one-in-one-trillion chance per year of incorrectly flagging an account.
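The one-in-a-trillion figure is a consequence of the 30-image threshold: even if a single innocent photo had a small chance of falsely matching a hash, requiring 30 such matches on one account makes an accidental flag astronomically unlikely. The rough calculation below illustrates the idea; the per-image false-match rate and library size are assumed values, not figures Apple has published.

```python
# Back-of-envelope: probability an innocent account crosses the threshold.
# The per-image false-match rate and library size are illustrative
# assumptions, not Apple's published figures.
from math import comb

p = 1e-6          # assumed chance a single innocent photo falsely matches
n = 10_000        # assumed number of photos uploaded to iCloud
threshold = 30    # Apple's stated match threshold

# Upper tail of Binomial(n, p): P(at least `threshold` false matches).
# Terms shrink extremely fast, so summing a short range is sufficient.
tail = sum(comb(n, k) * p**k * (1 - p) ** (n - k)
           for k in range(threshold, threshold + 40))

print(f"Chance of falsely flagging this account: {tail:.2e}")
# With these assumptions, the result lands far below one in a trillion.
```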

As The Verge points out, CSAM scanning isn't a new idea. Facebook, Twitter, and many other companies scan users' files against hash libraries. Apple is facing backlash because many believe the new system will open a backdoor that could be abused. However, Craig Federighi, Apple's senior vice president of software engineering, told The Wall Street Journal that the system is limited solely to looking for copies of known, reported images of child pornography.

In a document shared by Apple, the company says that the feature is limited to the “photos that the user chooses to upload to iCloud Photos.” It goes on to mention, “This feature does not work on your private iPhone photo library on the device.”


First Published Date: 17 Aug, 12:22 IST