Apple Plans to Make iOS Detect Child Abuse Photos: Report

In a briefing on Thursday afternoon, Apple confirmed previously reported plans to deploy new technology within iOS, macOS, watchOS, and iMessage that will detect potential child abuse imagery, and clarified crucial details of the ongoing project. For devices in the US, new versions of iOS and iPadOS rolling out this fall have “new applications of cryptography to help limit the spread of CSAM [child sexual abuse material] online, while designing for user privacy.”

The project is also detailed in a new “Child Safety” page on Apple’s website. The most invasive and potentially controversial implementation is the system that performs on-device scanning before an image is backed up to iCloud. From the description, scanning does not occur until a file is being backed up to iCloud, and Apple only receives data about a match if the cryptographic vouchers (uploaded to iCloud along with the image) for a particular account meet a threshold of matching known CSAM.

Restrictions included to protect privacy:

  • Apple does not learn anything about images that do not match the known CSAM database.
  • Apple can’t access metadata or visual derivatives for matched CSAM images until a threshold of matches is exceeded for an iCloud Photos account.
  • The risk of the system incorrectly flagging an account is extremely low. In addition, Apple manually reviews all reports made to NCMEC to ensure reporting accuracy.
  • Users can’t access or view the database of known CSAM images.
  • Users can’t identify which images were flagged as CSAM by the system.
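The threshold-gated flow described above can be illustrated with a simplified sketch. Note that this is a toy model, not Apple’s implementation: the real system uses a perceptual hash (NeuralHash) and cryptographic techniques such as private set intersection and threshold secret sharing, whereas this sketch substitutes a plain SHA-256 exact match and a hypothetical threshold value to show only the gating logic — below the threshold, nothing about matches is revealed.

```python
import hashlib

def image_hash(data: bytes) -> str:
    """Stand-in for a perceptual hash; the real system uses NeuralHash."""
    return hashlib.sha256(data).hexdigest()

def scan_before_upload(images, known_hashes, threshold):
    """Simulate on-device matching with a reveal threshold.

    Each image gets a 'voucher' recording whether it matched the known
    database. Match details are revealed only once the number of matches
    for the account exceeds the threshold; otherwise nothing is returned.
    """
    vouchers = [
        {"hash": image_hash(img), "match": image_hash(img) in known_hashes}
        for img in images
    ]
    match_count = sum(v["match"] for v in vouchers)
    if match_count > threshold:
        return [v for v in vouchers if v["match"]]
    return []  # below threshold: the server learns nothing about matches

# Toy database and a hypothetical threshold (Apple had not published the
# real value at the time of this report).
known = {image_hash(b"flagged-1"), image_hash(b"flagged-2")}
uploads = [b"flagged-1", b"vacation-photo", b"flagged-2"]

print(len(scan_before_upload(uploads, known, threshold=1)))  # threshold exceeded
print(len(scan_before_upload(uploads, known, threshold=5)))  # nothing revealed
```

The key design point the sketch captures is that individual matches are never reported on their own: the account-level threshold acts as the gate before any metadata becomes visible.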

Alongside the new measures in iCloud Photos, Apple added two additional systems to protect young iPhone owners at risk of child abuse. The Messages app will do on-device scanning of image attachments for children’s accounts to detect content that is potentially sexually explicit. Once detected, the content is blurred and a warning appears. A new setting that parents can enable on their family iCloud accounts will trigger a message telling the child that if they view (incoming) or send (outgoing) the detected image, their parents will be notified.

Apple is also updating how Siri and the Search app respond to queries about child abuse imagery. Under the new system, the apps “will explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue.”

“At Apple, our goal is to create technology that empowers people and enriches their lives — while helping them stay safe,” the company says on a new page on its website dedicated to child protection.

“We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM),” it adds.
