Apple confirms existing iCloud Photos will be scanned for child abuse

Apple announced that it will implement a mechanism to check photos taken on iPhones in the United States before they are uploaded to its iCloud storage service, verifying whether they match known images of child sexual abuse.

Apple said that once a sufficient number of child abuse images is detected in an account’s uploads, the account will undergo human review and be reported to law enforcement. The system is designed to keep false positives to about one in one trillion, according to the company.
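
To make the threshold idea concrete, here is a rough sketch in Swift of how per-account match counting might be structured; the type name, method names, and threshold value are illustrative assumptions, not parameters Apple has published.

```swift
// Illustrative sketch: matches are counted per account, and only once the
// count crosses a preset threshold is the account queued for human review
// (and, if that review confirms the matches, reported to law enforcement).
// The threshold value is an arbitrary example, not Apple's figure.
struct AccountMatchTracker {
    private(set) var matchCount = 0
    let reviewThreshold: Int

    init(reviewThreshold: Int = 25) {
        self.reviewThreshold = reviewThreshold
    }

    // Returns true when the account should be escalated for human review.
    mutating func recordMatch() -> Bool {
        matchCount += 1
        return matchCount >= reviewThreshold
    }
}
```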

Apple’s new approach aims to respond to requests from law enforcement to help prevent child sexual abuse while also upholding the company’s stated commitment to privacy and security. However, some privacy advocates have expressed concern that the approach could open the door to monitoring of political speech or other content on iPhones.

Most other major technology companies, including Alphabet Inc’s Google, Facebook Inc, and Microsoft Corp, already check photographs against databases of known child sexual abuse images.

“With so many people using Apple products, these new safety measures have lifesaving potential for children who are being tempted online and whose horrendous photographs are being distributed in child sexual exploitation material,” said John Clark, CEO of the National Center for Missing and Exploited Children. “The reality is that child protection and privacy can coexist.” 

This is how Apple will scan photos stored on iPhones and iCloud for child abuse

Here is how Apple’s system works: officials maintain a database of known child sexual abuse photographs and convert them into “hashes,” numerical codes that positively identify each image but cannot be used to reconstruct it.
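
As a concrete illustration of that idea, the Swift sketch below stores known images as fixed-length digests; it uses an ordinary SHA-256 hash from CryptoKit as a stand-in for the purpose-built hashes described here, and the type and method names are hypothetical.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only: each known image is reduced to a fixed-length
// code ("hash") that identifies it but cannot be used to reconstruct it.
// SHA-256 stands in for the specialised hash described in the article.
struct KnownImageHashDatabase {
    private(set) var hashes: Set<Data> = []

    // Store the hash of a known image.
    mutating func add(imageData: Data) {
        hashes.insert(Data(SHA256.hash(data: imageData)))
    }

    // Check whether an image's hash matches a known entry exactly.
    func matches(imageData: Data) -> Bool {
        hashes.contains(Data(SHA256.hash(data: imageData)))
    }
}
```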

Apple built the database with a technology called “NeuralHash,” which is designed to also catch edited photographs that are similar to the originals. The database will be stored on iPhones.

When a user uploads an image to Apple’s iCloud storage service, the iPhone generates a hash of the image and compares it against the database stored on the device.
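
As a rough sketch of what that on-device comparison could look like, the Swift snippet below checks an image’s hash against the local database and allows a small Hamming distance so that lightly edited copies of a known image can still match, in the spirit of a perceptual hash; the hash length, threshold, and function names are assumptions for illustration, not details Apple has published.

```swift
// Illustrative sketch of the pre-upload check: compute the image's hash on
// the device and look for a near match in the locally stored database.
// Counting differing bits (Hamming distance) lets lightly edited copies of
// a known image still match; the threshold is an arbitrary example value.
func hammingDistance(_ a: [UInt8], _ b: [UInt8]) -> Int {
    zip(a, b).reduce(0) { $0 + ($1.0 ^ $1.1).nonzeroBitCount }
}

func matchesKnownImage(imageHash: [UInt8],
                       knownHashes: [[UInt8]],
                       maxDistance: Int = 4) -> Bool {
    knownHashes.contains { known in
        known.count == imageHash.count &&
            hammingDistance(imageHash, known) <= maxDistance
    }
}
```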

Apple says that only photos stored on the phone are checked, and that a human review takes place before an account is suspended and reported to law enforcement, to ensure that any matches are legitimate.

Users who believe their account was suspended inadvertently can file an appeal to have it restored, according to Apple. 

Some features of the programme were previously disclosed by the Financial Times. 

One element that distinguishes Apple’s technology is that it examines images kept on phones before they are uploaded, rather than after they have arrived on the company’s servers. 

Some privacy and security experts expressed concerns on Twitter that the system could be expanded to scan phones more broadly for forbidden information or political expression in the future. 
