Apple will scan iPhones to fight child pornography; the firm is accused of spying

Apple plans to scan iPhones, iPads and iCloud for child pornography. Concretely, the Californian giant will use algorithms to detect signs of sexual abuse involving children. Many privacy advocates fear this new feature may be diverted from its original purpose.

This Thursday, August 5, 2021, Apple announced a host of new features in iOS 15 aimed at stemming the spread of child pornography. The Cupertino company will now identify images of a sexual nature involving children on iPhones, iPads and its iCloud servers in the United States.

Photos stored on iCloud or exchanged via iMessage will be analyzed by algorithms. These algorithms will compare the unique digital signature of each image with the digital signatures of known child pornography photos held in a database. To power its system, Apple will rely on a database provided by the National Center for Missing and Exploited Children (NCMEC).
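To make the idea concrete, here is a minimal sketch of what signature matching looks like in principle. It is not Apple's implementation: the announced system uses a perceptual hash (NeuralHash) matched against a blinded database, whereas this example uses an ordinary SHA-256 hash and a plain in-memory set, both chosen purely for illustration.

```swift
import Foundation
import CryptoKit

// Illustrative stand-in: Apple's announced system computes a perceptual
// "NeuralHash" on-device and matches it against a blinded NCMEC database.
// Here an exact SHA-256 hash over the raw bytes plays that role.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Hypothetical set of known signatures (the real database is not stored
// in the clear on the device).
let knownSignatures: Set<String> = [
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
]

// An image is flagged when its fingerprint appears in the known set.
func isFlagged(_ imageData: Data) -> Bool {
    knownSignatures.contains(fingerprint(of: imageData))
}

print(isFlagged(Data()))   // true: the empty blob hashes to the signature above
```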


Edward Snowden criticizes new iOS 15 measures

If a signature matches, the photo will be flagged. Once an account exceeds a certain threshold of flagged images, an algorithm will examine the photographs for the presence of objectionable content, and an Apple employee can then review them. In the event of illegal content, the account will be reported to the authorities.
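The threshold step can be pictured as a simple per-account counter, as in the hypothetical sketch below; the threshold value and the type names are invented for illustration, since Apple did not publish them in the announcement.

```swift
// Hypothetical sketch: individual matches are only recorded, and human
// review is requested once an account crosses a preset number of matches.
struct AccountMatchState {
    // Illustrative value only; the real threshold was not disclosed here.
    let reviewThreshold = 30
    private(set) var matchedImageIDs: Set<String> = []

    // Records a flagged image and reports whether the account should now
    // be escalated for human review.
    mutating func record(match imageID: String) -> Bool {
        matchedImageIDs.insert(imageID)
        return matchedImageIDs.count >= reviewThreshold
    }
}

var state = AccountMatchState()
print(state.record(match: "photo-123"))   // false until the threshold is reached
```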

In the same vein, Apple will keep an eye on images exchanged by children’s accounts linked to a family subscription. If a child sends or receives a sexually explicit image, they will receive a warning from iOS. Apple also reserves the right to alert parents. “We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the dissemination of child pornography,” Apple explains in a press release.

These measures drew a flurry of criticism. For many privacy advocates, Apple is going too far despite its laudable intentions. “Apple is replacing its end-to-end encrypted messaging system with a surveillance and censorship infrastructure, which will be vulnerable to abuse and misuse not only in the United States, but around the world,” laments the Center for Democracy and Technology (CDT).

For the CDT, Apple is knowingly building a backdoor into its iPhones. “The mechanism that will allow Apple to scan images in iMessage is not an alternative to a backdoor – it is a backdoor. Client-side scanning at one end of the communication breaks the security of the transmission, and informing a third party (the parent) of the content of the communication is an invasion of privacy,” the organization argues.

For his part, whistleblower Edward Snowden fears that the system put in place by Apple will eventually be exploited for other purposes. “No matter how well-intentioned, Apple is rolling out mass surveillance around the world. Make no mistake: if they can scan for child porn today, they can scan for anything tomorrow,” warns Snowden.
