Apple Scans Emails on iCloud to Fight Child Abuse

The controversy surrounding Apple's tool to scan iPhone users' iCloud photos to combat child abuse has gained another chapter. Apple confirmed that it has been scanning emails from the platform to identify Child Sexual Abuse Material (CSAM) for a few years. The practice, however, was never applied to photos or backups stored in the cloud service. The information was revealed by 9to5Mac this Monday (23).


iPhone 12 Pro Max Camera (Image: Paulo Higa/Tecnoblog)

The confirmation stems from a question raised by a recent statement on the matter. Last week, an executive at the iPhone maker said its products are “the best platform for distributing child pornography.” The website then contacted the company to find out how it could actually know that.

Apple confirmed to 9to5Mac that it has been checking attachments in iCloud Mail messages, sent and received, since 2019. The scanning takes place in the email service to find traces of CSAM, much like the system proposed for the photo platform. And this isn’t the only measure: the company also said it performs a limited scan of other data, on a “tiny scale”, without specifying what kind of information that would be.

In addition, an archived Apple document titled “Our Commitment to Child Safety” gives more details about the practice:

“As part of this commitment [to combat child abuse], Apple uses image matching technology to help find and report child exploitation. Like email spam filters, our systems use electronic signatures to locate suspected child exploitation. We validate each match with individual review”.
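The description above, matching electronic signatures against a database of known material and then having a human review each hit, resembles hash-based matching. Below is a minimal, hypothetical sketch in Python of how such signature matching could work in principle; the function names and the database are illustrative assumptions, and Apple’s actual system reportedly relies on perceptual image matching rather than the plain cryptographic hash used here.

```python
import hashlib
from pathlib import Path

# Hypothetical database of known signatures (hex digests).
# In a real system this would come from a child-safety organization,
# and the signatures would be perceptual (robust to resizing and
# re-encoding), not plain SHA-256 as used here for illustration.
KNOWN_SIGNATURES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def signature(attachment: Path) -> str:
    """Compute a signature for an email attachment (simplified: SHA-256 of its bytes)."""
    return hashlib.sha256(attachment.read_bytes()).hexdigest()

def scan_attachment(attachment: Path) -> bool:
    """Return True if the attachment matches a known signature.

    A match would be queued for individual human review rather than
    acted on automatically, per the policy quoted above.
    """
    return signature(attachment) in KNOWN_SIGNATURES
```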

Still, the iPhone maker pointed out to the specialized site that it has never scanned photos or backups stored on iCloud.

iCloud Photos (Image: Disclosure/Apple)

Apple to scan iCloud photos in the US: understand the case

The tool for finding CSAM surfaced in early August. At the time, the company announced a system capable of scanning iCloud photos for child pornography in order to combat child abuse. This would be possible thanks to algorithms trained to identify this type of content stored on the platform.

The initiative, however, has been the target of criticism and concern since its announcement. According to Apple, the feature is secure and preserves users’ privacy. In addition, the system only flags content that matches a database provided by the US National Center for Missing and Exploited Children (NCMEC).

But this was not enough to reassure everyone. After all, all content stored in iCloud Photos would be scanned by the tool.

One of the warnings came from cryptography expert Matthew Green. On Twitter on August 4, he pointed out that users have no access to the database used by the company. Furthermore, there is a risk that the system could report false positives or even be abused by malicious actors.

Other criticism came from the head of WhatsApp, Will Cathcart. The executive stated on August 6 that the scanner could be exploited by governments. He also pointed out that the photo-scanning solution will not be implemented in Facebook’s messaging app.

iPhone 12 Pro (Image: Alwin Kroon/Unsplash)

Apple could “have been clearer”, says executive

Apple’s senior vice president of software engineering, Craig Federighi, spoke about the case a few days later. He admitted that Apple could “have been clearer” about the system for analyzing iCloud photos. The executive also stressed that the system is fully auditable, that it does not affect users’ privacy, and that the feature will only trigger alerts if it identifies more than 30 suspicious photos.
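As a rough illustration of the threshold behavior Federighi describes, the hypothetical sketch below only raises an alert once an account accumulates more than 30 matches. The names and counting logic are assumptions for illustration only, not Apple’s actual mechanism (which reportedly uses cryptographic threshold techniques rather than a simple counter).

```python
from collections import defaultdict

# Per Federighi: alerts are triggered only after more than 30 suspicious photos.
ALERT_THRESHOLD = 30

# Hypothetical per-account match counter (illustrative only).
match_counts: dict[str, int] = defaultdict(int)

def record_match(account_id: str) -> bool:
    """Record one suspected match for an account.

    Returns True only when the account exceeds the threshold, at which
    point a human review would be triggered (illustrative logic only).
    """
    match_counts[account_id] += 1
    return match_counts[account_id] > ALERT_THRESHOLD
```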

“It is quite clear that many messages got jumbled, and how the system works was misunderstood,” he declared. “We wish this had come out a little more clearly for everyone, because we feel very positive and strongly about what we are doing.”

During the tool’s announcement, Apple stated that the system for analyzing iCloud photos would initially be implemented only in the United States, but the company also plans to bring it to more countries in the future.

With information: 9to5Mac
