How Apple Will Identify Child Abuse Material on the iPhone [CSAM]

Apple will take an active role in policing criminal content involving children. Below, we explain how Apple will identify child abuse material on the iPhone. CSAM is an English acronym; loosely translated, it means roughly the same as “child pornography”, an international crime for anyone who produces, possesses, or distributes this kind of harmful and criminal material.


How Apple Will Identify Child Abuse Material on the iPhone (Image: Laurenz Heymann/Unsplash)

What is CSAM?

As stated above, CSAM stands for Child Sexual Abuse Material. Although CSAM is a synonym for child pornography, the crime received this new nomenclature for legal classification purposes.

This type of crime is widespread worldwide, with criminals typically relying on encryption tricks, dark web forums, connections masked by VPNs, and so on.

Will actively monitoring the smartphones and cloud storage of all users reduce this type of crime, even though those involved rarely use “common” methods to exchange material?

Apple has unveiled its new system for identifying photos that contain images of child abuse. Although the motivation for fighting this heinous crime against children is beyond dispute, the community has raised concerns because the methodology is, at first glance, “invasive” at the level of individual devices, touching directly on users’ privacy.


The community was concerned about the invasion of “privacy” (Image: Jason Dent/Unsplash)

How Apple Wants to Identify Sensitive Material on the iPhone

So, given that users’ concern is about an invasion of privacy affecting all iPhone owners, how does this CSAM identification actually work?

Personal protection for messages

In Apple’s messaging app specifically, new tools will alert children and their parents when sexually explicit photos are received or sent.

When this type of content is received, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured that it is okay not to view the photo. As an added precaution, the child can also be told that, to make sure they are safe, their parents will receive a message if they decide to view it.

Similar protections are triggered if a child tries to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it anyway.

According to Apple, the Messages app uses on-device machine learning to analyze image attachments and determine whether a photo is sexually explicit. The feature is designed so that Apple never gains access to the messages.
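
For readers curious about what such an on-device check could look like in code, here is a minimal Swift sketch. It is not Apple’s implementation: `ExplicitContentClassifier` is a hypothetical Core ML model standing in for Apple’s unpublished classifier, and the confidence cutoff is an assumption.

```swift
import CoreML
import Vision
import UIKit

// Hedged sketch, not Apple's implementation: roughly how an on-device check of
// an incoming image attachment could work. `ExplicitContentClassifier` is a
// hypothetical Core ML image classifier; the 0.8 cutoff is an assumed value.
func screenAttachment(_ image: UIImage, completion: @escaping (Bool) -> Void) {
    guard let cgImage = image.cgImage,
          let coreMLModel = try? ExplicitContentClassifier(configuration: MLModelConfiguration()).model,
          let visionModel = try? VNCoreMLModel(for: coreMLModel) else {
        completion(false)
        return
    }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // Take the top classification and apply a confidence cutoff.
        let top = (request.results as? [VNClassificationObservation])?.first
        let isExplicit = top?.identifier == "explicit" && (top?.confidence ?? 0) > 0.8
        completion(isExplicit)   // the caller blurs the photo and warns the child if true
    }

    // Everything runs locally; the image is never sent off the device.
    try? VNImageRequestHandler(cgImage: cgImage).perform([request])
}
```

The important design point is that the classification happens entirely on the device, which is how Apple can claim the feature works without the company ever seeing the message contents.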


iMessage is the first focus of parental control (Image: Bit Medium/Dissemination)

Scanning in iCloud

This is the point that generated the community’s distrust of the “methods”, even though the aim is to prevent the spread of CSAM content.

Apple will roll out several technologies to prevent this material from being exchanged and stored through iCloud and, according to the company, the advanced techniques involved will preserve users’ privacy.

The announcement of several new technologies at once caused confusion, with many people concluding that Apple would be monitoring all users at all times, which is not true.


iCloud is the main target in the CSAM hunt (Image: Canva Pro/Publishing)

Matching against the known CSAM database

Apple will detect known CSAM images stored in iCloud Photos, which will allow it to report these occurrences to the National Center for Missing and Exploited Children (NCMEC), the US organization responsible for handling reports of this type of cybercrime. NCMEC acts as a comprehensive reporting center for CSAM and works in collaboration with law enforcement agencies across the United States.

Use of hashes to ensure the privacy of users

Instead of scanning images in the cloud, the system performs the match on the device using a database of known CSAM image hashes provided by NCMEC and other child safety organizations.

Apple further transforms this database into an unreadable set of hashes that is stored securely on users’ devices. Before an image is uploaded to iCloud Photos, an on-device matching process compares it against the known CSAM hashes.
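
As a rough illustration of the on-device matching idea (and not Apple’s actual algorithm, which relies on a perceptual “NeuralHash” and a blinded database), the Swift sketch below uses a plain SHA-256 digest as a stand-in for the image hash and an ordinary set of hex strings as a stand-in for the NCMEC-derived database.

```swift
import CryptoKit
import Foundation

// Hedged sketch of the on-device matching step. Apple's system uses a
// perceptual "NeuralHash" and a blinded database; here a plain SHA-256 digest
// stands in for the image hash, and `knownHashes` stands in for the
// NCMEC-derived set that ships with the operating system.
func matchesKnownCSAM(_ imageData: Data, against knownHashes: Set<String>) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    // In the real design the device never learns this boolean directly;
    // the result is sealed inside an encrypted safety voucher (described below).
    return knownHashes.contains(hex)
}
```

One difference worth noting: a cryptographic digest like SHA-256 only matches bit-identical files, whereas a perceptual hash is designed to also match resized or slightly edited copies of a known image.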

This matching process is powered by a cryptographic technology called private set intersection, which determines whether there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image.

This voucher is uploaded to iCloud Photos along with the image. Using another technology called threshold secret sharing, the system ensures that the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account exceeds a threshold of known CSAM content.
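
A deliberately simplified model of this voucher-and-threshold behavior might look like the sketch below. The real system enforces the threshold cryptographically with threshold secret sharing; here a boolean flag and a plain counter stand in for that machinery, purely to show the intended behavior.

```swift
import Foundation

// Simplified model of the voucher/threshold idea, not Apple's protocol: in the
// real design the match result is hidden inside the cryptography and the
// threshold is enforced by threshold secret sharing.
struct SafetyVoucher {
    let encryptedPayload: Data   // encrypted match data plus image information, opaque to the server
    let isMatch: Bool            // in reality this bit is not visible to anyone below the threshold
}

struct AccountState {
    let threshold: Int           // the CSAM match threshold (value not disclosed by Apple at launch)
    private(set) var matchCount = 0

    mutating func receive(_ voucher: SafetyVoucher) {
        if voucher.isMatch { matchCount += 1 }
    }

    // Only once the threshold is crossed can the vouchers be decrypted and reviewed.
    var vouchersCanBeDecrypted: Bool { matchCount >= threshold }
}
```

The point of the construction is that a single stray match reveals nothing: below the threshold, Apple cannot read any voucher at all.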

In Apple’s CSAM system, the threshold is set to provide an extremely high level of accuracy, and the company says it guarantees less than a one-in-a-trillion chance per year of incorrectly flagging a given account.
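
To get a feel for why a threshold pushes the false-flag probability so low, here is a small Swift calculation using a simple binomial model. The per-image false-match rate, the number of uploads, and the threshold below are purely illustrative assumptions; Apple has not published these figures.

```swift
import Foundation

// Illustrative only: a binomial-tail estimate of the chance that an innocent
// account crosses the match threshold. The per-image false-match rate `p`,
// photo count `n`, and threshold `t` used below are ASSUMED numbers, not Apple's.
func logChoose(_ n: Int, _ k: Int) -> Double {
    lgamma(Double(n) + 1) - lgamma(Double(k) + 1) - lgamma(Double(n - k) + 1)
}

/// Probability of at least `t` false matches among `n` photos,
/// each matching falsely with independent probability `p`.
func falseFlagProbability(n: Int, t: Int, p: Double) -> Double {
    var total = 0.0
    for k in t...n {
        let logTerm = logChoose(n, k) + Double(k) * log(p) + Double(n - k) * log(1 - p)
        total += exp(logTerm)
    }
    return total
}

// With 20,000 uploads a year, an assumed one-in-a-million false-match rate per
// image, and a threshold of 30 matches, the result is far below the
// one-in-a-trillion figure Apple cites.
print(falseFlagProbability(n: 20_000, t: 30, p: 1e-6))
```

The intuition: even if individual images occasionally collide with the database by accident, requiring many independent matches before anything can be decrypted makes an accidental flag astronomically unlikely.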

Apple’s CSAM detection will be part of iOS 15 and iPadOS 15, which will be available to users of all current iPhones and iPads, from the iPhone 6S and fifth-generation iPad onward.

Although the feature will theoretically be present on Apple devices around the world, for now the system will only operate fully in the United States.


All devices from iPhone 6S and iPad 5 will have the feature (Image: Unsplash)

Problems with the feature

If everything works exactly as published, Apple’s CSAM system should not cause problems: if the company executes all of the technology it is promising perfectly, no one will be harmed. That said, I can identify one possible inconvenience, although, according to Apple, you would have to be the one case in a trillion per year to be subject to it.

Being flagged incorrectly would mean having your account suspended and being reported as a possible cybercriminal, which would force the user to explain themselves to the authorities.

To avoid this kind of problem, very rare according to Apple, it would be necessary to relax the automated matching filters, but that would sacrifice much of the feature’s effectiveness.

Another point raised is the renewed discussion of whether minors have a right to “privacy” free from the direct control of parents or guardians, a debate that goes beyond the technology company itself.

With information from: Apple, Kaspersky.