Apple photo analysis ‘opens doors to threats’, says expert

Apple’s announcement last week of a verification tool that will scan for Child Sexual Abuse Material (CSAM) has caused considerable controversy. Despite the attempt to combat the circulation of sexual content involving children, experts say the platform needs adjustments to safeguard privacy and guarantee correct use.

TechWorld talked about this and other details of the subject with Marcos Barreto, a professor at the Polytechnic School of the University of São Paulo (POLI-USP), a specialist in cybersecurity and a researcher in areas such as system architectures for critical and real-time applications.

He explains that Apple’s initiative is commendable, as technology companies of this size have the means to act in the fight against cybercrime.

“If we think of it as a new technology, it is very cool. [Apple’s developers] used non-trivial solutions to do the right thing, since the possession and distribution of sexual material involving children is a serious crime almost everywhere in the world,” he stresses.

The researcher also argues that, by limiting the checking of suspicious images to iCloud, the company ensures that users who don’t feel comfortable can continue using its other services. He notes, however, the importance of all this being described in the cloud platform’s terms of use.

Negative points

Just a day after Apple unveiled the technology, an open letter signed by more than 6,000 people asked the company to reconsider the initiative. The document, which included the signatures of prominent figures from the digital security world such as Edward Snowden, says that despite good intentions, the premise is quite dangerous.

“Apple’s proposal opens the door to threats that undermine fundamental privacy protections for all users.” The solution, which will also be used in the Messages app, could establish a “precedent where our personal devices become a radical new tool for invasive surveillance, with little oversight to prevent eventual abuse and unreasonable expansion of the surveillance scope,” according to the signatories.

Despite his earlier praise, Barreto also raises points that the company has not addressed transparently, even though it has acknowledged concerns about the tool.


“Although it is not easy, I believe an audit process would be important. It is necessary to ensure that the technology is, in fact, being used for its stated purpose. Like practically everything else in the world, the tool could be diverted from that purpose.”

The expert points out that it is not possible to guarantee with 100% certainty that Apple will not use the CSAM detection system for other purposes, even if the company guarantees this in a document.

Another criticism raised is that the process itself is questionable. According to the Cupertino giant, before suspicious material is reported to the National Center for Missing and Exploited Children (NCMEC), it will undergo a “human review”.


“The review is essential to ensure effectiveness. However, the guidelines suggest that the person carrying out the verification will be an Apple professional. This would mean a new exposure of the child, if the material really is illegal. It would be interesting if [Apple] could clarify this point or simply handle it another way, if this is indeed the procedure.”

How does technology work?

Professor Barreto explains that, in simplified terms, Apple’s new system rests on three pillars. The first is NeuralHash, which uses a neural network to transform images into numbers (called hashes). In other words, the characteristics of a photo (angle and color, for example) make up a number that is unique to that image.

The math behind the technology makes images with identical or very similar content produce identical hashes, and the opposite holds as well (different images produce different hashes). Therefore, images stored in iCloud that are identical or extremely similar to those held by NCMEC will be flagged as suspicious and later analyzed. Apple states that the system is extremely secure and that the chance of an incorrect flag is one in one trillion per year.
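As a rough illustration of this kind of perceptual hashing, the toy sketch below uses a simple “average hash” over an 8×8 grayscale grid. This is not Apple’s NeuralHash (which relies on a neural network); the image grids and the hashing scheme here are invented purely to show the idea that slightly different versions of the same picture can map to the same number while an unrelated picture does not.

```python
# Toy "average hash": each pixel contributes one bit depending on whether
# it is above or below the image's mean brightness. Small uniform changes
# (e.g. a brightness shift) leave the bit pattern, and thus the hash, intact.

def average_hash(pixels):
    """pixels: 8x8 grid of grayscale values (0-255) -> 64-bit integer hash."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

# Two synthetic "images" that differ only by a small brightness shift,
# plus a third, inverted one that looks nothing like the first.
img_a = [[10 * (r + c) for c in range(8)] for r in range(8)]
img_b = [[10 * (r + c) + 3 for c in range(8)] for r in range(8)]   # slight shift
img_c = [[255 - 10 * (r + c) for c in range(8)] for r in range(8)]  # inverted

print(average_hash(img_a) == average_hash(img_b))  # True: near-duplicates collide
print(average_hash(img_a) == average_hash(img_c))  # False: different content, different hash
```

A real perceptual hash works on full-size photos and is far more robust to crops, rotations, and re-encodings, but the matching logic (compare hashes, not pixels) is the same.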

The second pillar is so-called “private set intersection”. Together with other encryption techniques, it ensures that the system only “learns” about images whose hashes match the known set. Photos outside the established parameters are not revealed by the protocol.
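The idea behind private set intersection can be sketched with a classic Diffie–Hellman-style construction: each party blinds its hashes with a secret exponent, the other party re-blinds them, and only items held by both sides end up identical after the double blinding. The sketch below is an assumption-laden toy (the string labels, the modulus, and the protocol shape are all illustrative; Apple’s actual protocol is considerably more sophisticated), meant only to show how two parties can learn which hashes they share without exposing the rest of their sets.

```python
import hashlib
import secrets

P = 2**127 - 1  # a Mersenne prime serving as a toy group modulus

def h(item):
    """Map an item to a group element via SHA-256."""
    return int.from_bytes(hashlib.sha256(item.encode()).digest(), "big") % P

def blind(items, key):
    """Raise each item's hash to a secret exponent mod P."""
    return {pow(h(x), key, P) for x in items}

a_key = secrets.randbelow(P - 2) + 1  # server's secret exponent
b_key = secrets.randbelow(P - 2) + 1  # client's secret exponent

server_set = {"hash1", "hash2", "hash3"}  # placeholder for known CSAM hashes
client_set = {"hash2", "hash4"}           # placeholder for a user's photo hashes

# Each side blinds its own set, then the other side re-blinds it.
# Exponentiation commutes: h(x)^(a*b) == h(x)^(b*a), so shared items collide.
server_double = {pow(v, b_key, P) for v in blind(server_set, a_key)}
client_double = {pow(v, a_key, P) for v in blind(client_set, b_key)}

print(len(server_double & client_double))  # 1: only "hash2" is shared
```

Neither side ever sees the other's raw hashes, only blinded values; yet the overlap (here, one item) is still computable.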

Apple’s example shows identical images, differing only in color, that have the same hash, while an unrelated image has a different number.

The POLI-USP researcher says the last element is a data-organization model that scrambles the database so that the system cannot be accessed by cybercriminals, and so that images with specific content, such as credit card photos, for example, cannot be searched.

Can technology reach Brazil?

We asked Barreto what he thinks about a possible arrival of the verification tool in Brazil. He notes that the main complicating factors are the reporting procedures and the bureaucratic issues involving institutions in the country.

“The technology is based on a set of images known to be child pornography. Therefore, it would be necessary to identify which institution in the country has the competence to verify Brazilians’ photos on iCloud. In addition, the entire criminal procedure for reporting the accounts would have to be worked out. The bureaucracy involved in using a system of this type makes me think it is unlikely that Apple will consider using it here, at least at first.”

In Brazil, public institutions (such as the Ministry of Women, Family and Human Rights, the state and federal police, and the Public Prosecutor’s Office) and private institutions (such as the NGO SaferNet Brasil) work to combat the production and dissemination of pornographic and sexual content involving children on the internet. This type of material can be reported through the Dial 100 hotline, simply by calling 100.

The other side

TechWorld contacted Apple to ask about the critical points raised both in the open letter and by professor Marcos Barreto, as well as about a possible arrival of the technology in Brazil.

However, as of this writing, Apple had not responded. The text will be updated if the company answers the inquiries.
