WhatsApp refuses to use Apple’s solution that scans photos on the iPhone

WhatsApp’s chief executive, Will Cathcart, took to Twitter to criticize the tool Apple developed to scan photos on the iPhone for files related to child sexual abuse, material the company refers to as CSAM. In a series of posts, Cathcart argues that Apple’s plan to scan every photo for this type of content is “full of problems”.

WhatsApp CEO says the messenger will not adopt Apple’s scanner (Image: Jeso Carneiro/Flickr)

In his first post, Will Cathcart gets straight to the point: “people asked if we at WhatsApp are going to adopt this system. The answer is no”. The messenger is known for putting privacy first, not least because end-to-end encryption is the cornerstone of its formula.

While stressing that CSAM must be fought, the executive considers Apple’s approach of building a file scanner to identify illegal content “worrying”.

Apple says iCloud scanner preserves privacy

The iPhone maker, however, claims the process is secure and preserves its customers’ privacy. Apple further asserts that the scanner does not store information about ordinary files, and that the odds of it flagging material incorrectly are “1 in 1 trillion”, according to the company.

The CSAM-detection scanner works like this: before an image is stored in iCloud, the system hashes it on the device and compares the result against a database of codes provided by the US National Center for Missing and Exploited Children (NCMEC). If the photo’s hash matches one derived from known child abuse material, that match is recorded; the file’s encryption is maintained throughout the process.

If there is a match with the database, the scanner encodes the positive result. Apple says it cannot access these results until the iCloud account crosses a threshold of hits, a number the company has not set publicly.
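A minimal sketch of that flow in Python, under loud assumptions: Apple’s real system uses a perceptual hash called NeuralHash together with encrypted “safety vouchers”, and has never published its threshold; the SHA-256 stand-in and the values below are illustrative only.

```python
import hashlib

# Hypothetical stand-ins: Apple's system uses a perceptual hash
# ("NeuralHash"), not SHA-256, and its real threshold is undisclosed.
KNOWN_CSAM_HASHES = {hashlib.sha256(b"known-flagged-image").hexdigest()}
THRESHOLD = 30  # illustrative value only

def image_hash(image_bytes: bytes) -> str:
    """Hash a photo on-device before it is uploaded to iCloud."""
    return hashlib.sha256(image_bytes).hexdigest()

def account_crosses_threshold(photo_library: list[bytes]) -> bool:
    """Count database matches; Apple says it can only inspect the
    results once an account passes the (undisclosed) threshold."""
    matches = sum(1 for photo in photo_library
                  if image_hash(photo) in KNOWN_CSAM_HASHES)
    return matches >= THRESHOLD
```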

Scanner could be used by governments, says WhatsApp CEO

Cathcart states that, since it is an Apple-run system, the file scanner would have to be adjusted to the legislation of each country in which it is deployed:

“This is an operational surveillance system that could easily be used to scan private content to detect anything they themselves or a government wants to control. Countries where iPhones are sold will have different definitions of what is acceptable.”

The WhatsApp executive cites China as an example of a government that could demand improper adjustments to Apple’s scan. Another issue raised by Cathcart is the system’s vulnerability: recently, flaws in iPhone apps such as Photos and iMessage were exploited by the Pegasus spyware, used to monitor the cell phones of journalists, activists and even heads of state.

Will Cathcart, head of WhatsApp (Image: Disclosure/Facebook)

Cybersecurity expert warns of potential for framing attacks

Johns Hopkins University cybersecurity professor Matthew Green pointed out on Twitter that consumers have no way to audit how the scanning works. In other words, no one outside the company knows how images are read or what database the file matching is based on.

In addition, the expert points out that a hash shared with known CSAM content could be used to get an innocent person banned. For example, a photo of a politician could be crafted to carry such a hash, causing whoever receives it to be flagged.

“Imagine someone sends you a perfectly harmless political image that you share with a friend. But what if that image shares a hash with a known child pornography file?” – @matthew_d_green
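Green’s worry is easier to see with a toy example. Perceptual hashes map many images to one short fingerprint, so two different images can collide; the simplified “average hash” below is an illustrative assumption, not Apple’s NeuralHash.

```python
# A toy "average hash": each bit records whether a pixel is above the
# image's mean brightness. Deliberately simplified, not NeuralHash.

def average_hash(pixels: list[int]) -> str:
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p > mean else "0" for p in pixels)

# Two clearly different 8-"pixel" images...
flagged_image = [200, 10, 220, 15, 210, 5, 230, 20]
harmless_image = [130, 60, 140, 70, 135, 55, 145, 65]

# ...that nonetheless produce identical fingerprints: a false match.
assert average_hash(flagged_image) == average_hash(harmless_image)
print(average_hash(harmless_image))  # "10101010" for both
```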

Finally, the WhatsApp CEO argues that Apple should be protecting users from backdoors that governments could exploit. He echoed a post the company published on its official blog in 2016, in which it says it will not give in to government pressure to build a backdoor into the iPhone. “Those words were wise at the time and deserve to be considered now,” says Will Cathcart.

With information: Mashable
