Apple is introducing a system to detect child abuse, but is there more to it?

Tech giant Apple has announced that it will be introducing “new child safety features” to “expand protection” for children. The move has drawn scrutiny from some members of the public because the feature will scan for matches against known child sexual abuse material (CSAM) before an image is stored in iCloud Photos.

One of the more vocal and prominent critics of Apple’s new feature is WhatsApp head Will Cathcart, who took to Twitter to make his stance clear.

“I read the information Apple put out yesterday and I’m concerned. I think this is the wrong approach and a setback for people’s privacy all over the world. People have asked if we’ll adopt this system for WhatsApp. The answer is no,” said Cathcart.

The WhatsApp boss has made it clear that he is all for child abusers being caught and brought to justice. He said, “Child sexual abuse material and the abusers who traffic in it are repugnant, and everyone wants to see those abusers caught.” 

Cathcart explains that companies like WhatsApp have worked for years to clamp down on CSAM and have, in fact, implemented measures to catch child abusers without violating the rights of everyday users.

Cathcart says WhatsApp successfully reports CSAM to the National Center for Missing & Exploited Children (NCMEC) without implementing measures that risk violating user privacy.

“We’ve worked hard to ban and report people who traffic in it based on appropriate measures, like making it easy for people to report when it’s shared. We reported more than 400,000 cases to NCMEC last year from @WhatsApp, all without breaking encryption,” he explained. 

Cathcart agrees that Apple has long needed to do more to fight CSAM; he simply does not approve of the method it has chosen.

“Apple has long needed to do more to fight CSAM, but the approach they are taking introduces something very concerning into the world,” he said. 

Others are more open to Apple’s new method. One user, replying to Cathcart’s thread on Twitter, said, “This is a good move from Apple. And it’s only the image hash that is compared to a database of hashes related to known images of child abuse. No private photos ever [leave] your phone!”
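The mechanism the user describes, comparing a fingerprint of the photo against a database of known hashes rather than inspecting the photo itself, can be sketched in a few lines of Python. The sketch below is a simplified illustration only: the hash function, the database contents, and the plain set lookup are all stand-ins. Apple’s actual system reportedly pairs a perceptual hash (NeuralHash) with cryptographic matching so that nothing is revealed about photos that do not match.

```python
import hashlib

# Hypothetical database of hex digests of known abusive images.
# The value below is a placeholder; in Apple's system the hashes
# would come from child safety organizations such as NCMEC.
KNOWN_CSAM_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def image_hash(image_bytes: bytes) -> str:
    """Hash an image before it leaves the device.

    SHA-256 is used here purely as a stand-in: Apple's NeuralHash is a
    perceptual hash designed to survive resizing and re-encoding, which
    a cryptographic hash like this is not.
    """
    return hashlib.sha256(image_bytes).hexdigest()


def matches_known_csam(image_bytes: bytes) -> bool:
    # Only the hash is compared against the database; the photo itself
    # is never uploaded or inspected at this step.
    return image_hash(image_bytes) in KNOWN_CSAM_HASHES


if __name__ == "__main__":
    sample = b"not a real image"
    print(matches_known_csam(sample))  # False: hash is not in the database
```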

Another user argued that a company owned by Facebook is in no position to lecture anyone about violating users’ privacy. “Who cares? I deleted your App the day Facebook bought the company and lied about not commingling data. FB being concerned about privacy is beyond laughable,” they said.

According to a report by The Verge, experts who have produced technical assessments of the system have described it as “mathematically robust”.

Apple has said, however, that other child safety groups are “likely” to be added as hash sources as the program expands, and the company would not reveal the list of partners that could use the technology to gain greater access to iPhone user data in the United States. That opacity has raised concerns among US citizens that the system could be exploited by other governments, such as China’s.

According to reports, Apple will be rolling out this child protection feature on a country-by-country basis, depending on “local laws”.