WhatsApp head Will Cathcart says the company will not adopt the CSAM scanning system Apple announced earlier this week for iOS 15, but Apple is refuting some of his claims.
Cathcart said he was "concerned" by Apple's recently announced plan to scan photos uploaded to iCloud Photos for Child Sexual Abuse Material (CSAM) by matching photo hashes against a database of known CSAM images provided by the National Center for Missing & Exploited Children (NCMEC) and other child safety organizations.
Cathcart called Apple's plan "the wrong approach and a setback for people's privacy all over the world." He added that people had asked whether WhatsApp would adopt a similar system, and that the answer is no.
Cathcart pointed to WhatsApp's own work in the area and agreed that Apple "has long needed to do more to fight CSAM," but criticized the approach Apple announced this week, stating:
Instead of focusing on making it easy for people to report content that's shared with them, Apple has built software that can scan all the private photos on your phone -- even photos you haven't shared with anyone. That's not privacy.
Cathcart described the plan as "an Apple-built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control." He also claimed that "countries where iPhones are sold will have different definitions on what is acceptable," and raised concerns about how the system could be used in China, given what might be considered illegal there.
Apple has confirmed to iMore that Cathcart's comments miss some of the key facts regarding its new system. Cathcart states that Apple "has built software that can scan all the private photos on your phone -- even photos you haven't shared with anyone." Apple denies this, saying that its system can only detect CSAM images in iCloud Photos, and that if a user turns off iCloud Photos it simply won't work. It also can't detect any images that aren't known CSAM images, as noted by one of the project's technical validators.
Cathcart further argued that "this is an Apple-built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control." Per the details of the announcement, Apple says that the CSAM image hashes come from the NCMEC and other child safety organizations, not Apple, and that nowhere in the process can Apple add anything to the set of hashes it is given. Because the unreadable hash list ships as part of the operating system, every device carries the same set of hashes. Apple also refuted Cathcart's concern about other countries where iPhones are sold, stating that there is no way to modify or change the system for any specific region or device.
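At a high level, this kind of matching reduces to checking whether an image's hash appears in a fixed set of known hashes that ships with the OS. Apple's real system is far more involved (it uses a perceptual NeuralHash, blinded hashes, and private set intersection so the device never learns the list); the sketch below is only an illustration of the basic set-membership idea, with hypothetical function names and a plain cryptographic hash standing in:

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Illustrative stand-in: a real system uses a perceptual hash
    # (Apple's is NeuralHash) so near-duplicates still match; a
    # cryptographic digest like SHA-256 only matches exact bytes.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of known hashes shipped with the OS.
# In Apple's design this list is blinded, so the device itself
# cannot read or enumerate it.
known_hashes = {image_hash(b"known-flagged-example")}

def matches_known_database(image_bytes: bytes) -> bool:
    # Every device checks against the same set of hashes;
    # nothing outside the shipped set can produce a match.
    return image_hash(image_bytes) in known_hashes

print(matches_known_database(b"known-flagged-example"))  # True
print(matches_known_database(b"ordinary-photo"))         # False
```

The key property the sketch shows is the one Apple describes: an image that is not already in the shipped hash set can never produce a match, regardless of its content.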
Apple's recently announced measures have proven controversial, with some security experts and privacy advocates opposing the move. You can read the announcement from earlier this week here.