When Apple introduced its new child safety protection tools this week, opinions instantly divided. While some see the move as a major step toward protecting children, others believe it will simply create a backdoor for governments to access people’s iPhones.
Now, WhatsApp CEO Will Cathcart is the latest to join those who believe Apple’s new Child Safety tools could be harmful.
It’s not the first time Cathcart has criticized Apple. A couple of weeks ago, the WhatsApp CEO called out Apple over the NSO malware in an interview with The Guardian, saying the company should “be loud, join in” rather than insisting the attack wouldn’t affect many of its users.
With another controversy brewing, Cathcart thinks the approach Apple is taking “introduces something very concerning into the world” and says WhatsApp won’t adopt a similar system. That said, it’s worth keeping in mind that Facebook has reportedly wanted the ability to read people’s WhatsApp messages for targeted ads.
Here’s what Will Cathcart said:
“Apple has long needed to do more to fight CSAM, but the approach they are taking introduces something very concerning into the world. Instead of focusing on making it easy for people to report content that’s shared with them, Apple has built software that can scan all the private photos on your phone — even photos you haven’t shared with anyone. That’s not privacy.
(…) Can this scanning software running on your phone be error proof? Researchers have not been allowed to find out. Why not? How will we know how often mistakes are violating people’s privacy? What will happen when spyware companies find a way to exploit this software? Recent reporting showed the cost of vulnerabilities in iOS software as is. What happens if someone figures out how to exploit this new system?”
Although Cathcart is another prominent voice against Apple in this case, joining others such as Edward Snowden and the Electronic Frontier Foundation, an internal memo obtained by 9to5Mac shows Apple addressing concerns around the new photo scanning features:
Keeping children safe is such an important mission. In true Apple fashion, pursuing this goal has required deep cross-functional commitment, spanning Engineering, GA, HI, Legal, Product Marketing and PR. What we announced today is the product of this incredible collaboration, one that delivers tools to protect children, but also maintain Apple’s deep commitment to user privacy.
We’ve seen many positive responses today. We know some people have misunderstandings, and more than a few are worried about the implications, but we will continue to explain and detail the features so people understand what we’ve built. And while a lot of hard work lies ahead to deliver the features in the next few months, I wanted to share this note that we received today from NCMEC. I found it incredibly motivating, and hope that you will as well.
What do you think of Apple’s announcements around expanded protections for child safety? Let us know down in the comments.
- Apple announces new protections for child safety: iMessage features, iCloud Photo scanning, more
- Comment: Apple’s child protection measures get mixed reactions from experts
- In internal memo, Apple addresses concerns around new Photo scanning features, doubles down on the need to protect children
- Apple says any expansion of CSAM detection outside of the US will occur on a per-country basis