Apple's Child Safety measures draw flak from Big Tech, experts
Apple recently announced sweeping policy changes intended to keep children safer while using products in its ecosystem. While the intention is noble, the implementation has drawn criticism from eminent figures, including WhatsApp head Will Cathcart, privacy advocate Edward Snowden, and American political figures. An online petition asking Apple to "reconsider its technology rollout" had amassed over 5,000 signatures at the time of publishing.
Messages app will use on-device machine learning to flag explicit images
According to the Communication Safety section of Apple's Child Safety announcement, the Messages app will use on-device machine learning to detect sexually explicit images and can optionally alert parents if their child under 13 years of age sends or views such content. Separately, a CSAM-detection feature will compare hashes of users' iCloud Photos against hashes of known Child Sexual Abuse Material (CSAM), an on-device analysis that has given rise to privacy concerns.
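The hash-matching idea described above can be sketched in a few lines. Note this is a simplified illustration only: Apple's actual system uses NeuralHash, a perceptual hash combined with cryptographic techniques such as private set intersection, not the exact-match SHA-256 lookup shown here, and all names and values below are hypothetical.

```python
import hashlib

def image_hash(data: bytes) -> str:
    # Stand-in for a real perceptual hash (Apple uses NeuralHash);
    # SHA-256 is used here purely for illustration.
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of hashes of already-identified CSAM
# (supplied by child-safety organizations in Apple's design).
known_hashes = {
    image_hash(b"known-image-1"),
    image_hash(b"known-image-2"),
}

def flag_if_known(photo: bytes) -> bool:
    # On-device check: compare the photo's hash against the database,
    # so the matching happens without inspecting the photo's contents.
    return image_hash(photo) in known_hashes

print(flag_if_known(b"known-image-1"))  # True: hash is in the database
print(flag_if_known(b"holiday-photo"))  # False: no match
```

Because only hashes are compared, a match reveals nothing about non-matching photos; the privacy debate centers on who controls the database of hashes being matched against.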
Siri, Search will also restrict users' access to CSAM content
Apple claims it won't be able to read encrypted communications. Additionally, the company's policy changes state that Apple will provide law enforcement agencies with information on collections of CSAM in iCloud Photos. Siri and Search will also prevent users from searching for CSAM-related topics, and will offer children and parents help if they encounter such content online.
What if the tool starts looking for other material?
The Apple Privacy Letter petition noted that Apple's policy implementation would undo "decades of work by technologists, academics, and policy advocates." An internal Apple memo even acknowledged that people would be "worried about the implications" of the system. People's concern isn't about Apple's intentions but about false accusations and pressure from government or private entities that could turn the policy into an overreaching surveillance tool.
Facebook-owned WhatsApp didn't spare the opportunity to bash Apple
WhatsApp head Will Cathcart tweeted that the messaging service won't adopt the safety measures, calling Apple's approach "very concerning." He emphasized WhatsApp's existing system, which helped it report over 400,000 cases of child exploitation in 2020. Notably, the Facebook subsidiary had ample reason to bash Apple, since the latter's advertising policy has taken a significant toll on the social media company's advertising revenue.
Electronic Frontier Foundation's statement thoroughly blasted Apple's plan
Apple's filtering of iMessage and iCloud is not a slippery slope to backdoors that suppress speech and make our communications less secure. We’re already there: this is a fully-built system just waiting for external pressure to make the slightest change. https://t.co/f2nv062t2n— EFF (@EFF) August 5, 2021
NSA whistleblower Snowden's take on Apple's Child Safety measures
Apple plans to modify iPhones to constantly scan for contraband:— Edward Snowden (@Snowden) August 5, 2021
“It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of our phones and laptops,” said Ross Anderson, professor of security engineering. https://t.co/rS92HR3pUZ
Politician Brianna Wu called this 'the worst idea in Apple history'
This is the worst idea in Apple history, and I don't say that lightly.— Brianna Wu (@BriannaWu) August 5, 2021
It destroys their credibility on privacy. It will be abused by governments. It will get gay children killed and disowned. This is the worst idea ever. https://t.co/M2EIn2jUK2
Following the Epic v. Apple lawsuit, Epic's CEO also shared his thoughts
I spent another hour searching, and there indeed appears to be no way to delete a useless at-icloud-dot-com email account that Apple used to force everyone to take, except for deleting your entire Apple account and losing everything you've bought. Lock-in indeed! pic.twitter.com/JT5KOta5jA— Tim Sweeney (@TimSweeneyEpic) August 7, 2021
Apple memo called criticism 'the screeching voices of the minority'
Snowden retweeted a post by Eva Galperin, the Electronic Frontier Foundation's (EFF) director of cybersecurity. The post showed an internal Apple memo from August 6 that called the backlash "the screeching voices of the minority." On Twitter, Galperin explained that while the system scans iCloud for CSAM images today, tomorrow the database could be expanded to anything else, such as "memes critical of the Chinese government."
A disturbing internal Apple memo circulated yesterday
Apple distributed this internal memo this morning, dismissing their critics as "the screeching voices of the minority."— Eva (@evacide) August 6, 2021
I will never stop screeching about the importance of privacy, security, or civil liberties. And neither should you. pic.twitter.com/lLDfxEUIXL
Apple will push these changes with iOS 15, macOS Monterey
If Apple chooses to proceed with these new policies, it will implement them via upcoming software updates across its ecosystem: iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. A timeline for the rollout remains unknown.