Apple will scan US iPhones for photos of child sexual abuse: NPR

This May 21, 2021 photo shows the Apple logo displayed on a Mac Pro desktop computer in New York. Apple plans to scan US iPhones for images of child abuse, which has garnered applause from child protection groups, but security researchers fear the system could be abused by governments that want to monitor their citizens. Mark Lennihan / AP

Apple has revealed plans to scan US iPhones for images of child sexual abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused, including by governments looking to monitor their citizens.

The tool, called “neuralMatch,” detects known images of child sexual abuse by scanning photos before they are uploaded to iCloud. If it finds a match, a human reviewer checks the image. If child sexual abuse material is confirmed, the user’s account is deactivated and the National Center for Missing and Exploited Children is notified.

Separately, Apple plans to scan users’ encrypted messages for sexually explicit content as a child safety measure, which has also alarmed privacy advocates.

The detection system flags only images that are already in the center’s database of known child pornography. Parents taking innocent photos of a child in the bathtub probably need not worry. But researchers say the matching tool, which does not “see” such images but only the mathematical “fingerprints” that represent them, could be put to more nefarious purposes.
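As a rough sketch of the general idea only (this is not Apple’s actual neuralMatch algorithm, which reportedly uses a perceptual hash so that resized or re-encoded copies of a known image still match; the database entry and function names below are hypothetical), matching amounts to comparing a fingerprint derived from each photo against a set of fingerprints of known abuse imagery and flagging the photo for human review only when the fingerprints coincide:

```python
# Conceptual sketch only: a toy "fingerprint" lookup using an ordinary
# cryptographic hash. Apple's real system reportedly uses a perceptual
# hash (NeuralHash), so this illustrates the matching step, not the
# hashing itself. The database contents below are hypothetical.
import hashlib
from pathlib import Path

# Hypothetical fingerprints of known abuse images, as supplied by a
# child-safety organization.
KNOWN_FINGERPRINTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_path: Path) -> str:
    """Derive a fingerprint from the raw image bytes."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def should_flag_for_review(image_path: Path) -> bool:
    """Flag a photo for human review only if its fingerprint is already
    in the database of known material; all other photos pass untouched."""
    return fingerprint(image_path) in KNOWN_FINGERPRINTS
```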

Matthew Green, a leading cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly harmless images designed to trigger matches for child pornography. That could fool Apple’s algorithm and alert law enforcement agencies. “Researchers could do this pretty easily,” he said of the ability to trick such systems.


Other abuses could include government surveillance of dissidents or protesters. “What if the Chinese government says, ‘Here is a list of files to look for,'” Green asked. “Does Apple say no? I hope they say no, but their technology won’t say no.”

Apple has been under pressure to allow increased surveillance of encrypted data

Tech companies including Microsoft, Google, and Facebook have for years shared digital fingerprints of known child sexual abuse images. Apple has used those fingerprints to scan user files stored in its iCloud service, which is not as securely encrypted as data stored on the device itself, for child pornography.

Apple has been under government pressure for years to allow increased surveillance of encrypted data. Coming up with the new security measures required Apple to perform a delicate balancing act between cracking down on child exploitation and keeping its high-profile commitment to protecting the privacy of its users.

But a dejected Electronic Frontier Foundation, the online civil liberties pioneer, called Apple’s privacy compromise “a shocking U-turn for users who have relied on the company’s leadership in privacy and security.”

The computer scientist who more than a decade ago invented PhotoDNA, the technology used by law enforcement to identify child pornography online, acknowledged the potential for abuse of Apple’s system but said it was far outweighed by the imperative of fighting child sexual abuse.

“Is it possible? Of course. But is it something I worry about? No,” said Hany Farid, a researcher at the University of California, Berkeley, who argues that plenty of other programs designed to secure devices from various threats have not seen “this type of mission creep.” WhatsApp, for example, provides users with end-to-end encryption to protect their privacy, but also employs a system for detecting malware and warns users not to click on harmful links.

Apple was one of the first major companies to introduce “end-to-end” encryption, which encrypts messages so that only their senders and recipients can read them. However, law enforcement agencies have long put pressure on the company to gain access to this information to investigate crimes such as terrorism or the sexual exploitation of children.

Apple said the latest changes will be introduced this year as part of updates to its operating software for iPhones, Macs and Apple Watches.

“Apple’s expanded protection for children is a turning point,” said John Clark, president and CEO of the National Center for Missing and Exploited Children, in a statement. “With so many people using Apple products, these new safety measures have life-saving potential for children.”

Apple says the changes won’t break user privacy

Julia Cordua, CEO of Thorn, said Apple’s technology “balances the need for privacy with digital security for children.” Thorn, a nonprofit founded by Demi Moore and Ashton Kutcher, uses technology to protect children from sexual abuse by identifying victims and partnering with technology platforms.

But in a blistering critique, the Washington-based nonprofit Center for Democracy and Technology called on Apple to abandon the changes, which it said effectively destroy the company’s guarantee of end-to-end encryption. Scanning messages on phones or computers for sexually explicit content effectively breaks that security, it said.

The organization also questioned Apple’s technology for distinguishing between dangerous content and something as tame as art or a meme. Such technologies are notoriously error-prone, CDT said in an emailed statement.

Apple denies that the changes amount to a backdoor that degrades its encryption, saying they are carefully considered innovations that do not disturb users’ privacy but rather strongly protect it.

Separately, Apple said its messaging app will use on-device machine learning to identify and blur sexually explicit photos on children’s phones, and can also warn parents of younger children via text message. It also said its software would “step in” when users try to search for topics related to child sexual abuse.

To receive warnings about sexually explicit images on their children’s devices, parents will have to enroll their child’s phone. Children over the age of 13 can unenroll, meaning parents of teenagers will not receive notifications.

Apple said none of the features would compromise the security of private communications or notify the police.
