
Business Maverick

Apple to Detect, Report Sexually Explicit Child Photos on iPhone

The Apple logo on a store in San Francisco, California, U.S., on Monday, April 26, 2021. Apple Inc. is increasing its U.S. investments by 20% over the next five years, allocating $430 billion to develop next-generation silicon and spur 5G wireless innovation across nine U.S. states, after outstripping its growth expectations during the pandemic. Photographer: David Paul Morris/Bloomberg
By Bloomberg
06 Aug 2021

Apple Inc. said it will launch new software later this year that will analyze photos stored in a user’s iCloud Photos account for sexually explicit images of children and then report instances to relevant authorities. The moves quickly raised concerns among privacy advocates.

As part of new safeguards involving children, the company also announced a feature that will analyze photos sent and received in the Messages app to or from children to see if they are explicit. Apple also is adding features in its Siri digital voice assistant to intervene when users search for related abusive material. The Cupertino, California-based technology giant previewed the three new features on Thursday and said they would be put into use later in 2021.

If Apple detects a threshold of sexually explicit photos of children in a user’s account, the instances will be manually reviewed by the company and reported to the National Center for Missing and Exploited Children, or NCMEC, which works with law enforcement agencies. Apple said images are analyzed on a user’s iPhone and iPad in the U.S. before they are uploaded to the cloud.

Apple said it will detect abusive images by comparing photos with a database of known Child Sexual Abuse Material, or CSAM, provided by the NCMEC. The company is using a technology called NeuralHash that analyzes images and converts them to a hash key or unique set of numbers. That key is then compared with the database using cryptography. Apple said the process ensures it can’t learn about images that don’t match the database.
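For readers who want a concrete picture of the flow described in the two paragraphs above, here is a minimal, heavily simplified sketch. Everything in it is an assumption for illustration: NeuralHash and Apple’s cryptographic comparison protocol are proprietary, so an ordinary SHA-256 digest and a plain set lookup stand in for them, and the threshold value is invented, as Apple has said only that “a threshold” of matches triggers review.

```python
# Illustrative sketch only: shows the overall flow
# (hash -> compare against known database -> threshold -> human review).
# All names and the threshold value are hypothetical, not Apple's APIs.
from hashlib import sha256

# Stand-in for the NCMEC-provided database of known-CSAM hash keys.
KNOWN_HASHES: set[str] = set()

# Invented for the example; the real threshold is not public.
REVIEW_THRESHOLD = 30


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for NeuralHash: map an image to a fixed-size key.

    A real perceptual hash maps visually similar images to the same key;
    SHA-256 does not, and is used here only to keep the sketch runnable.
    """
    return sha256(image_bytes).hexdigest()


def count_matches(photos: list[bytes]) -> int:
    """Count photos whose key appears in the known database."""
    return sum(image_hash(p) in KNOWN_HASHES for p in photos)


def flag_for_human_review(photos: list[bytes]) -> bool:
    """Per the article, Apple learns nothing about non-matching images,
    and an account is surfaced for manual review only once the number
    of matches crosses the threshold."""
    return count_matches(photos) >= REVIEW_THRESHOLD
```

In the real system the comparison happens cryptographically on the device before upload, so the matching party never sees the photos themselves; the set lookup above is only a readable proxy for that step.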

The Electronic Frontier Foundation said that with the new tools, Apple is opening a backdoor into its highly touted privacy protections for users.

“It’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children,” the EFF said in a post on its website. “As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.”

Other researchers are likewise worried. “Regardless of what Apple’s long-term plans are, they’ve sent a very clear signal,” Matthew Green, who teaches cryptography at Johns Hopkins University, wrote on Twitter. “In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content.”

Critics said the moves don’t align with Apple’s “what happens on your iPhone, stays on your iPhone” advertising campaigns. “This completely betrays the company’s pious privacy assurances,” wrote journalist Dan Gillmor. “This is just the beginning of what governments everywhere will demand. All of your data will be fair game. If you think otherwise, you’re terminally naive.”

Apple said its detection system has an error rate of “less than one in 1 trillion” per year and that it protects user privacy. “Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account,” the company said in a statement. “Even in these cases, Apple only learns about images that match known CSAM.”

Any user who feels their account has been flagged by mistake can file an appeal, the company said. To respond to privacy concerns about the feature, Apple published a white paper detailing the technology as well as a third-party analysis of the protocol from multiple researchers.

John Clark, president and chief executive officer of NCMEC, praised Apple for the new features. “These new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material,” Clark said in a statement provided by Apple.

The feature in Messages is optional and can be enabled by parents on devices used by their children. The system will check for sexually explicit material in photos received and those ready to be sent by children. If a child receives an image with sexual content, it will be blurred out and the child will have to tap an extra button to view it. If they do view the image, their parent will be notified. Likewise, if a child tries to send an explicit image, they will be warned and their parent will receive a notification.
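As a rough illustration of the receiving-side decision logic just described (blur, confirm, then notify), here is a hypothetical sketch. None of these function names correspond to real Apple APIs, and the on-device classifier’s details are not public.

```python
# Toy sketch of the opt-in Messages flow described above.
# Every name here is invented for illustration.

def is_sexually_explicit(photo_bytes: bytes) -> bool:
    """Stand-in for Apple's on-device image classifier."""
    return False  # placeholder


def show_blurred_with_warning(photo_bytes: bytes) -> None:
    """Placeholder for the blurred preview plus warning screen."""
    print("Image blurred; tap again to view")


def child_taps_to_view() -> bool:
    """Placeholder for the extra confirmation tap in the UI."""
    return False


def notify_parent() -> None:
    print("Parent notified that the image was viewed")


def handle_received_photo(photo_bytes: bytes, *, child_account: bool,
                          feature_enabled_by_parent: bool) -> None:
    """Blur flagged photos for children; notify a parent only if the
    child taps through and views the image, per the article."""
    if not (child_account and feature_enabled_by_parent):
        return  # the feature is optional and applies only to children
    if is_sexually_explicit(photo_bytes):
        show_blurred_with_warning(photo_bytes)
        if child_taps_to_view():
            notify_parent()
```

The sending side follows the same shape in reverse: the warning fires before the explicit image leaves the device, and the parent is notified if the child proceeds.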

Apple said the Messages feature uses on-device analysis and the company can’t view message contents. The feature applies to Apple’s iMessage service and other protocols like Multimedia Messaging Service.

The company is also rolling out two related features to Siri and search. The systems will be able to respond to questions about reporting child exploitation and abusive images and provide information on how users can file reports. The second feature warns users who conduct searches for material that is abusive to children. The Messages and Siri features are coming to the iPhone, iPad, Mac and Apple Watch, the company said.
