Apple defends privacy of new tools to prevent child sexual abuse

PHOTO: Apple responds to privacy concerns. (via Pixabay)

Facing backlash over the privacy intrusions critics say its newly proposed update would create, Apple is defending its plan to roll out updates that will detect and report photos of child sexual abuse in its cloud storage and messaging services. Software chief Craig Federighi said in an interview with The Wall Street Journal yesterday that the project has been widely misunderstood.

The new update to Apple’s mobile devices like iPhones and iPads would include two new features in the United States. The first will analyse and identify any photos uploaded to Apple’s iCloud storage service that depict child sexual abuse. The second will use machine learning on a child’s device to recognise sexually explicit photos sent or received in Messages, Apple’s messaging app, and warn the child as well as their parent.
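To illustrate the decision flow of that second feature, here is a minimal sketch in Python. Apple has not published its model or thresholds, so the classifier `explicit_score()`, the `notify_parent()` hook, and the 0.9 threshold below are hypothetical placeholders, not Apple's implementation.

```python
# A minimal sketch of the Messages safety flow described above, assuming a
# hypothetical on-device classifier. All names and thresholds here are
# illustrative; Apple has not published the details of its model.

from dataclasses import dataclass

@dataclass
class Photo:
    data: bytes

def explicit_score(photo: Photo) -> float:
    """Placeholder for an on-device ML model that returns the estimated
    probability that a photo is sexually explicit. In the real system the
    analysis runs entirely on the device, so the photo never leaves it."""
    return 0.0  # stub

def notify_parent() -> None:
    """Hypothetical hook for alerting a parent's linked account."""
    print("Parent notified of a flagged photo.")

def handle_incoming_photo(photo: Photo, child_account: bool,
                          parent_notifications: bool,
                          threshold: float = 0.9) -> str:
    """Decide what to do with a photo received in Messages."""
    if not child_account or explicit_score(photo) < threshold:
        return "show"           # normal delivery
    if parent_notifications:
        notify_parent()         # only if the family has opted in
    return "blur_and_warn"      # warn the child before the photo is shown
```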


While protecting children from sexual abuse enjoys nearly universal support, Apple drew significant backlash from privacy advocates and users concerned that the technology behind this protection could itself be abused. But the Silicon Valley company says security and confidentiality are not affected by these new features.

In fact, Federighi says Apple’s goal was to develop ways to offer this protection with more privacy than was ever possible before, and without looking at people’s photos. Apple released detailed technical documentation explaining that the system was designed by cryptography experts with the specific goal of preserving privacy.


The system will use AI to analyse photos without any human setting eyes on them, and any images determined to violate child sexual abuse laws would be transmitted straight to the non-profit National Center for Missing & Exploited Children. Apple would not set the parameters of this analysis itself, but would instead leave that determination to an international coalition of trusted groups, a safeguard against the technology being abused to violate privacy.

The technology mainly works by automatically comparing digital fingerprints, or hashes, of uploaded photos against those registered in a database of known child sexual abuse images, finding matches without anyone actually viewing the images.
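To make that concrete, here is a minimal sketch of hash-based matching using a simple "average hash", a well-known perceptual fingerprinting technique. Apple's actual system uses its own proprietary hash (NeuralHash) wrapped in cryptographic protocols, so the example fingerprints in `known_hashes` and the matching threshold below are illustrative assumptions, not Apple's implementation.

```python
# A minimal sketch of fingerprint-based image matching, for illustration
# only. This simplified "average hash" demonstrates the general idea of
# comparing compact fingerprints rather than viewing image content; the
# database entries below are hypothetical values, not real data.

from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Reduce an image to a tiny greyscale grid and encode each pixel
    as one bit: 1 if brighter than the grid's mean, else 0."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count the bits on which two fingerprints differ."""
    return bin(a ^ b).count("1")

# Hypothetical database of fingerprints of known illegal images.
known_hashes = {0x8F3A5C0912D477B1, 0x00FF00FF12345678}

def matches_known_image(path: str, threshold: int = 5) -> bool:
    """Flag an upload if its fingerprint is within `threshold` bits of any
    database entry. No human inspects the image at this stage."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= threshold for k in known_hashes)
```

Because the comparison operates on fingerprints rather than pixels, near-duplicates of a known image can be flagged while photos outside the database reveal nothing about their content.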

Privacy and encryption experts argue that however noble the motivations might be, an update like this opens a backdoor into Apple user privacy that could later be exploited by hackers or governments for mass surveillance or access to individuals' data. Apple has stood up to previous attempts by governments to demand a way to access private user data, and even has a page on its website committing to never allow backdoor access to user data:


“Apple has never created a backdoor or master key to any of our products or services. We have also never allowed any government direct access to Apple servers. And we never will.”

SOURCE: Bangkok Post


Neill Fronde

Neill is a journalist from the United States with more than 10 years of broadcasting experience and work in national news and magazine publications. He graduated with a degree in journalism and communications from the University of California and has been living in Thailand since 2014.
