Apple to add child sexual abuse protections to mobile devices
In an effort to protect children from predators, Apple will update iPhones and iPads to detect and report images of child sexual abuse. The company announced yesterday that it will implement new detection software to automatically flag photos depicting the abuse and exploitation of children as soon as they are uploaded to iCloud, Apple’s proprietary cloud-based storage service.
Once the update to the iPhone and iPad operating systems is launched, flagged photos will be reported to the National Center for Missing & Exploited Children. In a statement announcing the plan to monitor photos on its devices, the ubiquitous tech company acknowledged that communication tools such as its devices are used by predators to recruit and exploit children.
Apple aims to limit the spread of child sexual abuse material with this new automated technology, which it says will launch as part of a suite of new tools for all its mobile devices. Siri, the AI personal assistant included with Apple devices, will be programmed to intervene when users search for terms related to child sexual abuse.
The iPhone’s Messages app will also use artificial intelligence to recognize dangerous messages and explicit or nude photos, send warnings to the child involved, and notify their parents.
No specific release date was announced. The use of AI to monitor users’ actions and data is often controversial, raising the question of where to draw the line between public safety and personal privacy.
SOURCE: Thai PBS World