Apple said Thursday that iPhones and iPads will soon start detecting images containing child sexual abuse and reporting them as they are uploaded to its online storage in the United States, a move privacy advocates say raises concerns.

“We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material (CSAM),” Apple said in an online post.
New technology will allow software powering Apple mobile devices to match abusive photos on a user’s phone against a database of known CSAM images provided by child safety organizations, then flag the images as they are uploaded to Apple’s online iCloud storage, according to the company.
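Apple has not published the code behind this matching step, but the general pattern the company describes, comparing a fingerprint of each photo against a list of known fingerprints before upload, can be illustrated with a minimal sketch. Everything below is hypothetical: the fingerprint list and file paths are placeholders, and an ordinary SHA-256 digest stands in for the perceptual-style fingerprint such systems typically use.

```python
# Minimal sketch of pre-upload database matching; NOT Apple's implementation.
# Real systems use perceptual fingerprints that survive resizing and
# re-encoding; an ordinary SHA-256 digest stands in for one here.
import hashlib
from pathlib import Path

# Hypothetical fingerprints supplied by child-safety organizations.
KNOWN_FINGERPRINTS = {
    "9b2f6d4c...",  # placeholder digest values
}

def fingerprint(image_path: Path) -> str:
    """Compute a stand-in fingerprint for an image file."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def should_flag(image_path: Path) -> bool:
    """Return True if the photo matches the known database and should be
    flagged as it is uploaded to cloud storage."""
    return fingerprint(image_path) in KNOWN_FINGERPRINTS
```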
However, several digital rights organizations say the tweaks to Apple’s operating systems create a potential “backdoor” into gadgets that could be exploited by governments or other groups.
Apple counters that it will not have direct access to the images and has stressed the steps it has taken to protect privacy and security.
The Silicon Valley-based tech giant said the matching of photos would be “powered by a cryptographic technology” to determine “if there is a match without revealing the result,” unless the image was found to contain depictions of child sexual abuse.

Apple will report such images to the National Center for Missing and Exploited Children, which works with police, according to a statement by the company.
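The statement does not spell out the cryptography involved, though Apple’s accompanying technical materials describe a private set intersection protocol. As a rough illustration of how two parties can test for a match without exchanging raw values, the toy sketch below uses commutative blinding (a Diffie–Hellman-style construction). The modulus, fingerprints, and message flow are simplified for illustration and are not Apple’s protocol.

```python
# Toy commutative-blinding match; an illustration of testing for membership
# without revealing raw values, NOT Apple's published protocol.
import hashlib
import secrets

# Prime modulus for the toy group; real deployments use vetted elliptic-curve
# groups rather than raw modular exponentiation.
P = 2**255 - 19

def to_group(element: bytes) -> int:
    """Hash an arbitrary identifier (e.g. an image fingerprint) into the group."""
    return int.from_bytes(hashlib.sha256(element).digest(), "big") % P or 1

def blind(value: int, secret_exp: int) -> int:
    """Blind a group element with a private exponent."""
    return pow(value, secret_exp, P)

# Each side picks a private exponent it never shares.
server_secret = secrets.randbelow(P - 2) + 1   # database holder
client_secret = secrets.randbelow(P - 2) + 1   # device

database = {b"known-image-1", b"known-image-2"}  # hypothetical fingerprints
device_photo = b"known-image-2"                  # fingerprint of one upload

# The server publishes its set blinded under its own secret.
server_blinded = {blind(to_group(x), server_secret) for x in database}

# The client blinds its item and sends it over; the server blinds it again.
client_blinded = blind(to_group(device_photo), client_secret)
double_blinded = blind(client_blinded, server_secret)

# Blinding commutes, so the client can test for a match by applying its own
# secret to the server's blinded set; neither side sees the other's raw values.
match = double_blinded in {blind(x, client_secret) for x in server_blinded}
print("match" if match else "no match")
```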
India McKinney and Erica Portnoy of the digital rights group Electronic Frontier Foundation said in a post that “Apple’s compromise on end-to-end encryption may appease government agencies in the United States and abroad, but it is a shocking about-face for users who have relied on the company’s leadership in privacy and security.”