
Apple's plan to scan iPhones for child sexual abuse images draws mixed responses


With the intention of curbing child sexual abuse, Apple has decided to scan photos stored in the libraries of iPhones in the US, drawing mixed responses. While child protection groups welcomed the move, privacy campaigners warn the practice could have dangerous ramifications. The plan is not confined to scanning photos: the contents of end-to-end encrypted messages will also be subject to scrutiny, something Apple is said to be attempting for the first time.

Images will be scanned by Apple's tool, called neuralMatch, before a user uploads them to Apple's iCloud Photos online storage. The tool compares each photo against known child abuse imagery; if a match is found, Apple staff will review the images and, after confirming their abusive nature, disable the user's account and send a notification to the National Center for Missing and Exploited Children, according to a report published in The Guardian.
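For readers who want a concrete picture of that workflow, the Swift sketch below models it step by step. It is illustrative only: Apple has not published how neuralMatch works, so every name here is hypothetical, and an ordinary cryptographic hash stands in for the perceptual hash a real system would use.

```swift
import Foundation
import CryptoKit

// A minimal sketch of the matching flow described above. Apple's actual
// neuralMatch algorithm is not public; this stand-in uses a plain
// SHA-256 digest where the real system uses a perceptual hash that
// survives resizing and re-encoding. All names are hypothetical.

struct MatchReport {
    let imageID: String
    let matchedFingerprint: String
}

// Placeholder: in the real system, fingerprints of known abuse imagery
// are supplied by child-safety organisations as opaque hashes, never
// as images.
func loadKnownFingerprints() -> Set<String> {
    return []
}

let knownFingerprints = loadKnownFingerprints()

// Reduce an image to a fixed-length fingerprint string.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Runs on the device before a photo is uploaded to cloud storage.
// Returns nil when there is no match, and the upload proceeds normally.
func screenBeforeUpload(imageID: String, imageData: Data) -> MatchReport? {
    let fp = fingerprint(of: imageData)
    guard knownFingerprints.contains(fp) else { return nil }
    // A match is only escalated for human review; disabling the account
    // and notifying the National Center for Missing and Exploited
    // Children happen after staff confirm the image's nature.
    return MatchReport(imageID: imageID, matchedFingerprint: fp)
}
```

Note that in this design the device never "sees" the abusive images themselves, only their fingerprints, which is why copies of the same known picture match while a genuinely new photo does not.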

The matching tool only detects known images, so parents who take pictures of their children in the bath have no cause for worry. Experts warn, however, that the tool, which detects abusive images using mathematical fingerprints, could be put to use for other purposes.

Experts also flagged that the tool could be used to implicate innocent people in child abuse crimes by sending them seemingly innocuous images engineered to match the fingerprints of images deemed abusive.

Besides, the matching tool could become a handy weapon for governments to monitor dissidents or protesters, a demand the company might not be able to refuse.

Though the technology of matching digital fingerprints of known child sexual abuse images has been shared among major tech companies, including Google, Facebook and Microsoft, for years, and Apple has already used it to scan user files stored in its iCloud service, the latest move is said to be the first time a tech company is scanning on-device.

Alongside the neuralMatch technology, Apple plans to scan users' encrypted messages as they are sent and received using iMessage. An AI-based tool will attempt to automatically identify sexually explicit images, enabling parents to turn on automatic filters for their children's inboxes.
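The iMessage feature is a different mechanism from the fingerprint matching above: rather than comparing against known images, it classifies new ones. The Swift sketch below shows the general shape of such an on-device filter. The classifier protocol, score threshold and type names are all assumptions; Apple's model and the APIs behind this feature are internal and not public.

```swift
import Foundation

// Conceptual sketch of the parent-controlled iMessage filter described
// above. The classifier is a stand-in: Apple's on-device model and its
// interfaces are not published, so everything here is hypothetical.

struct IncomingImage {
    let data: Data
    let senderID: String
}

protocol ExplicitImageClassifier {
    /// Returns a score in 0...1; higher means more likely explicit.
    func score(_ image: IncomingImage) -> Double
}

enum FilterDecision {
    case deliverNormally
    case blurAndWarn   // the child sees a warning before viewing
}

struct InboxFilter {
    let classifier: ExplicitImageClassifier
    let parentalFilterEnabled: Bool
    let threshold: Double = 0.9   // hypothetical cutoff

    func decide(for image: IncomingImage) -> FilterDecision {
        guard parentalFilterEnabled else { return .deliverNormally }
        // Classification runs on the device itself, so the message
        // content is never sent anywhere in unencrypted form.
        return classifier.score(image) >= threshold
            ? .blurAndWarn
            : .deliverNormally
    }
}
```

Because the classification happens entirely on the phone, the filter can operate on end-to-end encrypted messages without the messages themselves ever being exposed to a server.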

TAGS: Apple, Child Sex Abuse, Photo Scanning