Apple's plan to scan iPhones for child sexual abuse images draws mixed responses

In an effort to curb child sexual abuse, Apple has decided to scan photos stored in the libraries of iPhones in the US, drawing mixed responses. While child protection groups welcomed the move, privacy campaigners warn the practice could have dangerous ramifications. The plan is not confined to photos: the contents of end-to-end encrypted messages will also come under scrutiny, something Apple is said to be attempting for the first time.

Images will be scanned by Apple's tool, called neuralMatch, before a user uploads them to Apple's iCloud Photos online storage. The tool compares photos against a database of known child abuse imagery; if matches are found, Apple staff will review the images and, after confirming their abusive nature, disable the user's account and notify the National Center for Missing and Exploited Children, according to a report published in The Guardian.
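The workflow described amounts to fingerprint matching against a known database, followed by human review. The Python sketch below illustrates that flow only; the hash function, review threshold, and review hook are hypothetical placeholders standing in for Apple's undisclosed internals, not its actual API.

```python
import hashlib
from dataclasses import dataclass

# Hypothetical database of fingerprints of known abuse imagery.
# Apple's system uses perceptual digests (neuralMatch), which match
# visually similar images; SHA-256 stands in here for simplicity.
KNOWN_FINGERPRINTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

REVIEW_THRESHOLD = 30  # hypothetical match count before human review

@dataclass
class Account:
    user_id: str
    match_count: int = 0
    disabled: bool = False

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash such as neuralMatch."""
    return hashlib.sha256(image_bytes).hexdigest()

def scan_before_upload(account: Account, image_bytes: bytes) -> bool:
    """Check an image against known fingerprints before iCloud upload.

    Returns True if the upload may proceed. Once an account crosses
    the review threshold it is flagged for human review.
    """
    if fingerprint(image_bytes) in KNOWN_FINGERPRINTS:
        account.match_count += 1
        if account.match_count >= REVIEW_THRESHOLD:
            flag_for_human_review(account)
    return not account.disabled

def flag_for_human_review(account: Account) -> None:
    # Placeholder: a reviewer confirms the matches, disables the
    # account, and files a report with the National Center for
    # Missing and Exploited Children (NCMEC).
    account.disabled = True
```

Note that matching happens before upload, which is what makes this on-device scanning rather than the server-side scanning tech companies have done for years.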

Because the tool only matches known images rather than classifying new ones, parents who photograph their children in the bath have no cause for worry. Experts warn, however, that a tool that detects abusive images using mathematical fingerprints could be put to other uses.

Experts also flagged that the tool could be used to implicate innocent people in child abuse crimes by sending them seemingly innocuous images engineered to match the fingerprints of known abusive material.

The matching tool could also become a convenient instrument for governments to monitor dissidents or protesters, a demand the company might find difficult to refuse.

Major tech companies, including Google, Facebook and Microsoft, have for years shared digital fingerprints of known child sexual abuse images, and Apple has used the technology to scan user files stored in its iCloud service. The latest move, however, is said to be the first time a tech company will perform such scanning on-device.

Alongside the neuralMatch technology, Apple plans to scan users' encrypted messages as they are sent and received using iMessage. An AI-based tool will attempt to automatically identify sexually explicit images, enabling parents to turn on automatic filters for their children's inboxes.
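At a high level, the iMessage feature is an on-device classifier gating incoming attachments when a parent enables the filter. A minimal sketch follows, assuming a hypothetical classify callable and confidence threshold; Apple has not published the actual interface.

```python
from typing import Callable

# Hypothetical stand-in for an on-device model that returns the
# probability that an image is sexually explicit.
Classifier = Callable[[bytes], float]

EXPLICIT_THRESHOLD = 0.9  # hypothetical confidence cut-off

def filter_incoming_attachment(
    image_bytes: bytes,
    classify: Classifier,
    parental_filter_enabled: bool,
) -> bool:
    """Decide whether to hide an incoming attachment behind a warning.

    The check runs entirely on-device, so message content is never
    sent off the phone for analysis.
    """
    if not parental_filter_enabled:
        return False
    return classify(image_bytes) >= EXPLICIT_THRESHOLD
```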

TAGS: Apple, Child Sex Abuse, Photo Scanning