Meta launches digital platform to tackle 'revenge porn' in India

New Delhi: Meta is hoping to proactively tackle 'revenge porn', the non-consensual sharing of intimate images of women, in India with a new platform called StopNCII.org.

The platform, built in partnership with the United Kingdom-based 'Revenge Porn Helpline', will let women flag intimate images and videos that could be uploaded to Facebook and Instagram without their consent, The Indian Express reported.

Stressing the alarming rise of such abuse, which has driven some victims to extreme steps, Meta's Global Safety Policy Director Karuna Nain said that StopNCII.org acts as a bank where victims can share 'hashes' of their visuals. A hash is a unique digital fingerprint generated from the visual being shared.

The hash is then shared with Facebook and Instagram so that when someone tries to upload the visual, the hashes match and the upload gets flagged. Meta assures that the visual never leaves the victim's device; only the hash is uploaded.
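The article does not describe the hashing scheme itself. As a rough illustration only, the sketch below assumes a simple on-device cryptographic hash of the image bytes and a server-side set of reported hashes; the real platform would likely use a perceptual hash that tolerates minor changes, and the function and variable names here are hypothetical.

```python
import hashlib

def compute_image_hash(path: str) -> str:
    """Compute a fingerprint of an image file on the victim's device.
    Illustrative only: a production system would likely use a perceptual
    hash so that visually similar images produce matching fingerprints."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Hashes submitted by victims; the images themselves are never uploaded.
reported_hashes = {compute_image_hash("reported_image.jpg")}

def should_flag_upload(upload_path: str) -> bool:
    """At upload time, flag the file for review if its hash matches a reported one."""
    return compute_image_hash(upload_path) in reported_hashes
```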

Limitations
But the StopNCII.org website specifies that the visuals in question need to be in an intimate setting, such as the victim being naked, showing their genitals, engaged in sexual acts or poses, or wearing underwear in compromising positions. The platform is also limited to women over 18, so victims of child pornography cannot rely on it. Nain explained that, for visuals involving children, select NGOs are authorised and have legal cover to work on such cases, which is why StopNCII is limited to adult women.

A problem also arises if the image is uploaded after being altered: the altered copy produces a different hash, the required match does not happen, and the visual is not flagged by automatic detection. Nain said that the victim should therefore keep a lookout for such altered copies and upload their hashes as well.
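To illustrate why alterations can defeat the matching, under the same simplified byte-hash assumption as the earlier sketch, even a trivial change to the file yields a completely different fingerprint:

```python
import hashlib

original = b"...image bytes..."
altered = original + b"\x00"  # a trivial change, e.g. from re-saving or cropping

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(altered).hexdigest())
# The two fingerprints differ entirely, so the altered upload would not match
# the reported hash and would need to be reported separately.
```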

Nain further said that uploading a hash does not guarantee that the visual gets stopped, since Facebook and Instagram review teams will go through the flagged content and check whether it violates their policies. Meta also does not promise a fixed timeframe to resolve such cases.

Right now, StopNCII.org works only with Meta's platforms. Meta hopes that other tech players will join, which would make the process easier for victims.
