Report flags AI-driven hate and surveillance ahead of India’s AI Impact Summit 2026

New Delhi: A new report warns that India’s AI governance is faltering as the India AI Impact Summit 2026 approaches in the capital, spotlighting the gap between promises of inclusive technology and its real-world harms to minorities.

Titled India AI Impact Summit 2026: AI Governance at the Edge of Democratic Backsliding, the document from the Center for the Study of Organized Hate (CSOH) and the Internet Freedom Foundation (IFF) details AI’s role in fueling communal narratives. It cites content from the Bharatiya Janata Party (BJP), including a recent post on X by the party’s Assam unit, published just days before the summit, featuring an AI-generated video titled “No Mercy” that depicted Chief Minister Himanta Biswa Sarma shooting at two Muslim-appearing men. One of the figures was created from a morphed image of opposition leader Gaurav Gogoi wearing a skullcap; the video was deleted amid backlash.

Beyond propaganda, the report exposes unchecked AI surveillance by law enforcement, including facial recognition, predictive policing, and other tools deployed without oversight or judicial approval. Maharashtra Chief Minister Devendra Fadnavis has revealed plans for an AI system, developed in collaboration with IIT Bombay, to detect “illegal Bangladeshis” and Rohingya refugees through speech patterns, risking the targeting of Bengali-speaking Muslims and migrant workers from Assam and West Bengal. The report also references cases of such citizens being deported to Bangladesh without due process.

India’s November 2025 AI Governance Guidelines draw criticism for relying on voluntary compliance, favoring innovation over safeguards, and ignoring risks to religious minorities, Dalits, Bahujans, Adivasis, and LGBTQ+ groups. Without mandatory transparency requirements, affected communities have no effective means to challenge harms.

Ahead of the summit, themed “Democratizing AI and Bridging the AI Divide” under the banner of “People, Planet and Progress,” the report calls for enforceable regulations across the AI lifecycle, bans on mass surveillance and predictive policing, independent oversight of public-sector AI, inclusion of affected communities, and binding rules in place of voluntary pledges.
