""

'Facial' marks of totalitarianism

The National Crime Records Bureau (NCRB), under the central home ministry, has decided to introduce an Automated Facial Recognition System (AFRS). The decision has sparked much controversy, with critics pointing to various possibilities of privacy violation and police raj. Even as it is, some airports have introduced facial recognition for operational convenience, and passengers naturally benefit from not having to stand in queues to show their boarding passes and passports.

Government authorities already have access to digital data about citizens, ranging from Aadhaar to CCTV cameras. Indian Railways has estimated an expenditure of Rs 3,000 crore just to install 1.4 lakh CCTV cameras in its 11,000 trains and 8,500 stations. In addition, other public sector undertakings and private sector firms are also engaged in data gathering.

There are also other government entities, such as the Central Monitoring System (CMS) and the National Intelligence Grid (NatGrid), working to gather digital data. Institutions like NETRA (Network Traffic Analysis), working under the Center for Artificial Intelligence & Robotics, have the same sphere of activity: surveillance, gathering personal data and using it for various purposes. Needless to say, the owner and beneficiary of all this machinery is the government. The facial recognition system is one more technique in this wide and expanding chain of data collection. It is at a time when technologies like FaceApp, and many other facilities and features available via social media and the internet, have opened infinite possibilities of digital surveillance that the government is taking a new step in this direction. The central home ministry has invited tenders for installing the facial recognition system, saying it is meant to modernise the police force and its data-gathering equipment.

Gathering data in digital format has advantages beyond modernising police action. In several states, facial recognition systems installed in collaboration with private firms have reportedly helped stop crimes like child trafficking. Delhi police claims that about 3,000 missing children were located in four days. In the US, the technology is said to have helped identify soldiers killed in the civil war of the 19th century, and it has also helped spot genetic diseases in different races. There is no doubt that, in the domain of crime prevention, this software is capable of preventing crimes and locating culprits fast. But when digital arms like facial recognition software become tools that can be used in any manner, without clear red lines, the threats they pose to civil rights and democratic freedoms cannot be lost sight of. In China, digital technology of this kind is at the core of the mass surveillance system now being implemented; it is a captive system that controls all movements and travels of the targeted populations. A digital mechanism that targets criminals alone is impossible, for if it is to work, it has to gather the personal data of all citizens. The Aadhaar data of 90 per cent of the population of 135 crore are already in the database. But the violation of civil rights starts with the fact that biometric data, consisting of fingerprints, iris scans and faces, can be used without the consent of the individuals concerned.

As India introduces the AFRS today, there are no democratic laws holding it in check. The Supreme Court, in its Aadhaar case judgement of 2017, declared that privacy is a constitutionally guaranteed fundamental right of every individual. But in practice, the government employs 'digital tools' without making the necessary rules and regulations. The personal data protection bill of 2018 has not become law. In other words, a machinery with huge chances of being misused is coming into being at a time when there are no legal protections. Vidushi Marda, who has studied the subject, points out that wherever this technology was employed around the world, it was used against minorities, women and children. There is also the problem that facial recognition is not a hundred per cent foolproof. According to a report by Delhi police, out of 100 results obtained using the face recognition method, 98 were wrong; only 2 per cent of faces were identified correctly. That is to say, the chance of 'identifying' an innocent person as a criminal is as high as 98 per cent. What will be the situation if this 'facial test', with its high chance of error, is implemented under a government working in a totalitarian style, without any legal protection? This is the time for people's representatives and activists to intervene and exert pressure for adequate legislation.
