An invasive and inefficient tool

Use of facial recognition technology in law enforcement can have disastrous consequences
The Automated Facial Recognition System (AFRS) recently proposed by the Ministry of Home Affairs is geared towards modernising the police force, identifying criminals, and enhancing information sharing between police units across the country. The AFRS will use images from sources such as CCTV cameras, newspapers, and raids to identify criminals against existing records in the Crime and Criminal Tracking Network and Systems (CCTNS) database. The Home Ministry has clarified that this will not violate privacy, as it will only track criminals and will be accessed only by law enforcement.

However, a closer look at facial recognition systems and India's legal framework reveals that a system like the AFRS will not only create a biometric map of our faces, but also track, classify, and possibly anticipate our every move. Technically speaking, it is impossible for the AFRS to be used only to identify, track and verify criminals, despite the best of intentions: recording, classifying and querying every individual is a prerequisite for the system to work.

Accuracy rates of facial recognition algorithms are particularly low for minorities, women and children, as multiple studies across the world have demonstrated. Using such technology in a criminal justice system where vulnerable groups are over-represented makes them susceptible to false positives (being wrongly identified as criminals). Image recognition is an extremely difficult task, and systems make significant errors even in laboratory settings. Deploying them in consequential sectors like law enforcement is ineffective at best, and disastrous at worst.

Facial recognition also makes data protection close to impossible, as it is predicated on collecting publicly available information and analysing it to the point of intimacy. Depending on how images are combined with other data points, it can trigger a seamless system of mass surveillance.
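The false-positive concern above follows from base-rate arithmetic: when the people a system is searching for are rare in the scanned population, even a high-accuracy matcher flags far more innocent people than genuine targets. The sketch below illustrates this with entirely hypothetical numbers (the function name, population size, and error rates are illustrative assumptions, not figures from the AFRS proposal).

```python
# Illustrative base-rate arithmetic with hypothetical numbers: a face-matching
# system that looks accurate on paper still produces mostly false alarms when
# watchlisted individuals are a tiny fraction of the people being scanned.

def match_outcomes(population, prevalence, tpr, fpr):
    """Return (true_positives, false_positives) for a screening system.

    population -- number of faces scanned
    prevalence -- fraction of scanned people genuinely on the watchlist
    tpr        -- true positive rate (share of watchlisted people flagged)
    fpr        -- false positive rate (share of everyone else wrongly flagged)
    """
    targets = population * prevalence      # people actually on the watchlist
    others = population - targets          # everyone else
    true_pos = targets * tpr               # correctly flagged
    false_pos = others * fpr               # innocents wrongly flagged
    return true_pos, false_pos

# Hypothetical scenario: 1,000,000 faces scanned, 100 genuinely wanted
# (0.01%), with a 90% hit rate and a 1% false positive rate.
tp, fp = match_outcomes(1_000_000, 0.0001, 0.90, 0.01)
print(f"true matches: {tp:.0f}, false matches: {fp:.0f}")
# About 90 true matches against roughly 10,000 false ones, so around 99%
# of everyone flagged would be innocent under these assumed rates.
```

The exact numbers matter less than the shape of the result: because the innocent population is so much larger than the watchlist, the false positive rate applies to vastly more people than the true positive rate does.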
The AFRS is being contemplated at a time when India does not have a data protection law. In the absence of safeguards, law enforcement agencies will have a high degree of discretion, which can lead to mission creep. The Personal Data Protection Bill 2018 is yet to come into force, and even if it does, the exceptions contemplated for state agencies are extremely wide.

The notion that sophisticated technology means greater efficiency needs to be critically analysed. A deliberative approach will benefit Indian law enforcement, as police departments around the world are learning that the technology is not as useful in practice as it seems in theory. Police departments in London are under pressure to end the use of facial recognition systems entirely following evidence of discrimination and inefficiency, and San Francisco recently implemented a complete ban on police use of facial recognition. India would do well to learn from their mistakes.

Vidushi Marda is a lawyer and researcher at Article 19, a human rights organisation

 Source : https://www.thehindu.com/todays-paper/tp-opinion/an-invasive-and-inefficient-tool/article28631075.ece
