Chennai police’s use of facial recognition technology can do more harm than good

Facial recognition is being widely criticised around the world. Photo: iStock

India, unfortunately, does not have a privacy protection law of the kind many democracies do. Digital technologies are double-edged swords: the way data and digital tools are designed and used can easily become intrusive and erode human rights.

Traditional police watch-towers and other surveillance methods had only a limited reach and were primarily based on immediate needs. But in an age when high-tech cubicles in police headquarters can replace the watch-towers and conventional intelligence gathering, a democracy can degenerate into a virtual surveillance society where surveillance gets universalised.

In India, the Chennai Police are at the forefront of this highly controversial venture. They have been using FACETAGR, an AI-based face-recognition app that can be deployed on CCTV cameras as well as smartphones. Faces in a crowd can be compared in real time with the facial features of criminals and other wanted persons available in the police database.

However, to track the movements of criminals round the clock, the database would have to include not only ‘criminals’ but almost all citizens, so that the identity and address of a suspect caught on a CCTV camera or a smartphone can be instantaneously traced. In the bargain, the facial features of every Indian would get reduced to “data” in the hands of the police.
