Facial Recognition Expands In India While Concerns Grow Over Lack Of Law Protecting Data

Across some of India’s busiest airports and train stations, facial recognition technology (FRT) software systems are being hooked up by multiple state-owned agencies to a progressively spreading network of closed-circuit cameras, panning through databases of photos to identify people on a real-time basis. The systems seek to achieve a range of objectives: better identification of criminals, law enforcement use at railway stations, passenger check-ins at airports, biometric attendance at companies, and even student authentication mechanisms.

The growing list of users of this technology, which started with the Home Ministry’s National Crime Records Bureau (NCRB) and various police forces, now includes the Airports Authority of India, the Indian Railways, public sector utilities, and the state-owned agency mandated to issue a unique identity to all residents of India. FRT software vendors include both domestic firms and global companies. To enhance safety and security, various authorities have installed CCTV cameras in public places. However, once a database of images is consolidated, feeding that data into facial recognition technology shifts the goalposts for citizens in terms of privacy, and experts have called for data privacy laws. FRT systems are in the process of being deployed at airports in Kolkata, Varanasi, Pune, Vijayawada, Bengaluru, and Hyderabad as part of a trial under the Ministry of Civil Aviation’s Digi Yatra initiative.

For four of these airports — Kolkata, Varanasi, Pune and Vijayawada — that are managed by the Airports Authority of India (AAI), Japanese electronics company NEC has been roped in for the implementation. The project is expected to start by the end of this year. AAI said it is currently testing the solution at Varanasi airport. “The solution is designed as per prevailing industry standards with respect to data security & privacy. The consent of the user is taken before the biometrics are captured as part of the enrolment process to Digi Yatra program,” an AAI spokesperson said. As part of a broader Indian Railways plan to install facial recognition tech at railway stations to “identify criminals”, Western Railway has commissioned 470 video cameras featuring real-time FRT developed by the Russian video analytics firm NtechLab, which has been certified by the Research Designs and Standards Organisation (RDSO), a technical adviser and consultant to the Indian Railways.

The camera system, which is said to ensure simultaneous recognition of up to 50 people in a single frame, will be used on the busiest section of the network. The video analytics system can be used to “shape strategy” by counting passenger traffic on the network at any given time, alongside the stated objective of “identifying criminals” and “searching for missing persons”, according to the systems vendor. “Our video analytics technology employs high-precision, real-time face recognition mode, in the video stream. Images are compared with a database of wanted individuals. If there is a match, it notifies law enforcement immediately. The entire process, from the appearance of the person in front of a camera to law enforcement receiving a signal, takes less than three seconds. This enables a fast response to situations as they develop,” according to Andrei Telenkov, CEO of NtechLab.

The NCRB, which compiles crime statistics and maintains a database, is deploying “an automatic FRT system” aimed at facilitating “better identification of criminals, unidentified dead bodies & missing/found children and persons”. The Home Ministry has said that the automatic FRT system will use “police records and will be accessible only to Law Enforcement Agencies”. However, in March 2018, the Delhi Police, which comes under the Home Ministry, acquired automated facial recognition software as a tool to identify lost boys and girls by matching photos. The data from that exercise are learnt to have been subsequently fed into the automated facial recognition system to identify people who repeatedly turned up at protests, and who were photographed during the riots of last year. The software deployed by Delhi Police is learnt to have been supplied by the Delhi-based tech company Innefu Lab, which describes itself as a security, analytics, and intelligence firm. The company lists Delhi Police as a client on its website, in addition to “more than a dozen LEA departments” where its solutions have been deployed.

In December 2018, Uttar Pradesh Police deployed software called Trinetra, developed by Gurgaon-based company Staqu, to “zero in on the criminal” in a quick and targeted manner using techniques such as facial recognition and biometric record analysis. The database at the time was created using criminal records of the state police, the prisons department, and the Government Railway Police. Besides law-enforcement agencies, utilities too are leveraging the technology. State-owned NTPC Ltd has started implementing FRT alongside biometrics to capture the attendance of employees. As per NTPC’s policy, consent of employees “shall not be” required for implementation of FRT. A red flag that has been raised is that the extensive use of FRT systems is taking place in the absence of data protection laws that would mandate necessary safeguards in the collection and storage of user data.

This is especially significant because other government agencies planning to deploy FRT systems include those with a much wider ambit — such as the Unique Identification Authority of India (UIDAI), which is developing the Aadhaar-based Face Authentication in Proof of Concept (PoC) phase to supplement authentication mechanisms in addition to biometric and iris-based authentication procedures. Also, the Central Board of Secondary Education (CBSE) is using facial recognition for one-to-one face matching as one of the authentication mechanisms for issuing digital marksheets to students. The Ministry of Education has informed Parliament that there is no collection or storage of biometric facial data, and the use of the application is based on the consent of the individual. A government official involved in the exercise said FRT is “distinct” from face authentication mechanisms being used by CBSE for digital marksheets. Apart from the fact that these systems are currently operating in a legal vacuum given that India does not yet have specific laws with regard to FRT and personal data protection, experts have also flagged the issue of lack of informed consent.

While individuals in a CCTV-surveilled area may be aware they are under surveillance, the use of images gathered from CCTV networks in conjunction with FRT would mean their images will be stored for longer, if not permanently. “This data will also be used to extract particular data points such as the facial features and other biometrics, which the individual has not consented to sharing when entering a CCTV-surveilled zone, and these data points can be used to track future movements of the person. Therefore, integration of FRT with a network of CCTV cameras would make real time surveillance extremely easy,” the non-profit Internet Freedom Foundation wrote in a blog post on surveillance-related privacy concerns. Footage collected through CCTVs is governed by rules and regulations laid down by various states and local law enforcement authorities, which cover aspects such as how long the footage is stored and the uses to which it is put.

However, for all CCTV cameras, privacy is governed by provisions in The Information Technology Act, 2000, which prescribes “punishment for violation of privacy” for any person who “intentionally or knowingly captures, publishes or transmits the image of a private area of any person without his or her consent, under circumstances violating the privacy of that person”. The Indian Express reached out by email to the Railway Board, NTPC, and the Ministry of Education requesting comments for this report, but got no responses.

(Courtesy: The Indian Express)
