S.Q. Masood, a social activist from Hyderabad, has filed a petition before the Telangana High Court challenging the deployment of facial recognition technology in Telangana.
In the petition, Masood argues that the deployment of the technology has no legal basis, is unnecessary and disproportionate, and is being carried out without any safeguards to prevent misuse.
According to reports, Hyderabad city in Telangana is one of the most surveilled cities in the world. “Hyderabad is on the brink of becoming a total surveillance city. It is almost impossible to walk down the street without risking exposure to facial recognition,” said Matt Mahmoudi, Amnesty International’s AI and Big Data researcher.
The petition was listed for hearing before a bench led by the Chief Justice of the Telangana High Court. The bench issued notice on the petition after hearing submissions from Manoj Reddy, who represented Masood in court. The case will now be taken up after the court vacation, which ends on 15 January 2022. The New Delhi-based Internet Freedom Foundation (IFF) provided legal support in drafting the petition.
According to IFF, this is India’s first legal challenge to the deployment of facial recognition technology.
On 19 May last year, Masood, who works as a consultant at several NGOs, was stopped by a group of police officers in Hyderabad while returning home from work. The officers directed him to remove his mask so that they could take his photograph, even though the pandemic was ongoing and Hyderabad was reporting a large number of cases. Masood refused, but they captured his picture anyway. Concerned about how his picture might be used, and seeking to learn the legal basis for the officers’ actions, Masood sent a legal notice to the Police Commissioner of Hyderabad with legal support from the Internet Freedom Foundation. He received no answer, but decided to investigate further, concerned that the picture had been taken for facial recognition.
Concerned by the widespread use of facial recognition technology (FRT) by Telangana authorities, Masood filed the petition in the public interest on behalf of residents of Telangana who have unknowingly been subjected to FRT on a daily basis.
“We thank Mr. Masood for allowing us to lend our expertise in this important matter. We have been working extensively on facial recognition technologies through Project Panoptic, which has informed a lot of the work that went into the petition,” read an IFF note.
Global rights watchdogs have raised concerns that extensive surveillance in Telangana is putting human rights at risk.
In addition to CCTV, the watchdogs are concerned that law enforcement’s practice of using tablets to stop, search, and photograph civilians without charge could be used for facial recognition.
“Facial recognition technology can track who you are, where you go, what you do, and who you know. It threatens human rights including the right to privacy and puts some of the most vulnerable in society at risk. The construction of the CCC has chilling consequences for the right to freedom of expression and assembly,” said Quinn McKew, Executive Director at ARTICLE 19, an international human rights organisation, referring to Hyderabad’s Command and Control Centre (CCC).
Amnesty also mentions a study by the Internet Freedom Foundation which found that Telangana state has the highest number of facial recognition technology (FRT) projects in India.
“There is currently no legislation in place to protect the privacy of citizens – facial recognition is a harmful and invasive technology and it is imperative that Indian authorities immediately stop the use of this dangerous technology,” said Anushka Jain, Internet Freedom Foundation’s Associate Counsel for Surveillance & Transparency.
Amnesty pointed out that the authorities in India have a lengthy record of using facial recognition technology in contexts where human rights are at stake, with recent examples including enforcing COVID-19 lockdown measures, identifying voters in municipal elections, and policing protests.
“The rights of Muslims, Dalits, Adivasis, Transgender communities, and historically disadvantaged sections of society are particularly at risk from mass surveillance,” says the rights watchdog.
The research in India marks the latest phase of Amnesty International’s Ban The Scan campaign, following research into surveillance in New York City published earlier this year. The Hyderabad research is in partnership with the Internet Freedom Foundation and ARTICLE 19.
Amnesty International urged authorities to impose a total ban on the state and private sector use, development, production, sale, and export of facial recognition technology for mass surveillance purposes.
In recent years, Telangana has been a test site for increased usage of dangerous facial recognition technologies (FRT) against civilians, according to Amnesty.
Situated in Hyderabad’s Banjara Hills, the CCC will reportedly support the processing of data from up to 600,000 cameras at once, with the possibility of expanding this capacity much further across the region. These cameras can be used in combination with Hyderabad Police’s existing facial recognition software to track individuals.
Amnesty International, IFF and ARTICLE 19, with the help of local volunteers, mapped the locations of visible outdoor CCTV infrastructure in two sampled neighbourhoods in Hyderabad – Kala Pathar and Kishan Bagh. Based on geospatial analysis, they estimated that at least 530,864 and 513,683 square metres, respectively, were covered by CCTV cameras – a remarkable 53.7% and 62.7% of each neighbourhood’s entire area.
The rights group claimed that its Digital Verification Corps discovered footage of dozens of incidents, shared on social media between November 2019 and July 2021, showing Hyderabad Police asking civilians to remove their masks and photographing them in the streets while refusing to explain why. Other cases show police randomly demanding both facial and fingerprint scans from civilians.
Under India’s Identification of Prisoners Act, 1920, police are not permitted to photograph individuals unless they have been arrested or convicted of a crime, nor to share such photographs with other law enforcement agencies.
Amnesty International also said in a press release that it had contacted five companies – IDEMIA, NEC India, Staqu, Vision-Box and INNEFU Labs – for more information regarding their facial recognition-related activities in India, and to request any human rights policies they may have.
Of the companies contacted, only INNEFU responded to the July 2021 letter, stating that “the user agency is not under any obligation to adhere to any terms and conditions of the vendor”, without responding to any of the 14 questions posed by Amnesty International. In another letter responding to Amnesty International about a previous investigation, INNEFU admitted that it did not have a “stated human rights policy”, but said that it was “follow[ing] Indian laws and guidelines”.
Under the UN Guiding Principles on Business and Human Rights, all companies must have a human rights policy in place, and take steps to identify, prevent, mitigate and account for the risks to human rights posed by their operations and any risks they are linked to through their business relationships, products or services.
Facial recognition technology inherently poses a high risk to human rights, and these five vendors have failed to demonstrate that they are adequately addressing and mitigating the risks of providing this technology to government agencies, Amnesty International warns.