New Delhi: Digital rights experts have warned that a secretive Indian police project to monitor women’s facial expressions through facial recognition technology, billed as a way to protect women from street harassment, violates privacy.
According to the report, Commissioner of Police D.K. Thakur said that in Lucknow, about 500 km from New Delhi, police had identified about 200 harassment hotspots that women frequent and where most complaints are reported.
“We will install five artificial intelligence cameras that will send alerts to the nearest police station,” he said.
“These cameras will be activated as soon as they see the expression of distress on women’s faces,” he told reporters.
India is rapidly rolling out facial recognition technology at airports, railway stations and cafes, with plans for nationwide systems to modernize its police force, gather intelligence and identify criminals.
However, technology and privacy experts say the benefits are unclear and warn that the systems could violate people’s privacy or lead to greater surveillance.
Anushka Jain, an associate lawyer at the Internet Freedom Foundation, a non-profit digital rights organization, said the technology had not yet been tested, adding that increasing the number of police patrols could be a better solution.
“How the technology works, how the data is stored and who can access that data has not been fully explained,” she said.
According to official data, India is one of the most dangerous countries in the world for women, with a rape reported every 15 minutes.
Uttar Pradesh, whose capital is Lucknow, was the least safe state, recording the highest number of crimes against women in 2019.
Roop Rekha Verma, a women’s rights activist in Lucknow, said the police often turn away women who come to lodge complaints, or fail to act on them.
“And they want us to believe that they will take action based on our facial expressions,” she said.