Facial recognition: How closely is Big Brother watching?

Source: SIFY
Last Updated: Tue, Mar 30th, 2021, 10:40:27 hrs

As always, the intention is “noble”. When the National Crime Records Bureau (NCRB) issued a Request for Proposals (RFP) last year, inviting bids for the creation of a National Automated Facial Recognition System (AFRS), it said the system’s main purposes were to find missing children, spot criminals by collating data from other existing databases, and identify unclaimed corpses.

But here’s the problem—the faces of children change over time, criminals tend to hide their facial features when they are about to commit a crime, and bodies decompose.

This can be surmounted, says the NCRB. The system will provide matches by predicting facial modifications such as “plastic surgery, aged image, bearded faces, make-up, expression, hair-style, glasses, scar marks and tattoos”.

Which raises the question: just how accurate will these “predictions” be, given how many of us share similar features, from hooked noses to large ears?

And just how intrusive is it? No one really knows. The project has a budget of over Rs. 300 crore to create a national database of photographs, and the proposal has been through multiple drafts: the first said the system would automatically collect information from the millions of CCTV cameras installed across the country; a later version reworded this to say the data sources would be “scenes of crime images or videos” (also known as CCTV cameras installed across the country).

Citing crime, particularly against women, as the reason, police departments are multiplying the number of CCTV cameras they operate. The Delhi Police announced it would set up 300,000 more in addition to the 250,000 currently under its control. One of the “sources” for the AFRS will be police department “databases”.

We are already being monitored without our consent. Globally, retailers use their in-store cameras to read the expressions of customers and gain an “insight” into their reactions to products. Well, one can evade that by shopping online—except that we are being monitored there too, and one often gets an alert that the browser is using the mobile phone camera. And soon, we may not even know we are being watched, because cameras don’t have to look like cameras anymore—they could look like earphones.

With the global facial recognition market projected to grow into a $9.6 billion business by 2022, there is no shortage of players. And with governments outsourcing data collection and enlisting retailers who already run their own surveillance, it’s a free-for-all.

Two years ago, Amazon began selling its facial recognition software, Rekognition, to police departments, sparking nationwide fury in the US, only for Andy Jassy, CEO of Amazon Web Services, to defend the sales and insist that any “regulation” was the government’s job.

When a computer decided all black men looked alike in January 2020, leading to the first acknowledged case of wrongful arrest through facial recognition in the US, several Silicon Valley giants, including Amazon, Microsoft and IBM, backtracked. But these giants are small players in the facial recognition software industry, whose actual leaders are non-household names like Vigilant Solutions, Rank One Computing and NEC.

In India, state and central governments have boasted of their efficiency in tracking protesters by using driving licence and voter identity databases. Thousands of arrests were made for “rioting”, seemingly ignoring the fact that we have a constitutional right to peaceful assembly and cannot be considered criminals without legal evidence of having committed a crime.

While hearing the Justice K S Puttaswamy (Retd) case in 2017, the Supreme Court held that privacy is a fundamental right, protected under Article 21 of the Indian Constitution, which guarantees the “right to life and personal liberty”, and that this right extends even to public places. The Information Technology Act, 2000, terms biometric data “sensitive personal data” and sets out rules for the collection, disclosure and sharing of such information.

And yet, we have the Telangana police patting itself on the back for using surveillance to raise conviction rates for criminal violations, and even claiming it was able to track people “suspected” of COVID-19 infection through Artificial Intelligence (AI).

The Central Board of Secondary Education (CBSE) used facial recognition to match admit card photos with the students logging in, purportedly to avoid cheating—but without any additional consent from the examinees or their parents, despite most of the former being minors.

We are being monitored without our consent, and often without our knowledge. But there is little recourse to law, because there are no legal safeguards in place governing the use of facial recognition software.

The Internet Freedom Foundation (IFF) sent a notice to the NCRB, demanding to know how it could introduce a system that violated the Supreme Court verdict. In response, the NCRB sent a vague note citing the Cabinet Note of 2009, which “envisage(s) six specialised solutions”, ignoring the fact that a Cabinet note has no legal standing, let alone enough weight to overrule a Supreme Court verdict.

Our facial features are scanned through a plethora of devices, with and without consent, from transport department databases to social networks to random police checks. 

A few days ago, my car was stopped by the Election Commission of Tamil Nadu for a surprise check. The car and I were photographed without consent, and one of the personnel, who was not wearing a face mask, tried to open the door; it was, thankfully, locked. The police wanted my phone number as well, all this despite having searched my car and found nothing worth suspecting. I had asked to see the identity proofs of the police personnel before I allowed them to search the car, and they seemed surprised, even offended. But with police uniforms available at any costume store, what would stop criminals from posing as law enforcement authorities and collecting data, which a skilled hacker could turn into a precious resource?

The IFF has raised questions about the consequences of false positive and false negative matches. With draconian laws in place that allow detention without prompt production before a court, a personal vendetta could turn into a rights violation. There is enough precedent in other countries.
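
To see why false positives matter at this scale, here is a minimal back-of-the-envelope sketch; the population size and error rate below are illustrative assumptions, not figures from the NCRB proposal or any vendor.

```python
# Expected wrongful flags when screening a large population.
# Both numbers are illustrative assumptions, not NCRB or vendor figures.

population = 1_000_000        # assumed: faces screened against the database
false_positive_rate = 0.001   # assumed: 0.1% false matches per person screened

expected_false_matches = population * false_positive_rate
print(f"Expected wrongful flags: {expected_false_matches:,.0f}")
# Output: Expected wrongful flags: 1,000
```

In other words, even a system that is 99.9 per cent accurate flags a thousand innocent people for every million faces it screens, and each of those people must then prove a negative.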

China deploys AI for a controversial Social Credit System, and allegedly uses facial recognition to track Uighur Muslims and pro-democracy protesters. Four years ago, the Israeli firm Faception claimed its software could identify an inclination to terrorism and other deviant behaviour by reading faces.

How long before our own Big Brother adopts these methods?

Nandini is the author of Invisible Men: Inside India's Transmasculine Networks (2018) and Hitched: The Modern Woman and Arranged Marriage (2013). She tweets @k_nandini. Her website is: www.nandinikrishnan.com
