Facial recognition technology: A human rights perspective

Dr Birgit Schippers, Senior Lecturer in Politics at St Mary’s University College Belfast, speaks to agendaNi about the use of facial recognition technology in law enforcement and the debate it has sparked, from the profiling of minority communities to fears around the sharing of biometric data between public bodies and private corporations.

“Facial recognition technology (FRT) is a biometric technology that records and analyses our facial images. It uses biometric data, our faces, in a wide variety of contexts,” Schippers explains. “For example, our faces can work like a password: we have phones that can be unlocked with facial recognition technology. It is used in the home and increasingly in businesses such as shopping centres, bars, and leisure centres. It is also used in the provision of public services or in policing and border control. But while a password can be changed fairly easily, our biometric data is highly personal and sensitive. That’s what makes this issue so pertinent.”

Facial images can be collected and analysed live, in near-real time, often without our knowledge or consent, which, as Schippers contends, is a problem. FRT also works with historical images held, for example, by private corporations or state agencies. Once these images have been captured and processed, they can be used to match individuals against an image database, such as passport or driving licence pictures, police watchlists, or even images scraped from our social media accounts.

These days, FRT is rarely out of the news. For example, attempts have been made recently to introduce FRT into schools. The technology is already prevalent in the Chinese education system and increasingly widespread in the United States, and schools in France and Sweden have sought to introduce it. A secondary school in Kilkenny was forced to scrap a scheme intending to use FRT to monitor attendance when it was informed by the Irish Council for Civil Liberties that such a scheme may be in breach of European data protection regulations.

The analysis of facial expressions can also be used for emotion recognition. “Technology that claims to establish our emotional states from our facial expression is becoming popular in China and its deployment is deeply worrying,” Schippers says.

It is in the human rights arena, and on FRT’s impact on human rights, that Schippers’s work focuses. “Human rights are interrelated and interdependent. This makes it impossible to identify a single right that is not affected by the deployment of FRT,” she says. “The debate has focused on the right to privacy, but this technology impacts across the whole spectrum of civil, political, social and economic rights. For example, there are persistent concerns over its accuracy rates, which are lower when analysing darker-skinned and female faces. This gender and racial bias stems from the predominantly white, Asian, and male images that populate the data training sets used to develop FRT.”

Worries over the gender and racial bias of FRT are compounded if public agencies, especially those working in sensitive areas such as policing, are unduly influenced by this technology. Such influence runs contrary to the right to human review of decisions that affect our lives, and to the right not to be subjected to algorithmic decision-making systems. “It would be hasty and ill-considered for any police force to use a technology that produces biased outcomes, and whose efficacy and accuracy remains an issue of concern,” Schippers argues. “The discriminatory effects of FRT could undermine trust in the fairness of policing, and will leave the police vulnerable to accusations of bias and discrimination.”

The invasive impact of FRT remains a key worry. “Deployed in public spaces, FRT leads us on a path towards automated blanket surveillance, very often without our knowledge. It is a blunt instrument that violates the principles of necessity and proportionality; as the UK Information Commissioner’s Office has argued, the collection and processing of data with FRT must be targeted, specific, intelligence-led, and time-limited. Also, the use of live FRT by the police in spaces frequented by the public denies citizens the opportunity to give informed consent to the recording and analysing of their images.”

The capacity for mass surveillance also threatens our democratic rights to freedom of expression and assembly, Schippers says. “FRT has a chilling effect on our democratic culture and on our rights as citizens in a democratic society. Blanket surveillance facilitated by FRT can deter individuals from attending public events, and it can stifle participation in political protests and campaigns for social change. We already have evidence that suggests that citizens, especially people from Black and ethnic minority backgrounds, and younger people, are reluctant to participate in public protests if their faces are recorded and analysed by FRT,” she says.

While FRT enables mass surveillance, it can also be used selectively. FRT-enabled police body cameras and handheld mobile devices with an FRT app are examples Schippers cites to illustrate how biometric data “can be used to target specific communities, very often ethnic minority communities who are already over-policed and whose relationship with the police is already fraught. I do not see FRT as a useful tool for policing, quite the opposite,” she says. “It raises the spectre of enhanced racial profiling at the street level.”

There are additional fears over data security and data breaches, but a particular concern relates to the sharing of our facial images between public agencies and private corporations. “We simply don’t know what happens to our images, where they are stored, for how long, and why, or who has access to them,” Schippers says. “But there are also worries over the sharing of sensitive biometric data between public bodies and private corporations. Evidence from the United States attests to the close relationship between the business interests of private corporations that develop and sell FRT, and its use by state agencies, such as the controversial US Immigration and Customs Enforcement.” The sharing of biometric data happens on this side of the Atlantic, too. “Last summer, the Metropolitan Police admitted that it gave images to the privately owned but publicly accessible King’s Cross Estate between 2016 and 2018.”

However, opposition to the use of FRT by police forces, especially in the context of the Black Lives Matter movement, is gathering momentum, and this trend is shaping the decisions of big tech corporations. IBM has just announced its departure from the FRT business; Microsoft has halted the sale of its FRT to police forces; and Amazon has introduced a one-year moratorium on the sale of FRT to the police.

The Scottish Parliament Justice Sub-Committee on Policing has recently rejected plans to invest in live FRT, a decision that Schippers welcomes. “FRT is often presented as a useful policing tool, but I do not think that it is particularly helpful to the police. I have serious concerns that it will undermine relationships between the police and the communities they are meant to serve.”

Human rights and civil liberties activists are warning that the development and deployment of FRT should be halted until its compliance with national and international human rights law can be fully ascertained. As Schippers explains, there is a growing movement calling for a moratorium on the use of FRT. For example, a UK Private Members’ Bill, which seeks to prohibit the use of automated FRT in public places, is currently being debated in Westminster. A similar bill, which aims to prohibit biometric surveillance by the federal government, has been introduced in the United States Congress. Schippers also says that there is a need for a “much wider ranging public conversation and debate” on the topic than currently exists. “What we need is sufficient and independently verifiable evidence that this technology is fully compliant with national and international human rights and equality obligations and standards,” she concludes.
