Facial Recognition Technology (FRT) is used today in law enforcement, border control, retail, mobile technology, and in banking and finance. It helps us unlock our phones, board planes, identify stalkers and shoplifters, and gain admission to schools and workplaces.
The technology matches an image or video footage against images stored in a database. It can be used for verification (your phone or passport), identification (a search of CCTV footage for suspects, crime victims or missing persons), or for discerning someone’s emotional state (aggression that might escalate, or driver fatigue that might cause a serious accident).
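In broad terms, modern systems convert each face into a numeric "embedding" and compare embeddings by similarity: verification is a one-to-one comparison, identification a one-to-many search. The sketch below is a simplified illustration of that distinction, not a description of any particular product; the toy vectors stand in for the output of a real face-embedding model, and the threshold value is an arbitrary assumption.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face embeddings (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, enrolled: np.ndarray, threshold: float = 0.8) -> bool:
    """Verification (1:1): does the probe image match one enrolled identity?"""
    return cosine_similarity(probe, enrolled) >= threshold

def identify(probe, gallery: dict, threshold: float = 0.8):
    """Identification (1:N): search a gallery of identities for the best match."""
    scores = {name: cosine_similarity(probe, emb) for name, emb in gallery.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else None

# Toy embeddings standing in for real model outputs
alice = np.array([0.9, 0.1, 0.2])
alice_probe = np.array([0.88, 0.12, 0.21])  # a second photo of the same person
bob = np.array([0.1, 0.9, 0.3])

print(verify(alice_probe, alice))                            # True
print(identify(alice_probe, {"alice": alice, "bob": bob}))   # alice
```

The choice of threshold is what drives the accuracy trade-offs discussed below: set it too low and the system misidentifies strangers as a match; set it too high and it fails to recognise genuine matches.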
In 2022 the facial recognition market was worth approximately $5 billion and was expected to double by 2025.
Advocates of facial recognition technologies, says Dr Rita Matulionyte, point to the way the technology can make transactions smoother and faster and to the many ways it improves community safety through surveillance and policing.
Critics note its impact on privacy, the fallout from data breaches, and its inaccuracies when it comes to identifying members of disadvantaged groups: black people, women, individuals from ethnic minorities, members of the LGBTI community.
“FRT is more likely to misidentify individuals from these groups, which can have far-reaching effects and see people wrongfully accused,” says Dr Matulionyte.
Alarm bells have been ringing. A recent Choice investigation revealed the extent to which facial recognition technology is being used at concert venues and sporting events, often without consumers’ knowledge or consent.
Department stores are rethinking their in-store FRT. Kmart and Bunnings, which installed FRT to protect customers and reduce theft, halted its use last year ahead of an investigation by the Office of the Australian Information Commissioner (OAIC) into whether it complied with privacy laws. The findings are pending.
Citizen watch
“FRT uses images of potentially millions of people who have not agreed to be scanned and have no say in how their images are used,” says Dr Matulionyte.
If uncontrolled, FRT installed in public spaces could lead to the kind of widespread public surveillance already established in countries like China. Remember when the city of Suzhou used the technology to publicly shame seven people who left their homes in their pyjamas?
“In theory, FRT allows government and private sectors to constantly monitor what citizens are doing, where they are going, what they are buying, and how they spend their time.”
In her paper, Increasing Transparency around Facial Recognition Technologies in Law Enforcement: Towards a Model Framework, Dr Matulionyte notes the lack of transparency on how government and the private sector are using facial recognition technologies. Australian Federal Police initially denied trialling this technology before it was publicly confirmed.
The more informed we are about the technology, Dr Matulionyte says, the better placed we are to decide how, and whether, to use it.
“When setting up our mobile phones, we are free to choose whether to use facial recognition at all or to opt for another identity verification method,” Dr Matulionyte says.
“Even then, we often do not know for what exact purposes the mobile phone company is using our image, with whom it is being shared, whether it is stored securely and what happens if our image is stolen.”
The databases that FRT feeds into are at risk of being hacked, she says. “Images of individuals and their associated data can be leaked, their identities stolen.”
Over the past 12 months, Apple, Meta, Twitter, Optus, Medibank and Latitude have disclosed cybersecurity attacks.
FRT used and abused
Information collected using facial recognition technologies can be misused in many ways, says Dr Matulionyte.
“Data breaches and identity theft are just some of them. Stolen images and personal details can be used by criminals to register new bank accounts and conduct transactions in your name. If a bank uses a face recognition system to verify your identity, criminals could use illegally acquired images to [remotely] convince the bank that it is you doing the banking.
“Retailers might use FRT not only for security purposes but also to track individual behaviour, create profiles about their customers and then use targeted advertising. If you have just been shopping for watches, for example, do not be surprised if screens in the shopping mall show advertisements for them when you approach.”
In some countries, customers can pay for their goods by simply looking at a camera that recognises their face and matches it with an image and bank details provided by the customer.
FRT is also affecting our right to demonstrate. Civil society organisations around the world have complained that police are using it to monitor gatherings, record protesters’ faces and identify individuals. Such use of FRT is likely to have a chilling effect on peaceful protest.
A variety of laws around the world have been proposed to control and regulate the use of FRT. For instance, the proposed EU AI Act bans certain uses of facial recognition, especially by law enforcement.
In 2021 the Australian Human Rights Commission proposed a moratorium on facial recognition technologies until their risks had been properly addressed through legislation. Last year the Human Technology Institute proposed a Model Law on how the technology could be regulated. The government is currently consulting the public on how AI technologies, including facial recognition, should be regulated, but has yet to act in the field.
Researchers have called for clear legal rules on which FRT uses should be allowed and which prohibited. Many favour informed consent. Evidence-based research and broad stakeholder discussion are needed to define the future of FRT in our societies and the rules that govern it.
Dr Rita Matulionyte is an international expert in intellectual property and information technology law, with a recent focus on legal regulation and governance of Artificial Intelligence technologies.
She is the author of Facial Recognition Technology in the Modern State to be published later this year by Cambridge University Press. Dr Matulionyte is an Affiliate of the ARC Centre of Excellence for Automated Decision-Making & Society at Macquarie University.