Model facial recognition law would ban high-risk use in Australia


Joseph Brookes
Senior Reporter

Australia urgently needs dedicated facial recognition laws to stop the current slide towards a surveillance state and reduce the serious risk of error in high-stakes applications like policing, according to new analysis.

The report from the University of Technology Sydney calls for the Attorney-General’s Department to adopt a model law for facial recognition technology (FRT) that would see a regulator develop technical standards, oversee mandatory human rights risk assessments, and provide advice to developers, deployers and affected individuals.

Both the regulator and individuals affected by an FRT developer or user would have review rights, with determinations possible.

It would be a marked shift in Australia, where no dedicated FRT law exists. The current limited regulation of FRT comes from a mix of privacy and anti-discrimination laws, and some state-level human rights laws.

It is understood the Attorney-General and his department, which would need to draft and pass a bill to make the proposal a reality, have already engaged with the experts behind it at the UTS Human Technology Institute, including former Australian Human Rights Commissioner Ed Santow.

The new law could come through a standalone bill, an amendment to existing privacy laws, or as part of the wider reforms to the Privacy Act.

Attorney-General Mark Dreyfus has signalled FRT may be best addressed through the Privacy Act review, which has been underway for two years and is scheduled to be with government by the end of 2022.

“It’s certainly true that when [current] privacy laws were being drafted the lawmakers didn’t anticipate the massive rise of FRT and so our laws were simply not created to address the kind of problems as well as the positive opportunities that come from facial recognition,” Mr Santow told InnovationAus.com.

Mr Santow joined the UTS project after five years as Australia’s Human Rights Commissioner. His tenure included a landmark report on Human Rights and Technology that called on the government to halt the use of facial recognition and algorithms in important decision-making until adequate protections were in place.

The recommendations were backed by other experts but not taken up by the former Coalition government, which gave no formal response to the Commission’s three-year investigation.

The Coalition’s own attempt at deploying an FRT system for identity matching was rejected by the Parliamentary Joint Committee on Intelligence and Security in 2019, a rare rebuke from the bipartisan committee that included now-Attorney-General Mark Dreyfus.

“That bill was rightly criticised across the political spectrum,” Mr Santow said.

“It was the first time in over 15 years that the PJCIS unanimously agreed that the bill should not go forward. The [PJCIS] report was very clear. They said that we need legislation in this area, and that legislation needs to have stronger privacy and other protections.”

The bill was not redrafted, however, and FRT development and deployment have increased apace.

“We know that facial recognition globally is increasing at an annual rate of about 20 per cent,” Mr Santow said.

“But most uses of facial recognition are not obvious, or they are even actively hidden. In areas like schools, workplaces, when police use facial recognition, even in stores, it’s often something that is not apparent to people who are having the technology used on them.”

Several high-profile incidents have exposed the covert use of FRT at Australian convenience stores, retail outlets, airports, and within prisons. Widespread criticism from consumer, digital rights, and civil society groups has followed, along with renewed calls for more effective laws.

Mr Santow and fellow experts at UTS aren’t waiting. The UTS Human Technology Institute was established last year and began work on a model law that could allow much safer deployment of FRT.

The group is proposing a new risk-based approach to FRT, which wouldn’t outlaw the technology altogether but would impose greater restrictions on use cases where harm to human rights is more likely and more serious.

The model law would require a human rights risk assessment, undertaken through a “facial recognition impact assessment” that considers the specific application’s functionality, spatial context, performance, whether its outputs create a significant effect, and whether affected individuals can give or withhold free and informed consent.

For example, a deployment that conducts facial analysis on individuals in a public space without consent, with limited accuracy, and whose outputs feed automated decisions would be assessed as posing an “extreme” risk to human rights, and would likely be prohibited under the model law outside of special circumstances.

At the other end of the scale, even a limited deployment involving individuals who have given consent, with no automated decision-making, would still be deemed to pose a “moderate” risk to human rights, requiring the developer or deployer to identify and manage the risks.
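To make the tiering concrete, here is a minimal sketch of how such an impact assessment might map a deployment’s characteristics to a risk level. It is purely illustrative: the factor names, the intermediate “high” tier and the decision thresholds are assumptions for this example, not the scoring defined in the UTS report.

```python
from dataclasses import dataclass

# Illustrative only: the factor names, the intermediate "high" tier and the
# decision logic below are assumptions for this sketch, not the scoring
# defined in the UTS report.

@dataclass
class FRTDeployment:
    in_public_space: bool      # spatial context of the application
    informed_consent: bool     # can individuals give or withhold consent?
    high_accuracy: bool        # performance of the system
    automated_decisions: bool  # do outputs create a significant effect?

def assess_risk(d: FRTDeployment) -> str:
    """Map a deployment's characteristics to a hypothetical risk tier."""
    # Non-consensual analysis in public with limited accuracy feeding
    # automated decisions: the combination the article labels "extreme".
    if (d.in_public_space and not d.informed_consent
            and not d.high_accuracy and d.automated_decisions):
        return "extreme"   # likely prohibited outside special circumstances
    # Assumed intermediate tier for deployments lacking consent or making
    # automated decisions (not a tier named in the article).
    if not d.informed_consent or d.automated_decisions:
        return "high"
    # Even consensual, non-automated uses carry baseline obligations.
    return "moderate"      # deployer must still identify and manage risks

# Example: a consent-based kiosk with no automated decision-making still
# lands in the "moderate" tier, as the article describes.
kiosk = FRTDeployment(in_public_space=False, informed_consent=True,
                      high_accuracy=True, automated_decisions=False)
print(assess_risk(kiosk))  # -> moderate
```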

The impact assessment would be published and able to be reviewed by the regulator and affected individuals.

A new or existing regulator, such as the Privacy Commissioner, would be given more resources and greater powers to monitor FRT development and use, verify the risk-assessment process, and provide recourse to affected individuals.

According to the UTS report, the approach would “foster innovation and enable the responsible use of FRT, while protecting against the risks posed to human rights”, and would “incentivise FRT applications where this risk is lower”.

Asked about the proposal, a spokesperson for Attorney-General Mark Dreyfus pointed to the long-running review of Australia’s Privacy Act as an avenue for better protecting sensitive personal information.

“This includes considering what privacy protections should apply to the collection and use of sensitive information using facial recognition technology,” the spokesperson told InnovationAus.com.

“The Attorney-General’s Department is working towards providing a final report on the Privacy Act review to the Government by the end of 2022.”

Mr Santow said the ongoing Optus data breach scandal had underscored the need to urgently address the risks of collecting sensitive information, and he fears the consequences could be even worse if biometric data were compromised.

“We’re talking here about biometric information, kind of the most sensitive personal information that there can be. And so when you create a facial recognition system, you’re essentially creating a honeypot that is very attractive to hackers.”

The Tech Council of Australia, which represents both developers and users of FRT, said it welcomed the release of the UTS proposal for a model law.

“The Tech Council supports well-designed regulation and standards and we welcome the release of the report as an important milestone in defining what an Australian model law FRT framework could look like, and the principles that should guide it,” the group’s chief executive Kate Pounder said in a statement.

“In particular, we strongly support a nationally consistent model of FRT regulation given many applications of FRT will be rolled out nationwide or in multiple states and territories.”

