The Australian government still plans to create a national biometrics database allowing authorities to conduct facial recognition matches, despite increasing worldwide scrutiny on the ethical implications and effectiveness of the technology.
Australian states and territories are continuing to upload information to the not-yet-operational national database, while the legislation underpinning the new facial recognition capability remains stalled in Parliament, having been rejected by the powerful national security committee last year.
The government remains committed to the legislation and is working on a re-draft. It is expected to soon reintroduce the legislation to Parliament.
Digital rights groups in Australia will soon launch a campaign against the use of facial recognition by law enforcement locally and have called for a moratorium on the use of the technology until adequate privacy laws and safeguards are in place.
The use of facial recognition technology by law enforcement has been thrust into the spotlight in recent weeks following protests over police brutality that began in America and have spread around the world.
Tech giants Amazon, IBM and Microsoft have all said they will not sell facial recognition products to law enforcement in the US for the next year, each with their own caveats.
There have been ongoing concerns around the accuracy of facial recognition technology used by police and its inherent racial biases and flaws, along with other privacy and security issues.
In Australia, the Commonwealth has planned for several years to create a national biometrics database with data fed from the states and territories – which add such biometric identifiers as drivers’ licence photos – to be used by various authorities, agencies and departments.
Most states have now agreed to provide biometric information to the database, with Queensland recently joining Victoria, South Australia and Tasmania in uploading its stores to the National Driver Licence Facial Recognition Solution. The remaining states are expected to follow over the next 18 months.
The database would allow these authorities to conduct one-to-one and one-to-many facial recognition matches using biometrics information provided by the states and territories under a COAG agreement.
The database is not yet operational and has not been used for face matching, due to the legislation not having passed Parliament.
The Identity-matching Services Bill was first introduced to Parliament nearly two years ago. It paves the way for the “secure, automated and accountable exchange of identity information between the Commonwealth and state and territory governments”.
But in an extraordinary move, the legislation was knocked back by the Parliamentary Joint Committee on Intelligence and Security, which said a "significant amount of redrafting and not simply amending" was required.
The legislation should be redesigned around privacy, transparency and robust safeguards, and be subject to parliamentary oversight and annual reporting, the committee's report found. The bipartisan committee also criticised the bill over a complete lack of detail.
But the Coalition is still committed to launching this national database, dubbed ominously "The Capability", and is currently working to redraft the bill.
“The Identity-matching Services Bill is currently before Parliament. The PJCIS has indicated its support for the overall objectives of this Identity-matching Services Bill. The government is carefully considering the committee’s report,” a spokesperson for the Department of Home Affairs told InnovationAus.
The states still uploading information to the national database are ignoring the suggestions of the powerful national security committee, Deakin University senior lecturer Dr Monique Mann said.
“While the PJCIS threw out the bill because it was insufficient in offering protections, we’ve seen states move ahead, effectively ignoring those recommendations,” Dr Mann told InnovationAus.
“The committee rejected the bill because of insufficient protections, insufficient transparency and broad powers for the minister,” she said.
“We’ve not seen a redrafted version of the bill, it’s in a bit of a limbo, and it’s very much that the states are exploiting that and just proceeding ahead anyway.”
Last week Microsoft, Amazon and IBM all halted the use of their facial recognition technology by police in the US, following Black Lives Matter protests around the country.
There needs to be a similar pause on the use of this technology by Australian law enforcement until adequate privacy laws are in place, Digital Rights Watch chair Lizzie O'Shea said.
“The call for a moratorium is something we plan to campaign on. We don’t think the technology is trustworthy and we think it is being experimented with by law enforcement,” Ms O’Shea told InnovationAus.
“In a situation with very little accountability and regulation, we think that’s dangerous,” she said.
“It’s good these [big tech] companies are leading on this point, and hopefully it’ll start a bigger reflection on the role of this technology and we’d like to see governments following. There’s no guidance about what laws govern the use of this technology.”
“It’s very clear there’s a vacuum and how it gets filled is a question that will be resolved in the near future. We hope it prioritises human rights.”
There isn’t any real transparency around the use of facial recognition technology by police agencies in Australia, although a number of state authorities have confirmed they do use the technology in some form.
It was recently revealed that the Australian Federal Police and state police in Queensland, Victoria and South Australia had trialled the use of Clearview, a controversial tool allowing law enforcement to upload an image and compare it to the company's huge database of billions of publicly available images scraped from the internet.
“That highlighted that we don’t have an appropriate legal framework governing any of this, and there’s limited transparency related to what law enforcement is doing. This is not a new thing – there’s been a history of tech companies providing tools to law enforcement, and this outsourcing to private companies is a way in which governments can avoid scrutiny and get around some of these protections,” Dr Mann said.
There need to be laws in place that ensure accountability and responsibility around the use of this technology, and around what kind of data is fed to the algorithms, Ms O'Shea said, along with more transparency around its use and clearly defined public spaces where the technology will not be used.
“We want to see some forms of notification around the technology being used, potentially ring fencing some parts of public life from this technology. We should be allowed to go round without being scrutinised and different agencies making use of this technology,” she said.
“We’re always saddled by the fact we don’t have a human rights instrument or respect for human rights as part of our culture of law-making. That is a big problem.”
Australia is particularly vulnerable to the unchecked use of technologies like facial recognition due to a lack of legal privacy rights, Dr Mann said.
“We don’t have fundamental human rights, we don’t have enforceable protections and law enforcement agencies are completely exempt from privacy laws, and there’s no way anyone can challenge this. There’s no governing legal framework for the use of these technologies,” she said.
“We need to start questioning the type of society we want to live in and the role of technology, and the role of private surveillance companies developing this technology for oppressive policing practices.”
There is an opportunity at the moment to bring the public on board as part of these campaigns against the use of facial recognition technology, Ms O’Shea said.
“There is scope to increase campaigning that is built on community engagement with this issue. Most people would be worried about the use of this technology without scrutiny and oversight,” Ms O’Shea said.
“They may not be concerned about facial recognition in everyday use like on their phones, but when that’s in the hands of government then there are clearly problems,” she said.
Human Rights Commissioner Edward Santow has also called for a moratorium on the use of facial recognition technology in decision-making that has a legal effect for individuals until an appropriate legal framework is in place.
“What we’ve seen with new technologies again and again is that they can simultaneously make our world more inclusive and also have the opposite effect. What we’re really focused on with that part is things like human rights by design,” Mr Santow said.