The facial recognition and geo-location technology being used in home quarantine trials lacks robust privacy protections, digital rights groups have told state and federal health ministers, warning the data collected needs strong safeguards like those legislated for the COVIDSafe app.
In an open letter to Australia’s health ministers on Thursday, the Human Rights Law Centre and Digital Rights Watch called for stronger privacy protections for the “extreme” technology being used to check if people are remaining in their homes during quarantine.
South Australia is currently trialling a home quarantine system as a cheaper alternative to the hotel system used throughout the pandemic. It uses a smartphone app with facial recognition to verify a user’s identity and location.
Under the South Australian scheme, people quarantining at home are contacted at random times and required to prove their identity and location within 15 minutes through the app’s geo-location and facial recognition. If an individual does not do this, they are visited by SA Police.
New South Wales, Victoria and Western Australia have announced similar home quarantine trials which rely on facial recognition, and Prime Minister Scott Morrison has backed the approach for wider expansion.
But there are concerns about the privacy risks and security of the data being collected, which in some cases can be held until the “conclusion” of the pandemic.
Rights groups flagged the risks early on and have now written to all state, territory and federal health ministers, urging them to introduce stronger protections.
The letter expresses concern with the “extreme measure” of using facial recognition technologies, which Australian and international human rights groups have previously said should not be used without a strong regulatory framework because of access, accuracy and racial bias problems.
“In Australia, there is no such regulation to ensure that the use of facial recognition technology is necessary, proportionate, and protects human rights in its application,” the letter states.
“Current facial recognition technology has also been shown to exhibit gender and racial biases.”
The group acknowledges the need to transition to home quarantine, and that technology has a role to play in the system, but says it needs to come with safeguards modelled on the legislated protections of the federal COVIDSafe contact tracing app.
The national app was introduced with legislation which prevents anyone other than state health authorities from accessing the data. Health authorities can only access the data with permission and only use it for contact tracing purposes.
Several of the state-based check-in apps that followed lack similar legislated protections, leaving the data open to access by non-health officials, including police, who have obtained check-in information collected by the apps for investigations unrelated to COVID-19.
“Significant effort went into ensuring that the legislation governing the COVIDSafe app had a strong focus on privacy. This should be the baseline for any technological approach to managing the public health response to COVID-19,” said Digital Rights Watch program lead Samantha Floreani.
“The information collected by the home quarantine app, as well as that collected via QR ‘check ins’, is no less sensitive than that which was to be collected by the COVIDSafe app. Individuals should be able to trust that the personal information they are providing will only be used to support the public health response to COVID-19, and nothing else.”
The groups recommend that personal information be collected, used and disclosed only to manage home quarantine, and only to the extent required to do so.
They also recommend that data be held no longer than necessary to meet specific public health requirements, that all biometric data be destroyed after the quarantine period, and that it not be linked to other datasets.
The rights groups also ask that any home quarantine apps perform identity and location checks on-device, rather than having the verification take place remotely.
“This would be a more secure and privacy-enhancing approach,” the letter said.
“It would mean that facial recognition technology could still be used to meet the need for identity verification, but the data would remain on an individual’s device, rather than their biometric data being transferred and stored in a centralised database.”
Governments should also commit to regulate the use of facial recognition and other biometric technologies with “robust human rights safeguards”, the groups said, repeating calls made by the Australian Human Rights Commission.
Do you know more? Contact James Riley via Email.