NSW Premier Gladys Berejiklian has announced that the state will trial a new home quarantine program in the coming weeks. This is a step long-awaited by many, and Berejiklian’s ambitious program to reopen NSW has been widely covered.
But one crucial aspect of the future of home quarantine has avoided the media spotlight.
Individuals participating in the home quarantine trial will be required to download an app that checks in on their compliance using geolocation and facial recognition software. This decision comes at a time of increasing concern over AI technology and criticism of insufficient regulation.
Following South Australia’s example, NSW trial participants must respond to a check-in request with a selfie within fifteen minutes. In Western Australia’s home quarantine program, this window shrinks to just five minutes. If the software doesn’t verify the check-in, police will be sent to confirm the individual’s whereabouts.
Genvis Pty Ltd – a Perth-based startup – states on its website that both NSW and Victoria are trialling its facial recognition products. However, neither government has publicised these trials, and their responses to questions have been vague, pointing reporters to other departments and avoiding direct answers.
Critics have warned that facial recognition technology can be both inaccurate and biased. AI experts like Professor Walsh of UNSW have pointed out that facial recognition regularly fails to accurately identify people of colour, and is more likely to misidentify women than men.
Racial misidentification has been a problem in facial recognition for years. An MIT study published in 2018 reported that major facial recognition systems were up to 99% accurate when identifying white men, but that error rates climbed to as much as 35% when assessing darker-skinned women. This points to a worrying potential for the app to deepen discrimination already felt under existing COVID measures.
Furthermore, the absence of specific legislation around these home quarantine apps leaves many worried that law enforcement could go on to use people’s data for other purposes. This lack of safeguards on invasive cyber-related investigative tools has become a pattern over the past few years.
South Australia launched a similar home quarantine app policy in late August, first for interstate travellers and now for returning overseas ADF personnel. SA Premier Steven Marshall claimed the government wouldn’t be storing any of the information provided on the app, and that it would be subject to stringent security. But there are no legislated safeguards in place to back up these claims.
This is especially concerning considering the precedent of misuse of COVID-related data set by Australian police. Despite vague promises that COVID check-in data would not be accessed by police, state police have already accessed QR code data on at least six separate occasions for unrelated investigations. This included cases in Queensland, Western Australia and Victoria.
In all these cases data was accessed without a warrant, prompting the Australian Information Commissioner to call for a ban on law enforcement access to check-in data outside of contact tracing purposes.
The quarantine app’s privacy policy asserts that information “will be destroyed at the conclusion of the COVID-19 pandemic” unless required to enforce a breach of the state’s health directions.
I’m not an epidemiologist or a government official, but I suspect the ‘end’ of the pandemic will not come for a very long time. Nor, when it does, will it be a definitive moment. And that’s without considering the vague caveat about enforcing breaches of health directions.
A home quarantine app does have its benefits. It allows state governments to expand their quarantine capacity, eases the burden on compliance officers and is far more cost-effective.
But this doesn’t cancel out growing concerns over AI-based facial recognition technologies. Just last week, UN Human Rights Chief Michelle Bachelet called for states to place moratoriums on the use of AI systems until adequate safeguards could be put into place.
This followed the recently published OHCHR report, which identified several problems with the effects of current AI systems on human rights, including states’ failure to carry out due diligence and opaque data collection and storage practices.
The Australian Human Rights Commission also called for a moratorium on the use of facial recognition in policing back in May.
We all want to get back to some semblance of ‘normalcy’ as soon as possible, and that involves opening borders and innovating on quarantine measures. But too many questions around the home quarantine app remain unaddressed by firm, specific legislation.
As Bachelet summed up, “We cannot afford to continue playing catch-up regarding AI – allowing its use with limited or no boundaries or oversight and dealing with the almost inevitable human rights consequences after the fact.”
Cover photo by Jona Novak on Unsplash