Technology lessons of COVIDSafe must be learned


OPINION: As the government turns its thinking to a vaccination passport or similar, it would do well to learn some of the lessons from COVIDSafe.

COVIDSafe illustrates the need for a good understanding of both policy intent and how the technology works.

Aimed at ‘help[ing] assist health officials understand and contain the spread’ of COVID-19, the app is more than a benign study assistant.

It uses the Bluetooth functionality of people’s smartphones as an incomplete proxy for the distance between potential carriers and those around them. The collected data yields an approximate, near real-time social network.
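Why is Bluetooth only an incomplete proxy? Received signal strength (RSSI) falls off with distance, but bodies, walls, and pockets distort it badly. COVIDSafe’s actual calibration values are not public; the sketch below uses the standard log-distance path-loss model with assumed parameters purely to illustrate how coarse such an estimate is.

```python
def estimate_distance(rssi_dbm: float,
                      tx_power_dbm: float = -59.0,
                      path_loss_exponent: float = 2.0) -> float:
    """Rough distance estimate (metres) from a Bluetooth RSSI reading,
    using the log-distance path-loss model.

    tx_power_dbm is the assumed RSSI at 1 metre (a typical calibration
    value, not COVIDSafe's); path_loss_exponent ~2 models free space.
    Real readings are noisy, so the result can easily be off by a
    factor of two or more.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))


# A reading 10 dB below the 1-metre calibration point suggests
# roughly 3 metres under these assumptions.
print(round(estimate_distance(-69.0), 2))
```

Contact-tracing apps therefore bucket such estimates into broad bands ("near" vs "far") rather than trusting any single number.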


That can be both helpful to understanding the spread of infection and highly invasive of privacy. Like all such data, stripped of context, it can be highly misleading and potentially open to abuse.

So, kudos to the government, which realised it needed strong privacy provisions around COVIDSafe data.

However, it’s not enough to do good on the legislative side alone: government needs to follow through on the technology and implementation.

A fundamental lesson of running a technology shop is that most apparently technical faults aren’t technical in nature but organisational.

A ‘technical failure’, for example, may expose how after a reorganisation, no-one had assumed responsibility for backing up key systems, or that a long-departed contractor had hard-coded passwords.

When these events occur, best practice in a mature technology organisation is to run an all-hands retrospective, after-action review, or blameless post-mortem to understand what happened, to identify systemic issues and a way ahead.

Such exercises are not to find scapegoats but to ask ‘why’ until a root cause is found.

Typically, it takes several iterations, digging progressively deeper. And it requires a healthy, open, informed, constructive, and respectful culture. After all, technology done well is hard; the technology-savvy organisation will seek opportunities to improve.

The report by independent researchers Richard Nelson, Vanessa Teague, Jim Mussared and Geoffrey Huntley probably offers the best insight into COVIDSafe’s technical, privacy and security issues.

The organisational issues can only be inferred. For example: the hurried implementation that led to privacy oversights; the prioritisation of user-interface changes over privacy and security fixes; the Digital Transformation Agency’s handling of security advisories; and persistence with a notification protocol that required workarounds, which in turn introduced inconsistencies.

Without digging deeper, root causes can only be guessed at.

What is clear, however, is that the government’s technology competence could be improved, at both the technical and organisational level. As policy is increasingly shaped and implemented through technology, that improvement is essential.

First, people. Twenty years of outsourcing, combined with a continued erosion of public service knowledge and rapid technological change, makes it hard to argue, aside from some niche areas, that the government is an informed judge of technology.

But more ‘techies’ alone won’t help much. Technologists need to be exposed to the complexity of policy and delivery; policy and program managers need to understand the nature, opportunities, constraints, and weaknesses of technology.

Ministers, too, have a responsibility to be much more familiar with technology than they are now. They need to learn to avoid optimism bias, be wary of vendor promises, and be willing to listen to the practicalities of complex design and implementation.

Second, process. Given the fusing of policy, programs and technology, government needs an appropriate means of oversight—one that has a good grasp of technology and the economic, societal, and national security implications in design, implementation, and operation.

That role cannot be outsourced to vendors and consultancies, or to the national security community, which rightly has a singular focus but typically lacks an empathetic citizen perspective.

The government may consider its own lab for the express purpose of understanding and testing technologies, including hardware and algorithms.

To understand technology well, one must build and run it. There is a precedent from the days when we took such technical expertise seriously: Lucas Heights was built to ensure Australia retained expertise in the nuclear fuel cycle.

Technology is an integral part of people’s lives, society, the economy, and government. It’s going to have a continuing role in how we live with COVID.

Best that government takes it sufficiently seriously to get it right, rather than simply acting out of political expediency.

Dr Lesley Seebeck is an Honorary Professor at the Australian National University and the former chief executive of the ANU Cyber Institute. She has worked in senior roles across government including as Chief Investment and Advisory Officer at the Digital Transformation Agency.


3 Comments
  1. Linden 4 weeks ago

    Having worked on Australia’s first eHealth integration (BlueBook app for NSW Health), I can tell you that the companies and people charged to lead such projects are often very out of their depth. That project had the former Head of NSW Health on their board as a director (perhaps how the contract was achieved?) yet the project was entirely unaware of the risks associated with its work – it did not know that it was integrating with the official eHealth system. I imagine a tender process like this: oh you’re a big business with a reputable brand name for technology stuff, no need to check on your ability when it comes to mobile app development and security.
So fast-track to 2020, millions of taxpayer dollars being spent on half-baked solutions. COVIDSafe did not align with the global effort to provide APIs (the Exposure Notifications API) within mobile platforms to facilitate secure and private tracking via Bluetooth, and it differed from other nations in not involving a mixture of experts from industry, diluting bias, but rather chose AFAIK a single vendor who had very little experience in mobile app development. The initial rollout is proof that experienced mobile developers were either not on the team or being ignored. It also failed to meet various government software standards, e.g. Accessibility, Copyright abuse, etc. Yeah it abused copyright, many Australian apps do. Easy way to differentiate between good vs useless vendors BTW.

  2. Laurie Patton 1 month ago

    It’s not so much that a rushed product had problems, it’s that the bureaucrats didn’t try hard enough to fix it and the politicians didn’t make them.

  3. Rod 1 month ago

The other fundamental reason for not picking up enough cases is that the design based on health advice at the time required a contact to be recorded after 15 minutes of close contact. It didn’t allow for location-based tracking or the “fleeting” transmission being identified with the Delta strain. If the threshold for recording contacts was wound down to one minute or five from 15, more contacts would be identified. There was also a reluctance to include any location-based tracking (which could pick up alignment with exposure sites) and the Big Brother fears associated with that also left the tool with further limitations. Perhaps with Delta and the financial and mental health cost of lockdowns, a willingness to exploit location-based tracking or at least personal hot spot avoidance or notification might emerge.
