Federal government guidance developed to provide public servants with a steer on the use of generative artificial intelligence entrusts agencies as the “arbiter of what is acceptable”, according to Australia’s peak IT body.
As the findings of the robodebt royal commission were released, the Australian Information Industry Association (AIIA) has also called for a more extensive AI framework to become mandatory for agencies involved in citizen-facing service delivery.
On Thursday, the Digital Transformation Agency revealed the interim guidance developed in tandem with the Department of Industry, Science and Resources to supplement agency-specific policies, while a whole-of-government position is developed.
The guidance warns public servants for the first time that the use of generative AI tools like ChatGPT, Bard AI and Bing AI for service delivery and decision-making poses an “unacceptable risk to government”.
“Government information must only be entered into these tools if it has already been made public or would be acceptable to be made public. Classified or sensitive information must not be entered into these tools under any circumstances,” the guidance states.
But the advice remains voluntary, with public servants only asked to deploy the technology “responsibly”; it also recommends that agencies create a mechanism to register and approve user accounts for access to the tools.
AIIA chief executive Simon Bush welcomed the guidance, which he said was pleasing to see given AI’s potential to revolutionise a “large range of government services in coming years as adoption grows”.
“A careful and considered approach to its adoption is sensible and we welcome the iterative and principles-based approach and government should consult with industry on further guidance as the technology matures,” he said.
But Mr Bush said the interim guidance “leaves agencies to make their own decision and the arbiter of what is acceptable”, challenging the government to go much further by mandating a “mature AI framework” for certain activities.
“The AIIA… has been calling on the federal government to develop a detailed and transparent framework that is compulsory for agencies in citizen facing use of AI that involves assessment of eligibility for services and other high-risk areas of AI usage.”
Mr Bush said the AI framework developed by the New South Wales government prior to the arrival of ChatGPT last year – which is now in the process of being refreshed – sets clear expectations for agencies through a “detailed checklist”.
Under the federal government’s interim advice, by contrast, it is “unclear whether individual agencies have their own adoption frameworks and governance models”, with agencies only encouraged to implement agency-level policies.
“More work needs to be done to ensure that mature AI frameworks are not an optional consideration but rather a compulsory check point for safe and principled adoption of AI by government organisations,” he said.
“We have seen with robodebt that safeguards, governance and transparency are required and the AIIA encourages the government to more fully develop a governance framework that is compulsory and not merely guidance.”
The Royal Commission into Robodebt’s final report, released on Friday, raised questions about the ongoing use of automated decision-making in the Australian Public Service, recommending the government consider legislative reforms and a body to monitor and audit automated decision-making.
The commission said the reforms could include “aspects of the Australian AI ethics principles”, which were developed by CSIRO’s digital arm Data61 in 2019 to guide the development, adoption and use of AI systems.
Mr Bush said that “strong guardrails” are needed to “safely adopt AI and build confidence in the community” and that the AIIA will continue to provide feedback to government on AI and generative AI.
“AI offers incredible opportunities for our economy, it is important our governments develop strong frameworks that build confidence and harness these possibilities,” he said.
Do you know more? Contact James Riley via email.