Push for a Royal Commission into ‘robo-NDIS’


Stuart Mason
Contributor

The Royal Commission into Robodebt has heard significant revelations about the series of failures that led to the unlawful, tech-based scheme. 

Robodebt involved the use of an algorithm to average out a welfare recipient’s yearly income using data from the tax office cross-matched with income reported to Centrelink. If the system identified a discrepancy, a “please explain” notice was sent to the recipient, with the onus placed on them to prove the debt didn’t exist. 

The system was found to regularly match this data incorrectly, creating debts that were inaccurate or did not exist at all. The scheme was eventually found to be unlawful, with the federal government agreeing to refund $720 million in debts, scrap a further $400 million in debts and provide $112 million in compensation. 
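
For illustration only, the sketch below is a simplified model of the income-averaging logic described above, assuming a yearly figure spread evenly over 26 fortnights; it is not the actual Centrelink or ATO system, and all names and figures are hypothetical.

```python
# Illustrative sketch only: a simplified model of the income-averaging approach
# described in this article, NOT the real Centrelink/ATO system. All names and
# figures are hypothetical.

def averaged_fortnightly_income(ato_annual_income: float) -> float:
    """Spread a yearly ATO income figure evenly across 26 fortnights."""
    return ato_annual_income / 26


def raises_discrepancy(ato_annual_income: float,
                       reported_fortnights: list[float]) -> bool:
    """Flag a 'please explain' notice if the averaged figure exceeds what the
    recipient actually reported to Centrelink in any fortnight."""
    average = averaged_fortnightly_income(ato_annual_income)
    return any(average > reported for reported in reported_fortnights)


# Example of the flaw: $26,000 earned entirely in the first half of the year,
# reported honestly each fortnight. Averaging makes the second half of the year
# look under-reported, so a debt that does not exist is flagged.
honest_reports = [2000.0] * 13 + [0.0] * 13
print(raises_discrepancy(26_000, honest_reports))  # True
```

As the example shows, averaging a lumpy annual income over equal fortnights can manufacture a discrepancy even when every fortnightly report was accurate, which is the failure mode at the heart of the scheme.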

The Royal Commission has held hearings this year to investigate the scheme and how similar failures can be prevented in the future. 

Centre for Digital Business CEO and former head of the NDIA Technology Authority Marie Johnson

A grassroots campaign has been launched pushing for a similar Royal Commission into the National Disability Insurance Agency (NDIA) and the use of technology and automation within it. 

In this episode of the Commercial Disco podcast, Centre for Digital Business CEO and former head of the NDIA Technology Authority Marie Johnson discussed why this Royal Commission is needed, what’s going wrong at the agency and how artificial intelligence and automation can be used as a tool for good. 

“Robodebt and robo-NDIS were created at the same time, within the same bureaucracy by the same people using the same systems and subject to the same whole-of-government automation strategy,” Ms Johnson told the Commercial Disco podcast. 

“Even as the Royal Commission was hearing about robodebt, those practices were still being undertaken in the NDIA.” 

Ms Johnson said that automated assessments are being used at the NDIA to determine funding, eligibility and debts. 

The former Coalition government attempted to introduce independent assessments to the NDIS, involving an interview and then the use of an algorithm to determine how much funding to provide. This was dubbed “robo-planning” by the then Labor Opposition and disability advocates, with wider concerns raised about a “robo-NDIS”. 

“All this is being driven by the same algorithms – so we believe that there are grounds for a Royal Commission, or for the robodebt Royal Commission to be extended, because we believe that the terms of reference cater for the Royal Commission to investigate other like-areas to ensure that it never happens again,” Ms Johnson said. 

“I don’t think that anybody truly has an understanding of the scale of the issues in the NDIA at the moment. I think it would really shock people to know that there are activities that are multiples more extreme, if that was even possible, than robodebt.” 

The federal government launched an independent expert review of the design, operations and sustainability of the NDIS, co-chaired by Professor Bruce Bonyhady and Lisa Paul. 

But Ms Johnson said there have already been a number of reviews into the scheme, and the powers of a Royal Commission are needed to get the full picture of the issues within the agency. 

“My concern with this independent review, as well intentioned as it is, is it’s just another review. All of those reviews didn’t get into the issues of robodebt,” she said. 

“The bureaucracy itself frustrated those inquiry processes – the NDIS independent review doesn’t have the power to investigate. There are people I know who might want to come forward and tell their story about what they know, both as providers and staff, and they won’t do that. 

“The independent review can’t give protection such as parliamentary privilege, or the protections of a court, and can’t compel witnesses. They can’t overcome the veils of FOI, secrecy and confidentiality, and it really can’t refer matters for criminal investigation or prosecution. There is much that remains hidden, and that’s why a Royal Commission is needed.” 

These sorts of investigations are needed, she said, because some of the technologies in question, such as artificial intelligence, can also be used as tools for good. The push comes on the back of a number of high-profile tech experts signing an open letter calling for a pause on artificial intelligence development in light of the rapid growth of ChatGPT. 

“I think that the use of GPT and similar types of technologies can be used for civil liberty action as well. I think it’s useful for everybody to really examine how these are going to be used, and not just to generate articles and regurgitate things on the internet for people’s amusement,” Ms Johnson said. 

“I think there’s far more serious upsides to how these can and should be used, as well as the downsides that we don’t really understand.”

Do you know more? Contact James Riley via Email.
