Algorithms, big data and the government
Outsourcing decisions: Computer algorithms are an ethical issue in government processes
The Australian Government is increasingly turning to computer algorithms to make important decisions in a move that experts say comes with a number of ethical and legal issues.
It was revealed this week by Fairfax Media that the government had quietly rolled out the use of a computer algorithm in its 13 immigration detention centres to determine the security risks of detainees, replacing decision-making by humans.
The department’s Security Risk Assessment Tool (SRAT) is fed data on an individual’s behaviour prior to and during detention, signs of violent or aggressive behaviour, and known associates to assess whether they “pose an unacceptable risk to the community”.
“It also considers each detainee’s individual circumstances, including age and health. As a result of these and other changes there has been a significant decrease in incidents in detention including assault and self-harm,” an Immigration Department spokesperson said.
The automated computer algorithm has been in use since late last year, but was only publicly revealed by former Australian Human Rights Commission president Gillian Triggs last week.
“The use of an algorithm to replace professional judgements – I thought this can’t be true, I must be back in 1984,” Ms Triggs said in a speech last week.
“They pump in statistical details and out comes a response that dictates whether they are in a high-security area or whether they are allowed certain privileges within the detention centre.”
The use of big data and algorithms to make important decisions without any human oversight is “problematic” both in terms of efficacy and ethics, UTS Business School associate professor Bronwen Dalton said.
“One must acknowledge that history is not destiny. The algorithm fails to take into account any genuine rehabilitation the person in question may have achieved, any changes in the state of mental health, or those that may not have a history but could be assessed as likely to be dangerous in the future,” Ms Dalton told InnovationAus.com.
“Such outcomes can be more accurately determined by mental health professionals.”
“The ethics of this approach should be seriously considered as the consequences of getting such high-stakes decisions wrong can be devastating for those inaccurately targeted,” she said.
“We must ask ourselves if it is acceptable to assess people as members of a statistical group rather than as individuals when the stakes are so high.”
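Ms Dalton’s point about assessing people as members of a statistical group can be made concrete with a toy sketch. Everything below — the factors, weights and thresholds — is invented purely for illustration; nothing is publicly known about how the SRAT actually computes its assessments:

```python
# Hypothetical illustration only: a toy actuarial-style risk score of the
# general kind described above. All factors, weights and cut-offs are
# invented and bear no relation to the actual SRAT.

def risk_tier(factors: dict) -> str:
    """Map a person's recorded factors to a coarse risk tier."""
    # Invented weights: each recorded factor adds a fixed penalty,
    # regardless of context, recency or rehabilitation since.
    weights = {
        "prior_violent_incident": 3,
        "aggressive_behaviour_flag": 2,
        "known_risky_associates": 2,
        "age_under_25": 1,
    }
    score = sum(w for k, w in weights.items() if factors.get(k))
    # Fixed cut-offs convert the number into a security placement.
    if score >= 5:
        return "high"
    if score >= 2:
        return "medium"
    return "low"

# Two people with identical recorded histories receive identical tiers,
# whatever their individual circumstances today.
print(risk_tier({"prior_violent_incident": True, "known_risky_associates": True}))
```

The sketch shows the structural objection: the output depends only on group-level statistical markers, so genuine rehabilitation or changed circumstances cannot move the score.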
The SRAT raises legal concerns that are impossible to properly answer due to a lack of transparency, said outgoing Victorian Privacy Commissioner and La Trobe University professor of law David Watts.
“We don’t know whether the process is legal or not because it’s not transparent. We don’t know how the algorithm works, we don’t know what data is being assessed and how that is going to influence the decision making,” Mr Watts told InnovationAus.com.
“What we have here is a non-transparent process where we don’t know whether the automated process that’s been put in place is consistent with the law.
“There can be great uses of big data but this one seems to be lacking transparency. There’s just a whole range of risks that we don’t have answers for.”
The algorithm is in use across Australia’s 13 immigration detention centres, and could hinder detainees’ and their lawyers’ attempts to appeal decisions.
“There’s no document trail, there’s just an input of a name into the algorithm and a decision is spat out, so how do lawyers know if you’ve been assessed correctly?”
“The fact that we don’t know how these things work means the rules can be changed without anyone knowing about it,” Mr Watts said.
It’s not the first time the government has turned to computer algorithms and big data to make important decisions and cut costs.
The robo-debt program, which saw automated letters sent to thousands of Australians incorrectly claiming they had been overpaid welfare and now owed money, has been widely criticised after its system inaccurately matched data from the ATO.
The government has now agreed to return a level of human oversight to this decision-making process at Centrelink.
The government also originally planned to develop a “data-driven profiling tool” with Data61 to pull together various data sets into an algorithm in order to select three test sites for its welfare drug testing pilot.
But it was revealed this month that Data61 never signed the contract and decided not to go ahead with the plan.
Mr Watts said this is part of a wider trend across government to save time and money.
“A lot of federal government departments are interested in using big data to introduce administrative efficiencies, reduce complexities and speed up decision making. They’re looking to big data solutions to do that,” he said.
“There’s nothing new about algorithms, but what is new is the fact that it can all be automated now, and that’s why lots of federal government departments are trying to do it now. My fear is that just like any other time-saving device, it’s so new that the ethics and legality of it haven’t been properly assessed and understood.”
There is a place for the use of computer algorithms in public policy and decision making, but this needs to be done with proper transparency and safeguards, he said.
“There is room for having automated processes to help you make decisions. The question is, at what point is that reviewable by a human being to ensure that it actually complies with the law?” Mr Watts said.
These prediction systems are known as “actuarial risk assessment instruments”, Ms Dalton said, and are becoming increasingly appealing for governments.
“The pursuit of computational prediction methods has become big business. Companies are approaching jurisdictions all over the world promising accuracy and that these tools will save government money,” Ms Dalton said.
“But if these models fail to predict and wrongly penalise, are these savings worth it?” she said.