Medcraft on ethics and algorithms


James Riley
Editorial Director

Businesses and government must be held accountable for their use of computer algorithms and big data, and transparency should be the first priority, ASIC chairman Greg Medcraft said.

Addressing the FINSIA Regulator Panel in Sydney late last week, the corporate regulator boss called for more responsibility in the use of algorithms that impact people’s lives, and also outlined ASIC’s own plans to utilise big data responsibly.

The use of computer algorithms in business and government decision-making has been placed under scrutiny recently, with questions arising over a lack of transparency and the potential for discrimination and marginalisation.

Greg Medcraft: Businesses and agencies deploying algorithms must be accountable and transparent

Mr Medcraft said organisations and agencies employing algorithms need to be accountable and transparent.

“As we move into ever-increasing automation and machine-based decision making, we need to make sure we do not neglect the issue of accountability,” Mr Medcraft said.

“No matter how far technology and innovation goes, and how intelligent machines become, we need to be conscious of accountability and where risk is placed.

“We cannot use technology platforms or algorithms to simply shift risk to the consumer or other areas of society. It can’t be like, ‘the algorithm ate my homework’.”

There are three core areas that need to be addressed when using computer algorithms, Mr Medcraft said: there must be a “responsible person” overseeing the algorithm; its decisions need to be easily explained; and there need to be avenues to appeal those decisions.

“For any algorithmic system, there needs to be a person who is responsible for its design, and its outcomes. Automated decisions must be able to be meaningfully explained to customers, to the regulator and to any other interested stakeholders,” Mr Medcraft said.

“If and when algorithms make mistakes, whether because of data errors in their inputs or because of issues with their design, there need to be avenues for redress,” he said.

The comments follow a series of controversies relating to the federal government’s own use of automated algorithms to make important individual decisions.

Most recently, it was revealed that the federal government had quietly rolled out the use of a computer algorithm to determine the security risk of detainees at its immigration detention centres.

The use of automated decisions was also placed front and centre during the robo-debt fiasco earlier this year, when thousands of letters were automatically sent to Australians incorrectly claiming that they had been overpaid welfare payments and now owed money.

The government had also originally planned to use a “data-driven profiling tool” to determine the three sites at which drug testing of welfare recipients would be undertaken, but this was put on hold after Data61 pulled out of the plan.

The government’s use of algorithms in these cases has been criticised by experts for failing to meet the three core requirements set out by Mr Medcraft.

UTS Business School associate professor Bronwen Dalton described the government’s recent use of computer algorithms and big data as “problematic”.

“The ethics of this approach should be seriously considered as the consequences of getting such high-stakes decisions wrong can be devastating for those inaccurately targeted. We must ask ourselves if it is acceptable to assess people as members of a statistical group rather than as individuals when the stakes are so high,” Ms Dalton told InnovationAus.com.

Outgoing Victorian Privacy Commissioner David Watts also slammed the government’s use of algorithms, saying it lacked transparency and a means for appeal.

“We don’t know whether the process is legal or not because it’s not transparent. We don’t know how the algorithm works, we don’t know what data is being assessed and how that is going to influence the decision making,” Mr Watts told InnovationAus.com.

“There can be great uses of big data but this one seems to be lacking transparency. There’s just a whole range of risks that we don’t have answers for.”

Government departments and agencies are increasingly turning to algorithms to save money and time, but time needs to be taken to ensure this is done safely and transparently, he said.

“A lot of federal government departments are interested in using big data to introduce administrative efficiencies, reduce complexities and speed up decision making. They’re looking to big data solutions to do that.

“My fear is that just like any other time-saving device, it’s so new that the ethics and legality of it haven’t been properly assessed and understood,” he said.

While warning that the proper precautions need to be in place, Mr Medcraft also outlined how regulators need to embrace the “opportunities and efficiencies” that big data can provide.

The organisation’s program of transformation – called One ASIC – is about better “connecting the dots” to make the organisation a more “pre-emptive and proactive regulator”, he said.

“It is about working together seamlessly to use our data, resources and regulatory tools in the most effective way possible across our organisation,” Mr Medcraft said.

He said ASIC would soon launch its own data strategy, outlining why data is important to the organisation, how it will capture, share and use data, and its implementation plan, all with the goal of making ASIC a “more data-driven, intelligence-led law enforcement agency”.

Do you know more? Contact James Riley via Email.
