Getting on the policy front-foot


James Riley
Editorial Director

Government must get on the front foot in embracing and regulating new technologies to ensure Australia can be a “test bed” for the world’s great innovations, Assistant Innovation Minister Craig Laundy says.

Speaking on Q&A’s ‘innovation special’, Mr Laundy countered the “doom and gloom” concerns about automation, AI, robots and job losses, arguing that if the education system and governments can adapt, there are many opportunities on offer.

“We as a government need to be very smart about how we regulate, and it needs to be light-touch,” Mr Laundy said. “With 24 million people here we are the perfect test bed for the world’s technology. We could be a sandbox and let overseas companies come here, play and innovate.”


“But we can only do that if we’ve got a good vision of what’s on the horizon now, and what’s coming at us. It’s not to be feared; it’s to be embraced,” he said.

Mr Laundy was joined on Q&A by Labor’s digital economy spokesman Ed Husic, drone expert Dr Catherine Ball, Sydney Business Insights lead Sandra Peter, and the University of Sydney ambassador for maths and science Adam Spencer.

The program discussed the future of work, education in a digital world, the government’s role in regulation, and the ethical dilemmas of driverless cars.

Education will have to quickly evolve beyond its traditional form in order to prepare people for the work of the future, Mr Laundy said.

“People in school now will have an average of 17 jobs in their lifetime, and five [different] careers. University ain’t going to be like it is today moving forward,” he said.

“[Instead of] that fixed three- or five-year trend, you’ll see the rise of re-skilling and lifelong learning, with micro-credentials.

“Training will adapt your skills at that point to target them at another [career role]. And this will change the nature of how our education system works as well,” he said.

“It’s about aiming you in the right spot and if that door closes, redirecting you to one that’s opening up.”

Mr Husic said it was important to prepare all Australians for the digital changes that are coming, to ensure no-one is left behind.

“This is an important conversation to have. I know that the needle moves relatively quickly from awe to anxiety with technology, and we always have to be very careful about that,” Mr Husic said.

“It’s not just about the coding, it’s about the way in which you’re encouraging young students to engage with technology, to think about it. Coding opens up that pathway in the minds of young students.”

For Dr Ball, 21st century skills like coding are crucial, and should be taught from a young age.

“Coding is absolutely going to be a key skill. If you can write code, you’re going to be pretty much guaranteed a job. Coding is the new ‘learning French’. The idea is of skills, rather than actual jobs,” Dr Ball said.

Both politicians agreed that all levels of government in Australia need to be on the front foot in regulating emerging technologies and embracing innovation. Mr Husic added that the private sector also needs to lift its digital engagement to avoid another situation like the one currently playing out with Amazon.

“We scramble for a response, but we know this is coming,” Mr Husic said.

“We know we can use technology better, but [for some reason] we wait until the very last moment, and then we have to turn to government to step in and sort out the regulations,” he said.

“It really should have been a two-way conversation. There’s a way businesses and government can work together to do better for the community.”

Mr Husic pointed to artificial intelligence as an area that should be carefully investigated right now.

“It’s very woolly around the boundaries for AI, with how governments should be stepping forward. In Australia we are a trusted voice on the world stage and we should be putting that voice to use and getting other nations thinking about how AI will be developed and used,” he said.

“We shouldn’t just let this stuff go along its way and then scramble for a response [at the last minute],” he said.

Ms Peter also raised another AI issue that has been especially relevant this year: the in-built biases of the algorithms used to make important decisions.

“These biases are not of our own making, we don’t train them to be biased, but they’re modelled on the real world,” Ms Peter said.

“They creep into how we get our loans, who gets a job, who gets paroled and who goes to jail. That’s the real fear with AI, that these sort of biases are built in – we don’t know about them and we don’t know how to fix them,” Ms Peter said.

The government’s use of big data and algorithms has come under scrutiny this year, including over the robo-debt debacle, the welfare drug-testing trials and its immigration detention centres.

The panel also grappled with the current debate over driverless cars, and how they should be programmed to react in certain situations.

“Who should die in the case of driverless cars? If I’m driving down a road and a little child runs in front of the car, should the car kill me and drive me into a pole to save the child? The problem is who is making that decision? How is there a right decision there?” Dr Ball said.

Mr Laundy said this was the “perfect overlay” of an area where government needs to get involved in regulation, and also to tackle the moral and ethical dilemmas.

Do you know more? Contact James Riley via Email.
