The Ethics Centre is testing a framework designed to steer business decision-making around innovation and invention, in a bid to ensure that in the race to transform themselves organisations don’t obliterate the well-being or trust of customers and the general public.
According to Ethics Centre executive director Simon Longstaff, technological advance demands great ethical vigilance: “If you divorce technical mastery from ethical restraint, then you have the seed of tyranny. You see it throughout history.
“People get a new powerful way of doing something – if they do not exercise restraint they become corrupted by it.”
No matter how urgent the race to innovate, Dr Longstaff says every business needs to take careful stock of what is being undertaken, why, and attach to that an ethical framework relating to purpose, values and principles.
“The danger is if they don’t do that they firstly will proceed along a course of action that leads to very serious loss or damage to them or the wider society,” he said.
“When you talk to people engaged in that they tell you that when they look back they can’t believe they did it. That at the time they did it they didn’t see it and the reason they didn’t see it was that everyone was doing it.”
Think ball tampering, bankers’ antics and Cambridge Analytica.
It’s not good enough, says Dr Longstaff, who believes public patience with bad behaviour has worn thin, as evidenced by the outcry over Australia’s cricket cheats.
“It’s not so much about cricket – it’s that this rot, this contagion has spread from things like politics into business, churches – and the stain gets larger and larger and now it’s infected cricket.
“People in these organisations forgot the circumstances for which they were established.”
He does not advocate more regulation, saying that could in fact stifle innovation – but he does stress that organisations need to use ethical frameworks and build ethical cultures to steer innovation and build trust.
The Ethics Centre has for the last ten months been working on a project to create just such an ethical framework for the development of all new technologies, “not to be applied after the design but to be built in to the design itself,” says Dr Longstaff.
It’s ethics by design in the same way that privacy advocates recommend privacy by design be built into new products and services from the get-go.
The first version of the framework is currently being tested and is equally applicable “whether you are working on a genetically modified organism or working on AI or big data or a toaster for someone’s house.”
While the framework ensures ethics are considered during the innovation process, Dr Longstaff stresses that when it comes to innovation, “can does not mean ought”, and that human agency must be maintained and exercised when required.
“The changes we face are profound enough to be labelled correctly as introducing civilisational change.
“When you come to things like neural networks and the vast data sets being used – a lot of the time people who are getting results from machine vision and so forth don’t precisely know how it’s happened.
“They know they fed the data in and created vision – but if you said can you follow the logic step by step in the way you can in Boolean logic they would say ‘no we cannot do it’,” says Dr Longstaff.
“It becomes absolutely essential in those circumstances that the ethics be written in at the beginning of the design process,” and that the users of the technology apply their own ethical compass in terms of deployment.
“You never lose your obligation as an innovator to deal with those things which are reasonably foreseeable, and you need to put in place the checks and balances that stop the negative effects that could happen if things are misused.”
At present, though, he doubts society’s ability to transition to a more automated, AI-infused future in either a just or orderly fashion.
“People think it is going to be a bit like today but just a bit different – and that things will sort themselves out. But we need seriously to plan for this and make sure the transition does not just displace a whole lot of people.”
He particularly questions how civil society will be affected by AI, such as the advent of voice-activated devices.
“The internet of things and its intersection with security agencies and data manipulation have not really been thought about. You are connected to the internet in a house where your door lock is no longer controlled by key but by a keypad.
“It’s not fanciful to think that a security agency listening in might put two and two together and make 12 and decide to engage in pre-emptive detention by locking you into your own house.
“That sounds like a dystopian view, but all of this is possible just as are all the really fantastic things. They are what we should celebrate and enjoy – but these other negative things I don’t think we are thinking about deeply enough.”
Do you know more? Contact James Riley via email.