The core ethics of data analytics


James Riley
Editorial Director

Australian businesses and governments that want to exploit the value in personal data need to first exercise their “moral imagination” because tick-a-box compliance is failing.

Speaking at Data Privacy Asia Pacific in Sydney yesterday, The Ethics Centre executive director Simon Longstaff warned: “When technical mastery is divorced from ethical intent, you get tyranny.”

Just because organisations could access and make use of vast reservoirs of personal data did not mean that they should, he said.

Simon Longstaff: Technical mastery divorced from ethical intent is tyranny

Dr Longstaff said that organisations needed “moral imagination” about the impact of using data in novel ways, and posited the development of an “ethics skid pad” which would allow organisations to consider what could happen depending on how personal data was used before deploying such solutions.

He warned that “the old model of compliance, where regulators sought restrictive rules, decreased the capacity for ethical judgement,” and said organisations now needed to apply their own ethics lens when developing new applications.

Organisations, however, still need to navigate significant regulatory hurdles around the use of personal data, including compliance with the Australian Privacy Principles, and to develop processes for mandatory data breach notification, which comes into force next February.

In addition, Australian enterprises that handle the personal data of anyone in Europe need to prepare for the General Data Protection Regulation, which comes into force next May.

Non-compliance with elements of the GDPR could result in fines of up to 4 per cent of global revenues, regardless of where the organisation is domiciled – though policing the regulation outside of Europe could prove challenging.

The conference attracted privacy professionals from around the region along with experts from Europe and North America.

The need for an ethical, rather than purely regulatory, framework to guide organisations in the use of personal data was reinforced by Peter Cullen of the Information Accountability Foundation. He said organisations “need to leverage data to provide value – but in a way that responsibly protects it.”

The foundation’s framework is an attempt to help steer organisations through the decisions involved in responsible data use and to build trust with consumers and citizens.

It’s not the only such effort underway. Data Governance Australia recently released a draft code of conduct on how its members should use personal data.

In an address to the National Press Club yesterday, DGA chairman Graeme Samuel noted: “For every innovation with data, there is a corresponding deviation. Nothing exemplifies this more than the Centrelink and ATO data-matching program, with allegations of overpayments to individuals based on very simplistic and inappropriate data matching.”

He also acknowledged the risk of the private sector overstepping the mark in its enthusiasm to use personal data to develop new products or services. More regulation, however, is not the answer according to Mr Samuel.

He said that, to date, “issues concerning the use of data by companies have been viewed and managed largely as a function of legal compliance. The emphasis has been on staying safely inside the rules, not on delivering the best outcome to consumers. This approach is not viable in the long term and will stifle innovation.”

Regulation should only be a response to market failure, he argued.

Rob Sherman, Facebook’s deputy chief privacy officer and a panellist at the conference, said an ethical framework was the only workable approach.

It was not possible to be prescriptive about how data should be used, as that risked limiting the benefits both to the consumer or citizen and to the corporation or government using the data, Mr Sherman said.

Regulation would also struggle to keep pace with technological development, he said. The looming impact of big data, artificial intelligence (AI) and machine learning on privacy was noted by a series of speakers at the conference.

Simon Entwistle, deputy commissioner of the UK Information Commissioner’s Office, noted that as machine learning and AI took hold it would become increasingly difficult to know how people’s data was being used, or whether de-identified data was being transformed back into personal data.

“Taking an ethical approach is more important than ever – to go beyond compliance to the underlying legal obligations,” he said.

While large organisations like Facebook may have assembled substantial ethics teams and frameworks to assess how they should use users’ data, Mr Entwistle said that “for smaller organisations this can be done without the need to implement cumbersome processes.

“The Sunshine Test – will it strengthen the organisation or threaten it if the details are made public? And the Granny Test – would I be comfortable if an organisation was using my grandparents’ data this way? These are simple tests you can apply.”

Getting it right is important, according to Timothy Pilgrim, Australia’s information and privacy commissioner, who said that “data innovation and personal data protection are fiercely interdependent”, and that without such protection, trust would be eroded.

He said a 2017 survey revealed that 58 per cent of Australians had avoided a business because of concerns over how it might use their personal information, and 44 per cent had not downloaded an app for the same reason.

Asked whether they were comfortable with the government using personal information for policy or service development, only 46 per cent of respondents said they were.

Mr Pilgrim also noted that 86 per cent of respondents believed secondary use of personal data – use for a purpose other than the one for which it was originally gathered – represented misuse.

“A successful data-driven economy needs a strong foundation in privacy,” he said.

Do you know more? Contact James Riley via email.
