Data sharing not for the faint-hearted

James Riley
Editorial Director

The NSW Government chief data scientist Ian Oppermann has been developing governance frameworks for data sharing in national and international forums for the past two and a half years.

This is not everyone’s cup of tea. But with the launch via the Australian Computer Society of a whitepaper, Privacy in Data Sharing: A guide for business and government authored by Dr Oppermann, Australia is a step closer to an overhauled system for putting risk mitigation strategies around data sharing exercises.

Data sharing is a hot topic, albeit one that is not well understood in the general community. As this paper notes, since the beginning of the internet and the development of cookies, we have all been generating data about personal interests and preferences through our web browsing and online purchases.

The proliferation of mobile devices and the rapid growth of social networks has unleashed unprecedented troves of information about locations and relationships, together with life events, plans, personalities and purchases.

The conceptual notion of sharing data is appealing: it promises to unlock not only commercial value, but also valuable insights for service delivery.

Dr Oppermann has been working within the International Electrotechnical Commission (IEC), as well as in a collaboration between the IEC and the International Organization for Standardization (ISO) called JTC 1 – a joint technical committee that has been running since 1987.

This has been hard work over a long period. But the foundational assurance frameworks that are being developed should give government organisations in particular some comfort about the data they open to public access.

A year ago, in a separate paper published via the ACS, Dr Oppermann outlined a ‘Five Safes’ framework. In effect, it spelled out a conceptual understanding of the five areas in which an organisation must understand the nature of its data handling.

  1. Safe People – refers to the knowledge, skills and incentives of the users to store and use the data appropriately
  2. Safe Projects – refers to the legal, moral and ethical considerations surrounding the use of data
  3. Safe Setting – refers to the practical controls on the way the data is accessed
  4. Safe Data – refers primarily to the potential for identification of the data
  5. Safe Outputs – refers to the residual risks of publishing the data

Dr Oppermann says the Five Safes is an easy way to conceptualise the issues. If each of the five areas is 100 per cent safe, then perfection has been attained. But life is not full of absolutes.

The real work has been on developing methodologies for understanding where different parts of the model are not perfect, and how you can assess risk against this model.
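To make that idea concrete, one way to picture the shift from a binary conceptual model to a graded risk assessment is to score each of the five dimensions on a continuous scale and combine them. The sketch below is purely illustrative: the 0.0–1.0 scale, the example scores and the multiplicative combination rule are assumptions for this article, not part of Dr Oppermann's published methodology.

```python
# Illustrative only: a toy scoring of the Five Safes, where 1.0 means
# "completely safe" on that dimension. The multiplicative combination
# rule is an assumption, chosen because weakness in any one dimension
# should drag down the overall arrangement.

FIVE_SAFES = ("people", "projects", "setting", "data", "outputs")

def overall_safety(scores: dict) -> float:
    """Combine per-dimension scores (0.0-1.0) into a single figure."""
    missing = set(FIVE_SAFES) - set(scores)
    if missing:
        raise ValueError(f"missing dimensions: {sorted(missing)}")
    result = 1.0
    for dim in FIVE_SAFES:
        s = scores[dim]
        if not 0.0 <= s <= 1.0:
            raise ValueError(f"{dim} score out of range: {s}")
        result *= s
    return result

# A hypothetical sharing arrangement: strong people, project and
# setting controls, but the data itself carries re-identification risk.
example = {
    "people": 0.9, "projects": 0.95, "setting": 0.9,
    "data": 0.6, "outputs": 0.8,
}
print(round(overall_safety(example), 3))  # prints 0.369
```

The point of such a sketch is only that a single weak dimension (here, ‘Safe Data’ at 0.6) dominates the overall result, which mirrors the intuition that the Five Safes only give comfort when every dimension is addressed.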

“A lot of people have been looking at the Five Safes, but it’s currently a conceptual model. If you have a safe person working on a safe project with a safe platform and safe data to produce safe outputs, then everything is great,” Dr Oppermann said.

“So conceptually it works very well, but that’s not how [the real world works]. So we have been trying to explore what it means between the black-and-white of complete care versus no care, and completely screened people and processes versus really releasing data to anybody who has a desire to access [it],” he said.

Dr Oppermann will join the Dataconomy 2018 forum in Sydney on the morning of December 12.

Do you know more? Contact James Riley via Email.
