Calls for Australian ‘data code’ to protect children online


Joseph Brookes
Senior Reporter

Australia needs a “data code” for children to ensure popular digital services like Instagram and TikTok are not collecting and processing information in harmful ways, according to a coalition of digital rights groups.

Several countries, including the United Kingdom and Ireland, have recently introduced codes requiring online service providers such as social media companies to make “the best interest of the child” the primary consideration when developing any service likely to be accessed by children.

A local campaign for a similar code is now being led by Reset Australia, which has released a report showing young Australians overwhelmingly do not fully understand the digital bargain they strike with popular digital service providers, and that young users are often “nudged” into agreeing to inaccessible terms and conditions.

The digital rights group analysed the terms and conditions of 10 popular online services and surveyed 400 16- and 17-year-olds on their experiences using them, finding users would need a tertiary-level education and nearly two hours on average to read the terms and conditions (T&Cs).

Fewer than one in 20 of the young Australians surveyed by Reset Australia said they always read T&Cs. Fewer than 15 per cent read them most of the time, 38 per cent read them some of the time, and 45 per cent said they never read T&Cs.

Nine of the 10 providers’ T&Cs required a tertiary degree-level education to understand, while the remaining provider’s terms required a late high school reading level. All of the providers allow users as young as 13 to sign up.

Reset Australia is calling for a data code for children after its research showed few young people read or could understand the terms and conditions of platforms like Instagram and TikTok.

Report co-author Dr Rys Farthing said digital service providers are encouraging Australians as young as 13 to join their platforms but are not making the terms of involvement clear or accessible to them.

“They don’t make their terms and conditions accessible to those younger users,” Dr Farthing told InnovationAus.

“And it’s just ridiculous to think of a service that says ‘hey, great, children and young people are welcome’ but then doesn’t put in place the provisions and protections to actually ensure that children and young people can meaningfully engage with those platforms.”

Digital service providers are under growing pressure to be more upfront about the business model underlying their “free” services, where user data is typically used to sell targeted advertising, often via a murky advertising technology ecosystem.

Reset Australia’s report found the T&Cs that collect consent for this data processing are not presented in a way that helps improve understanding, and eight of the 10 service providers analysed used “dark patterns” to nudge young people into accepting them.

“For example, six platforms inferred consent when users click next, and six present ‘data maximising options’ as the best user experience,” the report said.

In Australia, the consumer regulator is currently examining the online advertising ecosystem as part of changes recommended in its landmark digital platforms inquiry. That inquiry also recommended changes to Australian privacy legislation, which are now underway but have recently stalled.

Dr Farthing said the local law reforms, which include a focus on protecting vulnerable groups like children, are an opportunity to implement a code similar to those in the UK and Ireland.

“Take a page out of that book. Because we know it’s an upstream intervention that can make the digital world better for children and young people,” she told InnovationAus.

“It’s tried and tested because it’s working there, and it creates interoperable policy requirements on these tech companies. It means that what they’re doing in Europe they just have to turn on for children and young people in Australia as well.”

The UK Age Appropriate Design Code came into force last year and will be enforced from this September, requiring service providers to put the “best interests of the child” first when designing and developing apps, games, connected toys and websites that are likely to be accessed by children.

Once the UK code becomes enforceable later this year, the country’s data regulator will begin proactive audits of service providers’ compliance. Non-compliance could amount to a breach of the GDPR, the overarching data protection regulation in the EU and UK, which carries significant financial penalties.

Dr Farthing said an Australian code should be similarly legislated, or at least regulator-led, because the online services industry has shown that self-regulation is not working.

“When it comes to personal data, but particularly when it comes to children and young people’s data, it’s really clear that self-regulation has failed,” Dr Farthing said.

“And these digital platforms and services have been sort of setting their own rules around what requirements are to work out if a young person has consented or not, and what they can and can’t do with young people’s data. They’ve been making those rules by themselves, and they haven’t got a good track record.”

Reset Australia is joined by several other groups in its campaign for a Children’s Data Code, including UNICEF Australia, the YMCA and the Australian Child Rights Taskforce.
