Regulator urged to reject industry’s social media safety codes


Joseph Brookes
Senior Reporter

Child welfare groups are asking Australia’s online safety regulator to reject the industry codes Meta and TikTok say will keep their young users safe, warning the industry-written rules do little to improve safety and trail international efforts.

The eSafety Commissioner is currently considering whether to register the industry safety codes developed by internet companies last year to regulate the treatment of certain online content.

Part of the controversial Online Safety Act, the self-regulatory approach has been criticised for offering relatively weak protections after drafts of the eight codes were released last year.

Among them are the Social Media and Relevant Electronic Services Codes, which would set rules for how platforms like Facebook, Twitter and TikTok deal with harmful content and protect young users.

A coalition of child safety groups — made up of the Australian Child Rights Taskforce, ChildFund, Bravehearts, and Reset Australia — on Wednesday asked eSafety Commissioner Julie Inman Grant not to register the proposed codes and to meet with the group for advice on alternatives.

“Neither the versions of the Social Media or Relevant Electronic Services Codes strengthen existing safety standards,” the letter to Ms Inman Grant says.

“Rather they appear to document the status quo and note practices that already happen. In some instances, the Codes actually commit companies to lower standards than they are already operating at.”

The group says the proposed social media industry codes, and the way services are designated under them, would lower the safety bar, including by allowing some companies to stop their current practice of scanning their services for child sexual exploitation and abuse material.

The proposed codes also permit weaker protections than are required in other markets where regulator-led rules are in place, the letter said.

Of particular concern is the Australian approach to default settings for users aged between 16 and 18.

In the UK, Ireland, and California – where regulators or legislators have developed rules for default privacy settings – social media companies are required to apply tougher default privacy settings to 16- to 18-year-olds.

But the Australian code requires the tougher default privacy settings only for users up to 16, leaving Australian teenagers aged 16 to 18 “less protected”.

“The [industry code] versions under consideration would allow these platforms and services to effectively ‘turn off’ safety features and options for children in Australia. This would be a weakening of online safety,” the letter said.

The group asks to meet with Ms Inman Grant and urges her not to wait for the review of privacy law before deciding on the codes.

“Safety and privacy are connected experiences. Where a young person’s account is private, they are not recommended as ‘friends’ or as accounts to ‘follow’ to adult strangers. As Meta found, 75 per cent of all ‘inappropriate adult-minor contact’ on Facebook was a result of their ‘People You May Know’ friends recommendation system. An attempt to lock in lower standards while we wait for a review of the Privacy Act is an unacceptable approach to safety.”

Officials from the Office of the eSafety Commissioner told an Estimates hearing in November that the regulator had provided some feedback to industry on the draft codes, including that privacy-restrictive settings for children “would be appropriate”.

The industry codes were due to the Office by November 18.

The Office of the eSafety Commissioner was asked to provide copies of the draft codes to the Senate, taking the request on notice. No response has yet been published, but a version of the draft codes is available online.

