eSafety chief readies for online content crackdown


Denham Sadler
National Affairs Editor

The eSafety Commissioner has opened consultations on a range of new measures to crack down on social media firms and other platforms, aimed at blocking underage individuals from accessing online pornography and material classified R18+ or higher.

The eSafety Commissioner's powers are being expanded following the passage of the Online Safety Act through Parliament earlier this year, with the new scheme to come into effect in 2022.

The Act requires an update to the Restricted Access System declaration, under which content classified R18+ or higher must be placed behind a restricted access system so it cannot be accessed by children.

This update will broaden its scope to cover a wide range of online platforms, email and messaging services and Australian hosting service providers.

Julie Inman Grant: Australian eSafety Commissioner

The Commissioner is also looking at a potential age verification system targeted specifically at preventing anyone aged under 18 from accessing pornography, with a new roadmap to be handed to the government next year.

The roadmap will examine technological approaches to online age verification. It comes just months after the federal government agreed in principle to use its digital identity scheme to verify the ages of individuals before they access online pornography or gambling sites, although no action is expected on this until 2023.

Under the now-passed Online Safety Act, the Commissioner will have the power to issue take-down notices to a range of services in relation to harmful online content. For other content, the Commissioner will be able to issue a remedial notice, and the tech firm will have to either remove it or ensure it is subject to a restricted access system.

If the content is placed behind a system that meets the minimum standards set out in the Restricted Access System (RAS) declaration, it does not have to be removed.

Digital Rights Watch program lead Samantha Floreani said careful and measured consideration of these complex issues is required.

“We want to see an evidence-based and harm reduction approach to these issues, not heavy-handed restrictions and removal of content based on a moral standing. The Online Safety Act is a clear example of what happens when regulation focuses too much on content over context,” Ms Floreani told InnovationAus.

“We hope that through these consultations, albeit rushed, the eSafety Commissioner will meaningfully engage with the community in order to develop policies which protect people, rather than morals.”

The discussion paper outlines how the RAS will be updated to expand the scope of services it applies to, including social media firms, email companies, instant messaging services and online games.

It will also apply to Australian hosting service providers. Submissions on the new RAS will close on 12 September.

The new age verification requirements will not be limited to material hosted or provided in Australia, but will cover “commercially produced and user-generated sexually explicit material” being accessed by Australians. It is also not limited to material that falls under the remit of the RAS.

The Commissioner will present an online age verification roadmap to the government next year.

“Inappropriate content like violent or extreme pornography that young children may encounter by accident can be distressing and even harmful, while for older children who may seek out this material, the risk is that it will give them unrealistic and potentially damaging ideas about what intimate relationships should look like,” eSafety Commissioner Julie Inman Grant said.

“Age verification, as overseas experience has shown, is a complex issue, so it is important that all sections of the community are able to be heard. We will take a considered, evidence-based approach that takes into account feedback from industry, stakeholders, experts and the public, to find workable solutions.”

Ms Floreani said the proposal to use digital identification and facial recognition for age verification is particularly troubling.

“We are concerned that the technological approaches to age verification will present an unreasonable invasion into people’s privacy, as well as possibly creating significant security risks,” she said.

“It was only a few years ago that the Department of Home Affairs suggested the use of facial recognition technology for age verification to access online pornography. The proposal to use such an invasive and error-riddled technology to regulate access to such sensitive content is alarming.”

Last week the eSafety Commissioner also opened consultations on the Basic Online Safety Expectations (BOSE) for large social media firms like Facebook, outlining core and additional expectations around the cyberbullying of children and cyber-abuse of adults.

Given concerns that the introduction and passage of the Online Safety Act were rushed, running several important consultations at the same time and for only a month is troubling, Ms Floreani said.

“We need well thought out, considered regulation and policy to deal with the rise in hate speech, online abuse and bullying. But without meaningful public consultation, the balance won’t be right,” she said.

“The result of the rushed consultation process is legislation that contains broad, ill-defined powers for the Commissioner, and lacks the kind of nuance we need to ensure we don’t end up creating harm while trying to prevent it.

“Running simultaneous public consultations on both the BOSE as well as the Restricted Access System, both due within two months, continues the trend of unreasonably fast turnaround times which makes it exceptionally challenging for concerned individuals and organisations to meaningfully participate.

“There is also no space given to engagement beyond written submissions, no roundtables or workshops with the community.”

