The government’s Online Safety Bill gives “excessive discretionary powers” to the eSafety Commissioner and has the potential to provide cover for law enforcement officials, and should be scrapped entirely, according to the Australian Lawyers Alliance.
The legislation, which is currently the subject of a whirlwind Senate inquiry after being introduced to Parliament last month, extends the eSafety Commissioner’s takedown scheme to Australian adults, allows the office to issue removal notices for content deemed to be rated R18+ or higher, and to then order the site or app be blocked if it does not comply.
The bill also introduces the Abhorrent Violent Material Blocking Scheme, which gives the commissioner the power to order ISPs to block domain names, URLs and IP addresses, and will allow the minister to set “basic online safety expectations” for tech companies.
A number of legal, civil and digital rights groups, tech companies and adult industry organisations have raised significant concerns about the legislation, including its potential to impact those working in adult industries, its potential to lead to online censorship, and the vast powers it hands to a handful of individuals.
Despite this, the legislation was introduced to Parliament just 10 days after the government received nearly 400 submissions on the draft bill, and the Senate committee is expected to deliver its report nine days after submissions closed.
Stakeholders were also given only three working days to make a submission to the inquiry.
In a submission to the inquiry, Australian Lawyers Alliance (ALA) president Graham Droppert said the government should not proceed with the legislation because it “invests excessive discretionary power in the eSafety Commissioner and also the Minister with respect to the consideration of community expectations and values in relation to online content”.
“The ALA considers that the bill does not strike the appropriate balance between protection against abhorrent material and due process for determining whether content comes within that classification,” Mr Droppert said.
There are also serious issues around handing the minister the power to set “basic online safety expectations” for a range of tech companies, Mr Droppert said.
“There is a significant risk that this will result in excessive proactive monitoring and removal of content that falls under Class 1 and 2 of the National Classification Code. It is a dangerous centralisation of power for such a broad discretion to be invested in one person, who would be determining community expectation,” he said.
This reliance on “outdated” classification codes is also troublesome, the ALA said, and the legislation should not proceed until they are reviewed and updated.
The abhorrent violent material blocking scheme, which gives the eSafety Commissioner the ability to order internet service providers to block domain names, URLs or IP addresses, could lead to important content being censored, the organisation said.
“The ALA submits that the scheme as currently provided for in the bill has the potential to provide cover and protection for law enforcement officials to use excessive force out of sight from those who might seek accountability,” Mr Droppert said.
“It is essential that this scheme not be used to hide any use of violence by the government, including by its law enforcement officials, and any abuses of human rights.”
A number of sex industry organisations made submissions to the inquiry raising concerns that the new powers would lead to the blocking of content of consenting adults.
“The existing classification scheme of online content fails to distinguish between harmful content and content of a sexual nature depicting consenting adults. Any expansion of the power of the eSafety Commissioner to take down content that is not harmful constitutes an erosion of freedom of expression,” said Sex Work Law Reform Victoria president Lisa Dallimore.
“It is concerning that ill-conceived clauses [in the bill] could set back hard-fought gains in sex work regulation which have largely created a safer working environment for sex workers.”
Digital Rights Watch has been leading the charge against the legislation, with detailed submissions and calls to action.
The powers to be handed to the eSafety Commissioner, an office established in 2015 to focus on keeping children safe online, are a continuation of its broadly expanding remit and should be cause for concern, Digital Rights Watch programme director Lucie Krahulcova said.
“The new powers in the bill are discretionary and open-ended, giving all the power and none of the accountability to the eSafety Office. They are not liable for any damage their decisions may cause and not required to report thoroughly on how and why they make removal decisions. This is a dramatic departure from democratic standards globally,” Ms Krahulcova told InnovationAus.
“There seems to be a lot of political willingness to trust the eSafety Commissioner to act in good faith and stick to the intention of the legislation rather than explicitly define and limit the powers in the bill,” she said.
“This is a naive trust in the cult of personality and risks that under any administration these powers will be misinterpreted and used to bully and marginalise individuals and movements.”
The bill “introduces provisions for powers that are likely to undermine digital rights and exacerbate harm for vulnerable groups”, Digital Rights Watch said in the submission.
The inclusion of class 1 content – that which would be refused classification – and class 2 content – that deemed to be X18+ or R18+ – means the blocking scheme would cover all sexual content, whether it is violent or not.
“The scheme is likely to cause significant harm to those who work in the sex industry, including sex workers, pornography creators, online sex-positive educators and activists,” Digital Rights Watch said.
The abhorrent violent material blocking scheme included in the legislation is “overly simplistic and overlooks complex underlying issues”, with Digital Rights Watch raising similar concerns that it would lead to the censorship of content that is in the public interest.
“In some circumstances, violence captured and shared online can be of vital importance to hold those in power accountable, to shine the light on otherwise hidden human rights violations, and be the catalyst for social change,” the organisation said.
“It is essential that this scheme not be used to hide state use of violence and abuses of human rights.”
The minister’s ability to outline “basic online safety expectations” for social media firms, “relevant electronic services” and designated internet services is too broad and could lead to troublesome monitoring of content, the organisation said.
“When drafted so broadly, these expectations incentivise proactive monitoring and removal of content that falls under Class 1 or 2. Given the immense scale of online content, tech companies generally turn to automated processes to determine which content is or isn’t harmful, despite evidence that content moderation algorithms are not consistent in identifying content correctly,” Digital Rights Watch said.
Digital Rights Watch has called for amendments to the legislation, including for the introduction of a sunset clause so the powers are reviewed before they are renewed, a multistakeholder oversight board to review takedown decisions, better transparency over the categories of takedowns and complaints, and a meaningful appeals process.
The legislation will establish a “world-first cyberabuse take-down scheme” for Australian adults, communications minister Paul Fletcher said in Parliament last month.