‘Substantial gaps’ in Meta’s Australian election policies: Reset Australia


Denham Sadler
National Affairs Editor

Meta’s public statements and efforts on election integrity in Australia have been “inadequate and incomplete”, and systemic regulation is needed, according to Reset Australia, after the US tech giant revealed it will not offer Australians all of the protections it deployed for the 2020 American presidential election.

Tech reform advocacy group Reset Australia sent an open letter to Facebook parent company Meta in early May, with 24 questions on federal election integrity. Four days later, Meta Australia head of public policy Josh Machin responded in a public blog post.

But this response still left many unanswered questions, Reset Australia’s Dhakshayini Sooriyakumaran said, and showed that the company is not taking this issue seriously enough.

“We appreciate that Meta took the time to respond to our letter, as the public deserves much more detail and depth than what was provided in their initial March 2022 blog post about their election preparation,” Ms Sooriyakumaran told InnovationAus.com.

“However, the substantial gaps in the responses reveal two things. Meta grossly underestimates how much the public are waking up to the harm of Big Tech’s business model and its inordinate influence over our democracy and our individual privacy.

“And only systemic regulation will force untrustworthy companies such as Meta to open the ‘black box’ and be fully transparent about their behaviour and impact.”


Reset Australia’s concerns centre on a lack of transparency around the measures and resources Meta has implemented to address election-related misinformation and disinformation in Australia, and on claims that Australia is not being prioritised to the same degree as countries such as the US.

There are not yet “adequate regulatory frameworks” in place to address the spread of election misinformation and hate speech on digital platforms such as Facebook, Reset Australia said in the letter.

“In the absence of mandatory transparency measures through binding regulation, untrustworthy companies such as Meta require their behaviour to be closely scrutinised – particularly during crucial moments such as the final week of the election campaign,” the Reset Australia letter said.

The organisation asked Meta how many human content moderators will be bolstering its AI-enabled system for moderating Australian election content. In response, the company said that more than 40,000 people work on safety and security at Meta, and 15,000 of these workers are content reviewers.

The company did not, however, reveal how many of these will be working on Australian content, or where they are located in the world.

“As issues potentially arise during the Australian election, we have the benefit of being able to draw from not just the cross-functional team dedicated to the Australian election but also from any members of our global safety and security team as needed,” the Meta letter said.

Meta also did not directly answer Reset Australia’s question about who the company consulted with over its content moderation policies.

“It is crucial for Meta to be transparent with the public regarding the expertise drawn upon, including that of communities most impacted by mis and disinformation and hate speech, to develop election-related policies,” the Reset Australia letter said.

Third-party fact checking has been put forward by Meta as one of its key efforts to combat election misinformation on Facebook.

But Reset Australia said the company is unable to provide data on the efficacy of this fact-checking or whether it is reaching the communities most at risk from this misinformation.

Meta said it is unable to provide information on how quickly pieces of misinformation are being fact-checked.

“It is not possible to give a timeframe around how long it takes a fact checker to verify content after it is posted on Facebook. This is because content is flagged to fact checkers in a variety of ways, and it is at the discretion of the independent fact checkers as to which pieces of content they review,” the Meta statement said.

But this is a crucial metric of success, Reset Australia said.

“In the modern election contest, speed of fact checking is an essential success measure, not only due to the 24 hour news cycle, the fast moving political agenda, but most of all due to the potential for mis and disinformation to spread virally on social media,” it said.

Reset Australia also raised concerns that Australia is not being prioritised by Meta in the same way that the US was during the 2020 election.

Meta applied a blackout on new political advertising in the US in the lead-up to the 2020 election, but this will not be implemented in Australia. Meta also placed restrictions on the distribution of live videos relating to election content in the US, but has not done so in Australia.

“While we learn lessons from each prior election, no two elections are the same. Working closely with election authorities and trusted partners in each country, and evaluating the specific risks ahead of each election, we make determinations about which defences are most appropriate,” Meta said.

“The limitation of the distribution of live videos was a ‘break glass’ measure deployed in response to reports of inaccurate claims about the election. The decision on whether Australia’s blackout period for electoral ads should be extended to digital platforms is a choice for policymakers. We have consistently said over many years we support extending this requirement to digital platforms.”

There have been several examples of misinformation and hate speech during the Australian election campaign that should have warranted these “break glass” measures, Ms Sooriyakumaran said.

“In response to questions about emergency measures Meta states that they have ‘not been advised’ that the ‘risk of real-world harm is high’. And yet, we have seen a proliferation of anti-trans hate speech and disinformation on platforms such as Facebook,” she said.

“We still have no information about what steps Meta has taken to limit the harms to trans and gender diverse people, particularly children and young people.”
