Digital rights groups have blasted Facebook’s latest commitment to clean up the federal election as a “bandaid” measure framed in a way to avoid more scrutiny of the company’s fundamental misinformation issues.
Facebook parent company Meta on Tuesday announced plans to increase third-party fact checkers and public awareness as part of its most “comprehensive” Australian election protections.
The commitments include a new third-party fact-checking partnership with RMIT, with the Melbourne university to join Agence France-Presse and the Australian Associated Press in reviewing and rating the accuracy of content.
Content determined to be false will not necessarily be removed but Facebook said it will “significantly reduce its distribution” and alert users that share it.
The social media giant will also partner on awareness campaigns to help users spot fake news and understand the dangers of sharing it. Meta also promised real time data on election communication and continued transparency on political advertising campaigns.
The scandal-plagued company is attempting to get on the front foot ahead of election campaigns likely to see unprecedented online activity, according to Reset Australia tech policy director Dhakshayini Sooriyakumaran.
She said Meta is framing the misinformation debate around fact checking and user awareness, without acknowledging the underlying business model and opaque algorithms that amplify dangerous content.
“They’ve really managed the narrative well, but these measures are wholly inadequate in terms of actually protecting Australian voters for the upcoming election,” Ms Sooriyakumaran told InnovationAus.
Fact checkers are critical, she said, but only part of the much bigger response needed to combat misinformation and disinformation online.
“Once the content is served to Australian voters and once it’s gone viral, then you’ve really got limited impact in terms of reducing the harm. So preventative measures are really crucial,” she said.
“And [Meta] haven’t really laid out a lot in terms of what the platform itself is doing to prevent these harms.”
Analysis by Reset Australia’s global affiliate organisations found Facebook’s most effective response to misinformation and disinformation was when the company adapted its algorithms to reduce the distribution of sensational material while prioritising authoritative content.
Known as one of the company’s “break glass” measures, it was deployed during post-2020 US election unrest amid high social tension and erroneous claims Joe Biden had “stolen” the presidency.
So far, that idea hasn’t been publicly floated in Australia, a symptom of the small market not warranting the same investment as the US, according to a Facebook whistleblower.
“There’s this kind of blurring between what’s going on globally and what’s going on in Australia and [Meta is] trying to erase or evade this issue of investment in the Australian context,” Ms Sooriyakumaran said.
The Australia Institute thinktank is also sceptical of the plan, and agrees it does not address the root causes of Facebook’s content problems.
“While investing in fact checking and extra moderation capabilities is a good thing, you can’t really fact check your way out of a toxic business model,” Australia Institute Centre for Responsible Technology research fellow Jordan Guiao told InnovationAus.com.
“If your business model prizes outrage and provocative content, then the lies and disinformation will always surface over the more neutral, fact based journalistic content.”
Mr Guiao said the latest election commitment was another “reactive” initiative that will have almost no material impact on stemming misinformation and disinformation during the election campaign.
“[Meta] create bandaid solutions, and it’s never enough because the central issue is their platform,” he said.
Mr Guiao said more effective solutions would include “regulation parity” for online platforms, in which the same election blackout periods are applied and election regulators have more oversight, along with truth in advertising laws.
“It should be mandatory to have truthful and accurate content in political ads, particularly during election times, nationwide,” Mr Guiao said.
Ms Sooriyakumaran also called for better regulation of platforms, particularly “upstream” to address algorithms and user safety measures, as is being introduced in Europe with the Digital Services Act.
“I think the Digital Services Act where they’re looking to actually regulate algorithms, have data sharing mandates to independent regulators and researchers and civil society, [and] actually have real penalties for non-compliance — that is the ultimate kind of solution to these challenges,” she said.
“Because we know that platforms can’t self-regulate.”