Australia not equipped to handle deepfake electoral threat

Deepfakes present a growing threat to the integrity of Australia’s electoral system, according to the head of the electoral commission, who has warned that the agency is not equipped to tackle AI-generated misinformation.

Appearing before a Senate inquiry into AI on Monday, electoral commissioner Tom Rogers said “significant and widespread examples of deceptive AI content” had already emerged in overseas elections in 2024, including Indonesia, India and South Korea.

Security advice from Microsoft and other sources also suggests that some “nation states are adding AI to their toolbox of tricks to attempt to confuse electors in specific countries” as the super election year continues, he said.

But despite the rise of deepfakes, Mr Rogers said his agency’s “legislative toolkit is very constrained” to deal with AI-generated content in Australia, and that this is unlikely to change before next year’s federal election.

A person casts a vote into the ballot box during elections

“The Australian Electoral Commission does not possess the legislative tools, or internal technical capability, to deter, detect or adequately deal with false AI generated content concerning the election process,” he said.

Under the existing legislation, the use of AI to misrepresent policy or political figures in political advertising is not unlawful under the Commonwealth Electoral Act as long as it is authorised by a political party, he said.

According to the AEC’s submission to the inquiry, AI-generated electoral content could be an offence under section 329, but without any watermarking requirements, the agency has to rely on others to identify problematic content.

“Of course I’d prefer no misleading information, but under the legislation, if it’s authorised, it’s currently lawful and that would be a matter for Parliament to change,” the commissioner told the inquiry.

Mr Rogers said the AEC’s worry centres on “deliberate misinformation generated by AI”, not content that has been generated by political parties using AI in the lead-up to elections.

“What we’re concerned about is AI that misleads citizens about the act of voting. Where to vote, when to vote, how to cast a formal vote, the fact that that voting process is secure. That’s something that we’re very worried about,” he said.

“The other things, the truth of political statements, I think, either needs to be lodged somewhere else – may I say, anywhere else – or there must be other tools to combat it.”

For that reason, Mr Rogers rejected suggestions that a “blanket ban” on all AI-related activity at elections was necessary, saying it would be “very hard to enforce” and “very impractical”.

But “changes, particularly potentially some legislative changes” could deal with the threats presented by AI, he said, including the mandatory watermarking of electoral content that is created using such tools.

In South Korea, the government passed laws in the lead-up to its election earlier this year to criminalise the use of AI-generated deepfakes in political campaign videos. Despite the ban, as many as 388 pieces of content were still published.

The government could also introduce “voluntary codes of conduct” for candidates and political parties to “… be lawful during election campaigns”, as India and Canada have done.

Other options on the table include “voluntary and mandatory Codes of Practice for technology companies”, with Mr Rogers noting that social media companies have become more reluctant to remove content since the 2022 election.

“The landscape for social media companies has changed dramatically over the last couple of years since the 2022 election. By and large, there’s been a movement within the social media organisations that moderation occurs in the public square.”

“We’ve noticed less willingness from the social media companies to remove content that we think, in some cases, may breach their own standards or … is harmful to our staff.

“We’ve had issues where there’s been threats online that previously would have been removed; they’re less likely to remove those. That’s been a development over the last two years.”

Mr Rogers said the AEC is currently examining the issues and is planning to provide advice to government, although legislation is “unlikely” to be in place before the next election.
