The former Facebook worker who blew the whistle on the company’s disregard for user safety last year is “deeply sceptical” of Australia’s co-regulation approach to online platforms, advocating instead for a model independent of tech giants’ input that forces transparency.
American data scientist Frances Haugen disclosed tens of thousands of Facebook’s internal documents to US regulators last year, alleging the company was putting profits and growth ahead of user safety.
On Thursday she appeared before the Parliament’s current online safety inquiry, detailing the company’s inner workings and warning against Australia’s co-regulation model, saying it would do little to curb the amplification of harmful content.
“If we allow Facebook to write its own regulations, if we allow them to operate in the dark, they will continue to mislead us and under invest in the most basic safety systems,” Ms Haugen told the inquiry.
Australia has recently introduced several reforms to promote online safety, with many relying on industry codes and minimum standards to remove misinformation and other harmful content, while the online regulator focuses on content takedowns.
The biggest platform companies have told the same inquiry they are making progress on content removal and user safety, pointing to their own reports and automated tools, and plans to give users more control over newsfeed rankings.
But the self-reporting is routinely and deliberately misleading, Ms Haugen said, warning companies will not make significant progress until they are forced to disclose how they amplify content and protect users, including making the data available to independent researchers.
“Platforms cannot be trusted to act in the public interest. They are often — as my revelations showed — fully aware of the harms caused by their products and services, and yet choose to ignore these in favour of growth and profit,” Ms Haugen said.
She told the inquiry she had serious doubts about the impact of Australia’s recent regulatory interventions, which focus on content takedowns, unmasking anonymous users, and minimum safety expectations, many of which companies like Facebook helped write.
“I am deeply, deeply sceptical that you will get anywhere close to what you need if you allow them to write the rules. And the reason for that is just pure power imbalance,” Ms Haugen said.
“Facebook knows that the people who best understand these systems are inside the company and that [Facebook] are incredibly good at selectively disclosing information to mislead people…they don’t want you to see what the actual patterns are.”
According to Ms Haugen, Facebook’s algorithms favour “extreme” posts that elicit reactions from other users, with little regard for the content or the potential harms of spreading it. The company can effectively turn down this amplification using a “break glass” measure, and has done so during periods of high tension such as high-profile court decisions.
The reason the social media giant does not turn this down more often, or permanently, is that doing so would increase the rate at which harmless content is mistakenly caught, opening it up to allegations of censorship and hurting its business model of keeping people engaged and online longer to sell more ads.
“In aggregate, you catch tons more hate speech or tons more graphic violence — lots and lots more. But you have a higher error rate. I really believe that Facebook should disclose all the AI systems that it uses to control our content and it should have to release random samples,” Ms Haugen said.
Ms Haugen appeared as Facebook released disappointing financial results in the US, sending shares down 20 per cent and wiping hundreds of billions of dollars off its market value, and as mining magnate Andrew ‘Twiggy’ Forrest launched criminal action against the company in Australia.
In Q4 results released overnight, Facebook reported disappointing earnings, gave weak guidance, and said user growth has stagnated.
The tech giant is also gearing up for a fight with Mr Forrest, who revealed on Thursday he has launched criminal proceedings against Facebook, alleging it breached Australia’s money laundering laws by failing to prevent false cryptocurrency advertisements.
Mr Forrest said Facebook had been “criminally reckless” in failing to take down the ads featuring his image, which he first flagged in 2019, and said he was also acting for “everyday Australians” to protect their savings from being “swindled away by scammers”.
“I’m concerned about innocent Australians being scammed through clickbait advertising on social media,” Mr Forrest said.
“I’m committed to ensuring that social media operators don’t allow their sites to be used by criminal syndicates.”
Ms Haugen first warned Australian MPs of the risk of unchecked platforms in October after blowing the whistle on Facebook for putting “profits before people”. Her appearance then was supported by digital rights group Reset Australia.
Reset Australia continues to call for better oversight of the companies, telling the current inquiry the platforms “have wreaked havoc on our public square, leaving people facing a myriad of risks from algorithmic bias to harmful content”.
The group told the inquiry Australia needs to move beyond the “whack-a-mole” approach of content moderation to regulation which addresses the systemic risks of platforms by introducing duties of care, risk assessment requirements, and liabilities for harm resulting from the algorithmically promoted content.