Tech giants need to “step up” and do more to combat the “significant disconnect” between the expectations of the community and what they actually deliver, Communications Minister Paul Fletcher has told the National Press Club.
Mr Fletcher heaped more pressure on Facebook and other social media companies to do more to stop the spread of harmful material.
The minister opened a 10-week consultation period on a new Online Safety Act, with proposed changes that include reducing the amount of time platforms have to take down “harmful content” to one day, a new end user take-down and civil penalty regime, and the extension of cyberbullying schemes to gaming message services.
The new Act follows a number of moves from the Coalition aimed at forcing global tech companies to do more to remove harmful content, including controversial “abhorrent violent material” legislation and a new site-blocking regime.
In the speech, Mr Fletcher said the tech companies aren’t doing enough to protect Australians online.
“There continues to be a significant disconnect between the expectations of Australians and what is delivered by the internet industry today,” Mr Fletcher said.
“A key manifestation of that disconnect is that many of today’s most popular digital products and services have not been designed with user safety in mind.”
“That needs to change. Harmful material must be taken down faster. Attempts to send terrorist attacks viral must be stopped in their tracks. Industry needs to step up and take more responsibility. We are putting the pressure on and keeping the pressure on.”
The new Online Safety Act consolidates existing online safety regulatory requirements and introduces new laws and protections, he said.
Among the proposed changes is a reduction in the time social media companies have to take down “harmful content” under the existing cyberbullying and image-based abuse schemes, from 48 hours to 24 hours.
The government also wants to extend the application of the current cyberbullying scheme for children to cover other services, such as gaming chats, messaging apps and other social media-connected services.
The Act also proposes the introduction of a new cyber abuse scheme for adults which would facilitate the removal of “serious online abuse and harassment” through a new end-user take-down and civil penalty regime.
The government also plans to work with search engines and app stores to de-list offending content.
The eSafety Commissioner would be handed further responsibilities under the proposed legislation, including new powers to compel platforms to report transparently on these issues.
Mr Fletcher also unveiled a new Online Safety Charter, which sets out the government’s expectations of these tech giants based on the principle that behaviour that is unacceptable in the physical world is also unacceptable online.
The Charter states that tech companies have a responsibility to “mitigate and address” any adverse impacts directly or indirectly associated with their products and services.
Mr Fletcher said the message to global tech companies is simple: they have the opportunity to “step up and meet Australia’s expectations when it comes to preventing online harms”.
The Coalition has attempted to pile pressure on these social media companies this year to do more to combat the spread of harmful and violent content. The efforts began following the Christchurch terrorist attack, part of which was livestreamed on Facebook.
In the aftermath of the terrorist attack, the government announced new legislation which introduced new fines and potential jail sentences for executives of tech companies that don’t take down “abhorrent violent material” quickly enough.
The legislation was quickly passed by Parliament in the lead-up to the May election, despite Labor saying it had “serious concerns” with the powers. There have been concerns that the new powers will have unintended consequences for the local tech sector, and won’t actually be effective in removing harmful content.
The government also launched new site-blocking regulations allowing telcos to block websites hosting “harmful and extreme content” during and after a “crisis event”.
These new rules were recently put into effect, with the eSafety Commissioner ordering that eight websites be blocked because they were hosting terrorist-related content.