Microsoft acts on extremist content


Denham Sadler
National Affairs Editor

Microsoft has called for the tech sector to put aside rivalries and work together to better moderate and remove extremist content in the wake of the Christchurch terror attack.

In a blog post, Microsoft president Brad Smith said the tech sector needs to “learn and act”, and better cooperate in order to fix many of the issues that were on display during and after the attack in New Zealand, where 50 people were killed in Christchurch mosques by a right-wing extremist terrorist.

Mr Smith suggested that tech giants like Microsoft, Facebook, Google and Twitter should share the automated tools they use to remove extremist content, and work together from a “joint virtual command centre” during major incidents.

Mr Smith also said that while technology contributed to many of these issues, it could also be the driving force in solving them.

“Words alone are not enough. Across the tech sector, we need to do more. Especially for those of us who operate social networks or digital communications tools or platforms that were used to amplify the violence, it’s clear that we need to learn from and take new action based on what happened in Christchurch,” Mr Smith wrote.

“We all need to come together and move faster. This is the type of serious challenge that requires broad discussion and collaboration with people in governments and across civil society around the world. We hope this will become a moment that brings together leaders from across the tech sector.”

Microsoft has not been among the tech companies in the spotlight following the terrorist attack. The terrorist’s video was livestreamed to Facebook and widely reposted across that platform, along with YouTube, Twitter and other social media and video-hosting sites.

“It’s sometimes easy amidst controversy for those not in the hot seat to remain silent and on the sideline. But we believe this would be a mistake. Across the tech sector we can all contribute ideas, innovate together and help develop more effective approaches. The question is not just what technology did to exacerbate this problem, but what technology and tech companies can do to help solve it. Put in these terms, there is room – and a need – for everyone to help,” Mr Smith said.

Mr Smith said that Microsoft employees and the company’s technology tools worked quickly to stop the distribution of the video, but the company has identified considerable room for improvement. The tech giant will now accelerate the rollout of existing tools to identify and classify extremist violent content, and change the process that allows users to report such content.

But each tech company making similar changes independently will not be enough, he said, and the big firms need to work together to make a real change.

“This is an area in which companies across the tech sector need to learn, think, work and act together. Competition is obviously indispensable to a vibrant technology sector but when it comes to saving human lives and protecting human lives, we should act in a united way and enable every company large and small to move faster,” Mr Smith said.

“Ultimately we need to develop an industry-wide approach that will be principled, comprehensive and effective. The best way to pursue this is to take new and concrete steps quickly in ways that build upon what already exists.”

While the big tech companies already share a database of known terrorist content and have photo, video and text-matching tools to identify it, much more cooperation is needed, according to Microsoft.

This should involve the sharing of open-source technology, including artificial intelligence, that is used for identifying and deleting extremist content.
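As an illustration only, the matching step behind such a shared database can be thought of as a hash lookup at upload time. The sketch below is an assumption for readability rather than any company’s actual system: the function names and database are hypothetical, and a plain SHA-256 digest stands in for the perceptual hashing that production matching tools use to tolerate re-encoding and cropping.

```python
# Hypothetical sketch: checking an upload against a shared database of
# hashes of content already confirmed as extremist. Names and the use of
# SHA-256 are illustrative assumptions, not an actual industry API.
import hashlib

# Shared set of digests of known extremist content (placeholder values).
SHARED_HASH_DATABASE = {
    "3f2a9c0d0e1b...",  # truncated placeholder digest
}

def fingerprint(media_bytes: bytes) -> str:
    """Return a digest of the uploaded media.

    Real matching tools rely on perceptual hashes that survive re-encoding;
    SHA-256 is used here only to keep the sketch self-contained.
    """
    return hashlib.sha256(media_bytes).hexdigest()

def is_known_extremist_content(media_bytes: bytes) -> bool:
    """Check an upload against the shared database before it is published."""
    return fingerprint(media_bytes) in SHARED_HASH_DATABASE
```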

“These technologies can enable us more granularly to improve the ability to remove violent video content. We should pursue all these steps with a community spirit that will share our learning and technology across the industry through open source and other collaborative mechanisms. This is the only way for the tech sector as a whole to do what will be required to be more effective,” Mr Smith said.

Major tech companies should also have a “major event” protocol where they would work from a “joint virtual command centre” during an incident such as Christchurch.

“This would enable all of us to share information more quickly and directly, helping each platform and service to move more proactively, while simultaneously ensuring that we avoid restricting communications that are in the public interest, such as reporting from news organisations,” Mr Smith said.

There could also be a category of agreed “confirmed events” where the tech companies would “jointly institute additional processes to detect and prevent sharing of these types of extremist violent content”.

The onus is now on the big tech companies to work proactively to prevent the uploading and distribution of extremist content before a major incident occurs, Mr Smith said.

“The industry rightly will be judged not only by what it prevented, but by what it failed to stop. And from this perspective, there is clearly much more that needs to be done. As Prime Minister Jacinda Ardern noted last week, gone are the days when tech companies can think of their platforms akin to a postal service without regard to the responsibilities embraced by other content publishers,” he said.

“Even if the law in some countries gives digital platforms an exemption from decency requirements, the public rightly expects tech companies to apply a higher standard.”
