Australia’s eSafety Commissioner has ordered social media platform “X” (formerly known as Twitter) to remove graphic videos of the stabbing of Bishop Mar Mari Emmanuel in Sydney last week. The incident was captured on the church’s livestreamed mass service.
In response to this order, X’s owner, Elon Musk, called the commissioner the “Australian censorship commissar”.
X agreed to part of the take-down. However, it did not agree to removing the material entirely, telling media outlets: “X believes that eSafety’s order was not within the scope of Australian law and we complied with the directive pending a legal challenge.”
So what are the laws around this, especially given the church incident was quickly labelled a terrorist incident by authorities? What powers do governments have in this situation?
Prompt political fallout
The response from politicians has been swift. Labor minister Tanya Plibersek described Musk as an “egotistical billionaire”.
Senior Liberal Simon Birmingham said:
They absolutely should be able to quickly and effectively remove content that’s damaging and devastating to the social harmony and fabric of society, particularly images such as terrorist attacks.
Other Labor ministers have described X as “a playground for criminals and cranks” or accused the company of thinking it is above the law.
Of course, such damning remarks directed at a much-maligned website and its equally controversial owner are to be expected. What politicians can do about it is another matter.
What do federal laws say?
The eSafety Commissioner, Julie Inman-Grant, has the power to require the take-down of material under the Online Safety Act. The power she exercised under Part 9 of that act was to issue a “removal notice”, which requires a social media platform to take down material that would be refused classification under the Classification Act.
The video was circulating online as the New South Wales Commissioner of Police, Karen Webb, announced the attack was a terrorist incident and the alleged perpetrator would be charged with a terrorism offence.
While these are the laws being applied in the case against X, other laws can also come into play.
Australia also has a voluntary code of practice relating to misinformation and disinformation. This is administered by the industry group DIGI. The signatories to this code include Adobe, Apple, Facebook, Google, Microsoft, Redbubble, TikTok and Twitch.
X had previously adopted the code, but its failure to comply led to DIGI withdrawing its signatory status in November 2023.
The government released a draft of a proposed law to combat misinformation and disinformation. The Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill would give the Australian Communications and Media Authority the power to enforce an industry code, or to make one if the industry could not. It is a variation of this bill, reflecting the substantial range of views on the draft, that is now under consideration.
Would this new law make any difference in this case?
The immediate answer is no. The eSafety Commissioner already has extensive powers. She used only one of those powers in this case, but there are alternative courses of action.
What else could be done?
Perhaps the gruesome images in the Wakeley videos might remind some of the Christchurch massacre.
In that attack, Telstra, Optus and Vodafone (now part of TPG) blocked access to sites such as 4chan, which were disseminating video of the attack. This was without any prompting from either the eSafety Commissioner or law enforcement agencies.
The eSafety Commissioner has the power to require telcos to block access. She would need to be satisfied that the material depicts abhorrent violent conduct, and that the availability of the material online is likely to cause significant harm to the Australian community.
This means the commissioner could give a blocking notice to telcos which would have to block X for as long as the abhorrent material is available on the X platform.
Separately, the telcos have an obligation to do their best “to prevent telecommunications networks and facilities from being used in, or in relation to, the commission of offences against the laws of the Commonwealth or of the States and Territories” under the Telecommunications Act. This requires there to be an offence.
It is possible that sharing the video material could be seen as an act done in preparation for, or planning, a terrorist act, given the video depicts an incident police had declared an act of terror. This would breach the terrorism prohibitions in the federal Criminal Code.
All this is to say that while Musk may be unhappy with the eSafety Commissioner’s actions, they are just the tip of the iceberg of laws that could force his site to remove terrorist content.