From late January, Australia’s online safety regulator – the eSafety Commissioner – will begin operating a reporting scheme for adult cyber abuse, part of the new Online Safety Act. This will give Australian adults who are victims of seriously harmful online abuse somewhere to turn if online service providers have failed to act on reports made to them.
Today, eSafety has released detailed regulatory guidance for this scheme, giving an overview of the actions the eSafety Commissioner can take to address serious online abuse reported to us.
“This is a world-first scheme. From the end of January we will be able to act as a safety net to give Australian adults who have been subjected to serious online abuse somewhere to turn if the online service providers have failed to act in removing the abusive content,” said eSafety Commissioner Julie Inman Grant.
Minister for Communications, Urban Infrastructure, Cities and the Arts, the Hon. Paul Fletcher MP said: “In 2015 we led the way by establishing the eSafety Commissioner with strong powers to respond to the cyberbullying of children, and now we’re expanding those powers to make sure all Australians can access support when things go wrong online.”
Commissioner Inman Grant explains how the scheme will work: “If a platform fails to take action, people can come to us to make a report. Our new investigative and information gathering powers will allow us to investigate and assess complaints, and decide what action we can take.
“This ground-breaking scheme gives us the ability to help those Australian adults who have been subject to the worst types of online abuse, which is becoming an all-too-common occurrence. If a report meets the threshold, we can issue a notice to the platform to get that harmful content removed.”
If eSafety issues a notice to remove the harmful content, the platform then has 24 hours to comply. The eSafety Commissioner will have the ability to seek significant civil penalties for failure to comply with a notice to remove abusive material.
“The bar for determining what ‘adult cyber abuse’ is has been set deliberately high, to ensure it does not stifle freedom of speech. We are talking here about the most serious of abusive posts, intended to cause serious psychological or physical harm,” said Ms Inman Grant.
Under the law, to reach the threshold the abuse must be both ‘intended to cause serious harm’ and ‘menacing, harassing or offensive in all the circumstances’.
“Serious harm could include material which sets out realistic threats, places people in real danger, is excessively malicious or is unrelenting.”
Somebody finding something offensive or disagreeable would not be enough; the content must also be intended to cause serious harm to that individual. The scheme is not intended to regulate hurt feelings, purely reputational damage, bad online reviews, strong opinions or banter.
“There may be some circumstances where we can’t take regulatory action. Every situation is unique and every matter reported to us will be considered on a case-by-case basis. Even if a matter does not meet the threshold, we will still be able to offer support, information and advice,” said the Commissioner.
“eSafety can only act on adult cyber abuse reports made to us. We will not be proactively policing the internet, nor will we be content moderators determining the truth of claims posted online. This new scheme is about helping those who are suffering from the worst types of online abuse.”
eSafety’s new scheme does not cover defamation. Defamation is a civil action, determined by the courts, designed to balance the right of freedom of speech with protecting a person’s reputation against harm. The Online Safety Act and the new Adult Cyber Abuse Scheme are about harm minimisation through the removal of serious and targeted online abuse; defamation laws are about compensation for damage caused to reputations.
The Online Safety Act and the Adult Cyber Abuse Scheme commence on Sunday 23 January 2022.