
Industry standards to tackle worst-of-the-worst online content a key step closer

eSafety Commissioner

Today, two industry standards developed by the eSafety Commissioner will be registered, requiring a range of online services to do more to tackle child sexual abuse and pro-terror material on their platforms.

The standards cover Designated Internet Services (DIS), like apps, websites, file and photo storage services, and some services which deploy or distribute generative AI models; and Relevant Electronic Services (RES), including dating sites, online games and messaging services.

The standards include requirements for some generative AI services that have not incorporated controls to reduce the risk of generating high-impact material such as synthetic child sexual exploitation material.

This may include some apps that generate pornography or ‘nudify’ images without effective controls to prevent their application to children.

The standards will come into force six months after registration, once a 15-day parliamentary disallowance period has expired.

eSafety Commissioner Julie Inman Grant said the registration of the standards was an important step in providing greater protection for all Australians from the worst-of-the-worst online content.

“These standards will be enforceable and require industry to take meaningful steps to prevent their platforms and services from being used to solicit, generate, store and distribute the most abhorrent and harmful online material imaginable, including child sexual abuse.

“We know cloud-based file and photo storage services, as well as many messaging services, serve as a free haven for paedophiles to host, store and distribute child sexual exploitation material, and companies who own and operate them must take responsibility for this misuse and act to disrupt and deter it.

“That’s what these two standards and the six other codes already in operation covering social media, search engines, app stores, internet service providers, device manufacturers and hosting services are designed to force them to do.”

eSafety moved to develop standards after the DIS and RES codes submitted by industry in March 2023 were refused registration for failing to provide appropriate community safeguards in relation to this highest-harm class 1 content.

In November 2023, eSafety invited submissions from industry, other stakeholders and the public on draft standards. eSafety has carefully considered all the feedback in finalising the standards.

This includes giving greater clarity to operators of end-to-end encrypted services, by expressly stating in the standards that they are not required to break or weaken encryption.

eSafety also recognises these standards will apply to broad industry categories covering a range of services, and that they will require differing approaches to detecting and removing illegal content such as child sexual abuse material.

To that end, no specific technologies or methods are prescribed in the standards. The standards also include carefully calibrated exceptions where measures are not technically feasible or reasonably practicable.

Where exceptions are applied, the standards still require providers to take appropriate alternative action. eSafety can require service providers to provide information about cases where these exceptions are relied upon and describe and justify the alternative action taken.

“We understand different services may require different interventions but the clear message of these standards is that no company or service can simply absolve themselves from responsibility for clear and tangible actions in combatting child sexual abuse and pro-terror material on their services,” Ms Inman Grant said.

eSafety will develop and release regulatory guidance in the coming months to support compliance. In the event of non-compliance with a standard, eSafety can issue a formal warning, issue an infringement notice, accept an enforceable undertaking, or seek an injunction or civil penalty in court.

The two industry standards registered today, and the six other industry codes already in effect, can be found on the eSafety website.

Online service providers should also be implementing the Basic Online Safety Expectations, which were recently amended to better address new and emerging online safety issues.

In addition, a new voluntary industry code for online dating services with a focus on user safety is expected to be released soon.

Alongside these developments, the Government has brought forward the independent statutory review of the Online Safety Act, led by Ms Delia Rickard PSM.

The review will consider whether any changes are needed to make sure Australia’s online safety laws remain fit for purpose.

This work complements the Government’s broader work to ensure AI is developed and used safely and responsibly in Australia, as well as ongoing reform following the Privacy Act Review.
