
Reset: YouTube ban on anti-vax content is too little, too late

Reset Australia

YouTube’s decision to ban anti-vax content will do little to tackle the systemic problem of misinformation, says Reset Australia as it calls for public oversight and algorithmic audits to uncover how Big Tech profits from amplifying false and misleading content.

“Content moderation is a giant game of whack-a-mole – ultimately it’s futile because there will always be new content popping up where you’re not looking,” said Chris Cooper, executive director of Reset Australia, the Australian arm of the global initiative working to counter digital threats to society.

“If YouTube is serious about tackling misinformation it needs to be transparent about how its algorithms are amplifying this content to viewers.”

YouTube has consistently been opaque about the extent to which its recommendation algorithms lead people to misinformation, and about the true scope of anti-vaccine conspiracies on its platform.

“Big Tech’s timid attempts at self-regulation, like labelling posts as fake or de-platforming individual spreaders, have as much impact as an oil company planting a thousand trees to counter climate change. The problem with all these downstream interventions is that they don’t tackle the core systemic issue of unchecked algorithms.”

Recommendation algorithms are used across Big Tech platforms to keep users online for longer so they can be served more advertisements. These algorithms prioritise the most engaging content, but research increasingly shows that the most engaging content also tends to be the most emotive, conspiratorial, and enraging.

“YouTube’s algorithms prioritise content for its engagement value, rather than its accuracy.

“So while social media didn’t invent conspiracy theories, its unchecked algorithms have supercharged them into global movements.

“Even in a country like Australia, with a 95% childhood immunisation rate, COVID-19 vaccine misinformation has led to a high degree of hesitancy.

“Much of this anti-vax content can be found on YouTube, despite YouTube’s policy stating that it doesn’t allow content about COVID-19 that contradicts health authorities. Many viewers have likely had conspiracies recommended to them by an algorithm, rather than seeking them out themselves.”

Mr Cooper said blanket bans on content also raised questions about who controls information online.

“The other question is how comfortable we are with a private international organisation making hugely influential decisions about what information is allowed and what isn’t. Without any public oversight, consumers have no real power to engage meaningfully in content moderation efforts.”
