
Instead of banning kids from online spaces, here’s what we should offer them

The Australian government has moved to ban children under 16 from social media. For overwhelmed parents in a digital age, this may seem like welcome relief.

Author


  • Amanda Third

    Co-Director, Young and Resilient Research Centre/Professorial Research Fellow, Institute for Culture and Society, Western Sydney University

But evidence shows a ban is highly unlikely to work in this country. Indeed, bans can do more harm than good online.

For children and young people, online spaces are one of the few avenues they have to interact freely with each other, which is crucial for their wellbeing.

A social media ban will close down this avenue and force children into lower-quality online environments. Children already say adults don’t understand what they do online.

A blanket ban affirms parents “don’t get it”. Kids will find ways to get around the ban. And if their interactions turn sour on social media, the fact they were not supposed to be there will make it more difficult to reach out to adults for help.

Crucially, demands for blanket bans, which are challenging to implement, also force tech platforms into “compliance mode”. They divert company resources away from designing better online environments for children and young people.

What should we do instead of a ban?

Our children’s online safety is a shared responsibility. There are constructive steps we can take, but they require greater cooperation between governments, industry, the community sector, parents, caregivers, educators, researchers, and children and young people themselves.

All children learn by taking risks and making mistakes. Rather than trying to eliminate risk altogether, the focus needs to be on eliminating online harms and on equipping children and their caregivers to deal confidently with the digital world.

Tighter regulation is part of the solution. But making the internet a better place for children – not just banning them – is the very best protection we can provide.

So, what would that look like?

One way is to implement “safety by design”. Popularised internationally by the Australian eSafety Commissioner, safety by design is exactly what it sounds like: baking safety features into the DNA of technological products and platforms.

Here, we should listen to what children themselves are asking for. They are urging platforms and governments to do several things:

  • give minors privacy by default
  • provide standardised, easily accessible and well-explained reporting processes across diverse platforms
  • use AI to detect bad actors attempting to interact with children.

Children also want to know what data is collected from them, how it is used, by whom, and for what purposes.

They’re also calling for safety-by-design features that eliminate sexual, violent and other age-inappropriate content from their feeds.

All of these steps would help children learn to take care of themselves and others online, like being cautious when interacting with people they don’t know, and not sharing personal information or images.
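To make “privacy by default” a little more concrete, here is a minimal, hypothetical sketch in Python. The AccountDefaults structure, the setting names and the age threshold logic are invented for illustration only; they do not correspond to any real platform’s systems.

```python
from dataclasses import dataclass

# Hypothetical illustration of "privacy by default" for minors.
# None of these names correspond to a real platform's API.

MINOR_AGE_THRESHOLD = 16  # age used in the proposed Australian ban


@dataclass
class AccountDefaults:
    profile_visibility: str       # "private" or "public"
    allow_messages_from: str      # "friends_only" or "everyone"
    searchable_by_strangers: bool
    personalised_ads: bool


def default_settings(age: int) -> AccountDefaults:
    """Return the most protective defaults for users under the age threshold."""
    if age < MINOR_AGE_THRESHOLD:
        return AccountDefaults(
            profile_visibility="private",
            allow_messages_from="friends_only",
            searchable_by_strangers=False,
            personalised_ads=False,
        )
    return AccountDefaults(
        profile_visibility="public",
        allow_messages_from="everyone",
        searchable_by_strangers=True,
        personalised_ads=True,
    )


if __name__ == "__main__":
    # A 14-year-old's account starts from the most protective settings.
    print(default_settings(14))
```

The point the sketch illustrates is that, under safety by design, the protective settings are the starting point for minors rather than options buried in a settings menu that children must find and switch on themselves.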

Designing optimal online spaces for children in various age groups is more constructive than a ban.

Not just safe, but optimal

Safety by design is not the whole solution. Building on existing efforts to develop safety standards, industry and government should come together to develop a wider range of standards that deliver not just safe, but optimal digital environments for children.

How? High-quality, child-centred evidence can help major platforms develop industry-wide standards that define what kinds of content are appropriate for children of different ages.

We also need targeted education for children that builds their digital skills and prepares them to deal with, and grow through, their engagement online.

For example, rather than education that focuses on extreme harms, children are calling for online safety education in schools and elsewhere that supports them to manage the low-level, everyday risks of harm they encounter online: disagreements with friends, inappropriate content or feeling excluded.

Heed the evidence

Some authoritative, child-centred evidence already exists. It tells us how to ensure children can mitigate potential harms and maximise the benefits of the digital environment.

Where the evidence doesn’t yet exist, we need to invest in child-centred research. It’s the best method for gaining nuanced accounts of children’s digital practices, and it can guide a coherent and strategic policy response.

At the same time, we need to better align evidence with decision-making processes. This means speeding up high-quality, robust research, or finding ways for research to better anticipate and generate evidence around emerging challenges. This way, governments can weigh up the benefits and drawbacks of particular policy actions.

Technology is not beyond our control. Rather, we need to decide, together, what role we want technology to play in childhood.

We need to move beyond a protectionist focus and build the very best digital environments we can imagine. Nothing short of our children’s futures is at stake.


Courtesy of The Conversation.