An AI-driven immersive visualisation system will help us address the new generation of extreme fires caused by global warming.
An immersive visualisation platform that virtually recreates the experience of being in a wildfire will help artists, designers, firefighters and scientists better understand and communicate the dynamics of these extreme events.
iFire, based at the iCinema Research Centre at UNSW, is a sector-first artificially intelligent (AI) immersive environment that visualises the unpredictable behaviour of wildfires. It gives users and researchers a visceral understanding of the dynamics of wildfires, at 1:1 scale and in real time, within a safe virtual environment.
The five-year project is funded by the ARC Laureate Fellowship of lead researcher Professor Del Favero, Director of the iCinema Research Centre at UNSW. The philosopher-turned-artist uses artistic simulation to sensorially explore diverse risk-laden scenarios, directly addressing issues like global warming in visceral and compelling ways.
“Wildfires are a whole new generation of fires,” says Prof. Del Favero. “We’re experiencing accelerating levels of global warming which are leading to fires of a scale, speed and violence never before seen in recorded human history.”
“[iFire] uses real-world data to visualise not only what [wildfires] look like, but also what they feel [and sound] like. Sound is very important … [because] wildfires have a particular acoustic that is entirely unique.”
The project, like the Centre, is interdisciplinary in approach, working across art, design, computing and science. It brings together global experts in fire research, including computer and fire scientists at UNSW such as Professor Maurice Pagnucco and Professor Jason Sharples, along with Data61, the University of Melbourne, San José State University and more than 15 international industry and government partners, including the Australasian Fire and Emergency Service Authorities Council, Fire and Rescue NSW, CAL FIRE, the Pau Costa Foundation and the ARC Centre of Excellence for Climate Extremes.
Arming fire-vulnerable areas
Unlike traditional bushfires, which move relatively predictably, wildfires are fundamentally unpredictable. They can form their own weather systems, generating lightning storms that can ignite new fires; this, combined with their size and speed, makes their behaviour difficult to anticipate, Prof. Del Favero says.
“Situational awareness is critical in a wildfire… It’s a bit like being in a combat zone. You don’t know where the dangers are. They can surround you and be above you,” he says. “So, we’re developing a way of visualising this type of dynamic by using artificial intelligence to drive the visualisation so that the fire behaves unpredictably according to its own logic, not according to our expectations.”
The platform will provide a tool for two distinct groups of users.
For fire scientists, firefighters and fire organisations in Australia and internationally, it will facilitate research and training in the dynamics of wildfire scenarios, enabling open-ended decision-making and a more agile, collaborative approach to fire planning, group training and fire management.
It will also enable artists, curators and designers to imaginatively explore wildfire landscapes using a digital palette with a vast range of atmospheres, flora and topographies to enhance public engagement and understanding of these scenarios.
Users can share and explore the environment across multiple locations and platforms, from mobile 360-degree 3D cinemas and more portable 3D projection screens to 3D head-mounted displays, laptops and tablets.
In addition to its use in fire science and the arts, the iFire project will develop a geo-specific software application as part of its resource toolkit. The application can be downloaded in fire-vulnerable areas for use by fire researchers, first responders and the community.
“Local councils can apply it to their own geographic precinct to show people how wildfires could move into their community. It would become part of their portfolio of educational tools for fire preparedness,” Prof. Del Favero says.
The project will also develop a pipeline for sharing and integrating diverse data sets – of fire behaviours, management procedures and protocols, for example – collected by a range of agencies to facilitate research into wildfires, he says.
“It will set benchmarks for how we use this data to effectively visualise these events.”
AI a powerful research partner
Harnessing artificial intelligence is integral to understanding these data sets.
“AI optimises our ability to experience the dynamics of fire in the landscape,” Prof. Del Favero says. “It can help us process all this complex data more rapidly and in more insightful ways than what we [as humans] can do.
“And we really need help at the moment as extreme events such as wildfires are existential, beyond our imagination in terms of effect and difficult to model.”
The project will also explore wildfire landscapes through a range of creative applications for film, museums and the contemporary arts.
“[AI-driven immersive visualisations] allow you to imagine whole new creative worlds that you wouldn’t otherwise be able to simply with human cognition alone,” he says.
The iFire platform will also be developed for more niche, potentially commercial, industry needs. For example, Data61 will work with UNSW iCinema to create an immersive experience of its fire application, Spark, which models bushfire spread to help plan for and manage bushfires.
Visual technologies beneficial across disciplines
Prof. Del Favero says these kinds of advanced art and technology frameworks are applicable to a diverse set of needs.
The UNSW iCinema’s research spans interactive art scenarios, intelligent database systems, immersive design modelling and extreme event simulation, with previous projects ranging from interactive artworks to industrial safety training.
One of these, for example, delivered a suite of virtual reality simulations for China’s leading research and training institute for mine safety, the Shenyang Research Institute of the China Coal Technology & Engineering Group.
The project, later commercialised, created highly realistic simulations of an underground mine that allowed up to 30 trainees to interact simultaneously with hazard and technology scenarios. The immersive modules provided an effective alternative to training via lengthy manuals, training more than 30,000 miners and reducing fatalities and serious injuries in the mining industries of China and Australia.
Artistic technologies that provide life-like experiences can help us better understand and address the unpredictable and turbulent scenarios that characterise the terrestrial changes we are experiencing, Prof. Del Favero says.
“I’m very interested in creating virtual worlds to enhance the way we engage with the physical world around us,” he says.
“Creating simulated worlds is a way of collaborating with an artificially intelligent twin to form a new type of partnership that integrates the speed and scale of AI in establishing patterns and predicting behaviours with the subtlety and adaptability of human situational understanding and decision making.”