
The hidden cost of the AI boom: social and environmental exploitation

Mainstream conversations about artificial intelligence (AI) have been dominated by a few key concerns, such as whether superintelligent AI will pose an existential threat to humanity, or whether AI will steal our jobs. But we’ve paid less attention to the various other environmental and social impacts of our “consumption” of AI, which are arguably just as important.

Authors


  • Ascelin Gordon

    Senior research fellow, RMIT University


  • Afshin Jafari

    Research fellow, RMIT University


  • Carl Higgs

    Research Officer, Centre for Urban Research, RMIT University

Everything we consume has associated “externalities” – the indirect impacts of our consumption. For instance, pollution is a well-known externality that has a negative impact on people and the environment.

The online services we use daily also have externalities, but there seems to be far less public awareness of them. Given the massive uptake of AI, these impacts mustn’t be overlooked.

Environmental impacts of AI use

In 2019, French think tank The Shift Project estimated that the use of digital technologies produces more carbon emissions than the aviation industry. And although AI is currently estimated to contribute less than 1% of total carbon emissions, the AI market is predicted to grow rapidly over the coming decade.

Tools such as ChatGPT are built on advanced computational systems called large language models (LLMs). Although we access these models online, they are run and trained in physical data centres around the world that consume significant resources.

Last year, AI company Hugging Face published an estimate of the carbon footprint of its own LLM, called BLOOM (a model of similar complexity to OpenAI’s GPT-3).

Accounting for the impact of raw material extraction, manufacturing, training, deployment and end-of-life disposal, the model’s development and usage resulted in the equivalent of around 50 metric tonnes of carbon dioxide emissions.

Hugging Face also estimated GPT-3’s life cycle would result in ten times greater emissions, since the data centres powering it run on a more carbon-intensive grid. This is without considering the raw material, manufacturing and disposal impacts associated with GPT-3.

OpenAI’s latest LLM offering, GPT-4, is reported to be substantially larger than its predecessor, implying potentially far greater energy usage.

Beyond this, running AI models requires large amounts of water. Data centres use water towers to cool the on-site servers where AI models are trained and deployed. Google recently faced criticism over plans to build a new data centre in Uruguay that would use 7.6 million litres of water each day to cool its servers, according to the nation’s Ministry of Environment (although the Minister for Industry has contested the figures). Water is also needed to generate the electricity used to run data centres.

In a preprint published this year, Pengfei Li and colleagues presented a methodology for gauging the water footprint of AI models. They did this in response to a lack of transparency in how companies evaluate the water footprint of training and using AI.

They estimate training GPT-3 required somewhere between 210,000 and 700,000 litres of water (the equivalent of that used to produce between 300 and 1,000 cars). For a conversation with 20 to 50 questions, ChatGPT was estimated to “drink” the equivalent of a 500 millilitre bottle of water.
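The per-question figure implied by these estimates can be checked with simple arithmetic. The sketch below uses only the numbers reported above (500 ml per conversation of 20 to 50 questions); the variable names are our own:

```python
# Back-of-envelope estimate of ChatGPT's per-question water use,
# based on the figures reported by Li and colleagues.
BOTTLE_ML = 500                      # ~500 ml "drunk" per conversation
MIN_QUESTIONS, MAX_QUESTIONS = 20, 50  # reported conversation length range

# A longer conversation spreads the 500 ml over more questions,
# so the maximum question count gives the low per-question estimate.
low = BOTTLE_ML / MAX_QUESTIONS   # ml per question, long conversation
high = BOTTLE_ML / MIN_QUESTIONS  # ml per question, short conversation

print(f"Roughly {low:.0f}-{high:.0f} ml of water per question")
```

In other words, each individual question costs only around 10 to 25 millilitres, but multiplied across hundreds of millions of queries the totals become substantial.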

Social impacts of AI use

LLMs often need extensive human input during the training phase. This is typically outsourced to independent contractors in low-income countries, leading to “digital sweatshop” criticisms.

In January, Time reported on how Kenyan workers contracted to label text data for ChatGPT’s “toxicity” detection were paid less than US$2 per hour while being exposed to explicit and traumatic content.

LLMs can also be used to generate misinformation at scale. Left unchecked, AI has the potential to be used to manipulate public opinion, and by extension could undermine democratic processes. In a recent experiment, researchers at Stanford University found AI-generated messages were consistently persuasive to human readers on topical issues such as carbon taxes and banning assault weapons.

Not everyone will be able to adapt to the AI boom. The large-scale adoption of AI has the potential to worsen global inequality. It will not only cause significant disruption to labour markets, but could particularly marginalise workers from certain backgrounds and in developing regions.

Are there solutions?

The way AI impacts us over time will depend on myriad factors. Future generative AI models could be designed to use significantly less energy, but it’s hard to say whether such efficiency gains would simply be offset by even greater use.

When it comes to data centres, their location, the type of power generation they use, and the time of day they are used can significantly impact their overall energy and water consumption. Optimising these computing resources could result in significant reductions. Several major technology companies have championed the role their AI and cloud services can play in managing resource usage to achieve efficiency gains.

Also, as direct or indirect consumers of AI services, it’s important we’re all aware that every chatbot query and image generation results in water and energy use, and could have implications for human labour.

AI’s growing popularity might eventually trigger the development of certification schemes. These would help users understand and compare the impacts of specific AI services, allowing them to choose those which have been certified. This would be similar to the Climate Neutral Data Centre Pact, wherein European data centre operators have agreed to make data centres climate neutral by 2030.

Governments will also play a part. The European Parliament has approved draft legislation to mitigate the risks of AI usage. And earlier this year, the US Senate heard testimonies from a range of experts on how AI might be effectively regulated and its harms minimised. China has also released draft rules on the use of generative AI, requiring security assessments for products offering services to the public.

The Conversation

Ascelin Gordon is employed by RMIT University. He receives funding support from the Australian Research Council, the NSW Department of Planning and Environment, and the NSW Biodiversity Conservation Trust.

Afshin Jafari is employed by RMIT University.

Carl Higgs is employed at RMIT University and receives funding support from National Health and Medical Research Council grants.

Courtesy of The Conversation.