The secret environmental cost of using AI

There has been plenty of controversy around OpenAI and ChatGPT in the past few months, from ethical concerns, to financial issues, to clashing visions among OpenAI's leadership. From a climate perspective, there are also valid concerns about ChatGPT's greenhouse gas emissions and water consumption.

ChatGPT is built on an LLM, a large language model. The model is trained to predict the likelihood of a particular 'token' (a character, word, or phrase) based on the preceding prompt and context. ChatGPT's training corpus is 45 TB of data; for context, a single terabyte is the equivalent of around 130,000 digital photos. According to this article from the UN University, this training process, due to its enormous scale, is estimated to have produced 500 tons of CO2, the equivalent of over 500 flights between New York and London.
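To make the 'token prediction' idea concrete, here is a toy Python sketch using simple bigram counts over an invented nine-word corpus. This is an illustration of the general principle only: real LLMs use neural networks trained on billions of examples, not a lookup table, and the corpus below is made up for the example.

```python
from collections import Counter

# A tiny invented corpus, split into word-level 'tokens'.
corpus = "the cat sat on the mat the cat ran".split()

# Count which token follows each token in the corpus.
following = {}
for prev, nxt in zip(corpus, corpus[1:]):
    following.setdefault(prev, Counter())[nxt] += 1

def predict_next(token):
    """Return the most likely next token and its estimated probability."""
    counts = following[token]
    total = sum(counts.values())
    best, n = counts.most_common(1)[0]
    return best, n / total

print(predict_next("the"))  # ('cat', 0.666...): 'cat' follows 'the' 2 of 3 times
```

An LLM does the same thing in spirit, choosing the next token by estimated likelihood, but its 'counts' are learned weights distilled from that 45 TB corpus, which is where the training-time energy cost comes from.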

A common misconception about the 'cloud' is that data exists out there in the ether; in fact, it lives in physical space, in huge data centres running thousands of GPUs. The power source for these data centres greatly impacts emissions, of course. According to ChatGPT itself, OpenAI is seeking alternative power sources, and its CEO Sam Altman has significant personal investments in Helion Energy, the projected 'world's first [nuclear] fusion power plant'. Regardless of investments and controversies around fusion energy, ChatGPT is consuming non-renewable power every day at a staggering rate.

Of equal significance is the water consumption of OpenAI's data centres. Servers convert most of the energy they draw into heat as they run, so they must be continually cooled by vast quantities of water. Cool water is pumped along the servers and, once heated, is recycled through a heat exchanger. Some water is lost to evaporation each time it passes through a cooling tower, and a given batch of fresh water can only be reused a handful of times before mineral and salt build-up sets in. Between evaporation loss and discharge, a constant supply of fresh, potable water (which inhibits pipe clogs and bacterial growth) is required: an estimated 500 ml of water per 20-50 question conversation! Considering the millions of users each day, this quantity rapidly adds up, especially in a world where vital fresh water supplies are rapidly dwindling. Proposals to power these data centres with solar energy may affect water use further, since high insolation and high temperatures go together, potentially increasing the demand for water cooling.
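To see how quickly that per-conversation figure scales, here is a back-of-envelope calculation in Python. The 500 ml figure is the article's estimate; the daily user count and conversations-per-user are assumed round numbers for illustration, not measured values.

```python
# Back-of-envelope scaling of the article's ~500 ml per conversation estimate.
ml_per_conversation = 500          # article's estimate, per 20-50 question chat

daily_users = 10_000_000           # assumed for illustration
conversations_per_user = 1         # assumed for illustration

litres_per_day = daily_users * conversations_per_user * ml_per_conversation / 1000
print(f"{litres_per_day:,.0f} litres of fresh water per day")
# 5,000,000 litres per day under these assumptions
```

Even with deliberately conservative assumptions (one short conversation per user per day), the total lands in the millions of litres daily, which is the scale the article is gesturing at.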

It is important that, as fluent and active members of this digital age and proponents of AI's potential for good and creative innovation, for the climate and beyond, we remain critical and bring the climate cause to the proverbial table. The point is not to stop using AI, but to reflect on the challenges as well as the opportunities for the planet in the AI era. And perhaps limit your reliance on it, and use less wordy prompts while you're at it, to reduce your water footprint by up to a couple of litres a day!

Written by Kez Whittlesea
