The Environmental Cost of AI: Unveiling the Energy Consumption of ChatGPT
But how much energy does this actually use? How many data centres will be needed to keep an ever-expanding conversational system running? When, if ever, will we get a comprehensive audit of its energy use? For now, ChatGPT's owners, like those of most generative models, are cagey about its consumption. Another AI company, Midjourney, has leaned on the mystique of the technology to explain this baffling intransigence: 'AI models are incredibly clever and powerful.
They don't reveal the specific data or parameters they were fed, because the most effective machine-learning methods are black boxes that hide their inner workings. It's an impenetrable magic trick. Sorry, but we have to keep certain aspects secret. Just trust us.' We already know that ChatGPT is a voracious consumer of energy. How long will we accept the hollow assurance of 'trust us – it's worth it' before we demand more exact details?
Nothing in life is really free, and the same can be said for AI. Despite what many people think – users of ChatGPT and other AI models currently don't have to part with any funds – models such as this one need a colossal amount of computational power to run. To keep up with the ever-increasing demand for powerful, energy-intensive hardware, AI seems to be facing what is often referred to as a computational apocalypse.
The explosion of digital processing has led to a dramatic rise in computing's power consumption. Today, data centres – which serve not only AI but also crypto-mining – account for around 3.6 per cent of the world's electricity consumption. Needless to say, AI is taking a toll on the environment, but to understand exactly how much electricity is spent on AI in the form of ChatGPT, we need to drill a little deeper.
ChatGPT Energy Consumption: How Much Electricity Does AI Need?
To gain some perspective, it is worth noting that the older (and less powerful) large language model GPT-3, from OpenAI, required a little under 1,300 megawatt hours (MWh) of energy to train – roughly the average annual power consumption of 130 US households, given that an American household typically uses about 10,000 kilowatt hours (kWh) a year. Inference is a further burden on electricity use. Generating a response of a couple of thousand words means running a neural network with billions of parameters in the cloud, and, though fast, this still takes several seconds. It requires thousands of powerful servers, housed in data centres around the world and often built around NVIDIA's H100 chips, each of which can draw up to 700 watts.
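The household comparison above is simple arithmetic. A back-of-envelope sketch, using the rough figures quoted in this article rather than measured values:

```python
# Back-of-envelope check of the training-energy comparison.
# Both inputs are rough estimates quoted in the article, not measured values.

gpt3_training_mwh = 1_300            # approx. energy to train GPT-3, in MWh
us_household_kwh_per_year = 10_000   # typical annual US household use, in kWh

# Convert training energy to kWh and divide by annual household consumption.
household_years = (gpt3_training_mwh * 1_000) / us_household_kwh_per_year
print(f"GPT-3 training ≈ {household_years:.0f} US household-years of electricity")
# → GPT-3 training ≈ 130 US household-years of electricity
```

Note that this covers training only; the per-query inference cost described above comes on top of it.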
The exact quantity is impossible to calculate. But based on the limited numbers put about by researchers, ChatGPT is estimated to consume a few hundred MWh a day – equivalent to the annual electricity use of tens of thousands of US households. And with new generative AI models launching all the time, it would be an understatement to say that that usage figure is only going to grow.
AI’s Growing Energy Footprint
In a 2023 paper, the researcher Alex de Vries estimates that by 2027 the generative AI industry could require between 85.4 and 134 terawatt hours (TWh) of electricity each year – more than the annual power consumption of countries such as the Netherlands, Argentina and Sweden. But these numbers also have to be contextualised within the wider scope of global electricity production. Global electricity generation was almost 29,000 TWh a few years ago, so, in comparison, AI servers could make up as much as 0.5 per cent of the world's electricity use in 2027.
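That 0.5 per cent figure follows directly from the article's numbers. A quick sketch of the calculation, treating both inputs as the rough estimates they are:

```python
# Rough share-of-global calculation using the article's figures.
ai_low_twh, ai_high_twh = 85.4, 134.0   # de Vries's 2027 range, in TWh/year
global_generation_twh = 29_000           # approx. global electricity generation, TWh/year

low_share = ai_low_twh / global_generation_twh * 100
high_share = ai_high_twh / global_generation_twh * 100
print(f"AI servers in 2027: {low_share:.2f}%–{high_share:.2f}% of global electricity")
```

The upper bound works out to a little under half a per cent, which the article rounds up to 0.5 per cent.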
Comparing AI’s Electricity Consumption
Granted, AI's electricity consumption is significant, but it is hardly unique. The data centres that power the rest of the internet consume far more: around 460 TWh globally today, according to the International Energy Agency. That use has been growing since 2009, when the global financial crash – the so-called Great Recession – ended, and until late 2022 AI had little to do with the increase.
By comparison, in 2019 Netflix alone used as much power as 40,000 US homes – a figure that will have increased since then. And air conditioners consume around 10 per cent of global electricity – roughly 20 times AI's projected consumption in 2027.
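Putting this section's comparisons side by side makes the scale clearer. All figures below are the article's own approximations, converted to TWh per year (the Netflix entry assumes 40,000 homes at 10,000 kWh each):

```python
# The rough comparisons in this section, side by side.
# All values are approximate figures quoted in the article, in TWh per year.
comparisons_twh = {
    "Netflix streaming, 2019 (~40,000 US homes)": 0.4,   # 40,000 x 10,000 kWh
    "AI servers, 2027 upper estimate (de Vries)": 134,
    "All data centres today (IEA)": 460,
    "Air conditioning (~10% of global electricity)": 2_900,  # 10% of ~29,000 TWh
}
for name, twh in sorted(comparisons_twh.items(), key=lambda kv: kv[1]):
    print(f"{name}: ~{twh:,} TWh/year")
```

Even at the upper end of de Vries's 2027 estimate, AI would sit well below today's data-centre total, which itself is dwarfed by air conditioning.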
The Case for AI’s Energy Usage
The debate around AI's electricity consumption is uncomfortably similar to the one around Bitcoin, which also came under criticism over its environmental impact. Yet Bitcoin mining gravitates towards the cheapest power available, and some of its growth has taken place in regions with cheap, renewable energy. The real issue with AI should likewise be its carbon footprint, not just its electricity consumption: many data centres are already set up in places where energy is cheap and abundant.
Increasing Efficiency and Potential Plateau of AI’s Energy Demand
As newer, more efficient generative AI models are designed, their computational needs are shrinking – OpenAI's no-frills GPT-4o mini, for example, requires less computational power than GPT-3.5 Turbo. Furthermore, on-device paraphrasing, summarisation and translation are becoming more mainstream, moving work from power-hungry cloud servers onto comparatively efficient local hardware.
Conclusion
Yes, AI uses a lot of energy, and that usage will likely increase. But emitting carbon dioxide to spur economic growth is far from unique to modern AI. Even in terms of electricity, AI's share remains a modest slice of a very large pie – and the pie will continue to grow regardless, so unless a green solution to baseline growth can be found soon, we have bigger problems than the size of AI's slice. More importantly, we can work to reduce AI's carbon footprint by powering it with renewable energy and making models more efficient. Only time will tell whether AI proves to be a vital, beneficial development or a drain on our resources.