One of the many mysteries of ChatGPT is its energy usage. OpenAI relies on cloud computing, with thousands of chips packed into servers in huge data centers, to train its AI models. Training a single model uses vast amounts of electricity, but no one's sure exactly how much.
As it stands, the market for generative AI has major money-making appeal, and perhaps more transparency on its energy usage would curtail the possibility for profit. The effect that the cryptocurrency industry has on the environment led to a mining ban in China, and New York passed a two-year moratorium on new permits for cryptocurrency mining using fossil fuels.
With more information on the power usage and emissions of AI, governments would be able to impose limits on its use: goodbye to using it ironically for a tweet. What we do know is that training ChatGPT used 1.287 gigawatt-hours, roughly equivalent to the consumption of 120 US homes for a year.
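The homes-equivalent figure is easy to sanity check. As a rough sketch, assuming an average US household uses about 10,600 kWh per year (an approximate figure; the exact value varies by year and state):

```python
# Sanity check: is 1.287 GWh really about 120 US homes' annual usage?
TRAINING_ENERGY_GWH = 1.287
KWH_PER_HOME_PER_YEAR = 10_600  # assumed average; varies by year and state

# 1 GWh = 1,000,000 kWh
training_energy_kwh = TRAINING_ENERGY_GWH * 1_000_000
homes_equivalent = training_energy_kwh / KWH_PER_HOME_PER_YEAR

print(f"{homes_equivalent:.0f} homes")  # roughly 120
```

The result lands at about 121 homes, so the commonly quoted "120 US homes for a year" comparison holds up under that assumption.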
However, as the race for AI heats up (perhaps literally), more companies are training models of their own. As the planet warms, energy costs are also rising and there’s suddenly a motive for green energy: it’s cheap.
The hype around OpenAI’s chatbots has drawn attention to the company’s energy usage, but data centers and their environmental impact are a longstanding issue. Besides the running costs you’d expect, the vast majority of the energy a data center uses goes to cooling the hardware.
Despite the fact that it’s become more expensive, advances in cloud technology have made cloud-based services indispensable in the workplace. We rely on these facilities to get business done, but their power draw is reaching unsustainable levels.
There’s potential for a shift, by necessity, from air cooling to liquid immersion cooling. The latter lowers hardware temperatures by submerging components in a thermally conductive dielectric liquid, and should reduce both carbon emissions and water usage.
So, one option is to reduce the amount of energy needed to cool data center equipment. This won’t, however, do much to cut costs; pessimistically, it’s unlikely that the change will be embraced if done only for environmental reasons.
Energy usage doesn’t necessarily have to decrease
The best way to minimize expense, with the added benefit of improving energy usage, is to build data centers in parts of the world with good renewable energy sources. Data centers in Iceland and Norway, for example, use green power to run, and are immune from global electricity costs.
Norway has an abundance of natural, renewable resources, giving it some of the cheapest electricity globally. There are 25 wind farms and over 1,500 hydropower plants in Norway that make 98% of the country’s electricity renewable, facts that have contributed significantly to many large organizations’ use of data centers there.
In Norway, prominent data center provider Bulk Infrastructure is an investor in connectivity infrastructure, utilizing high-capacity links between Norway and the US, the UK, and mainland Europe. Aptly-named and Iceland-based, Borealis Data Center can provide year-round natural cooling and power for its three data centers.
The move towards sustainability is happening, somewhat slower than we might like, but happening all the same. Microsoft is buying renewable energy and taking other steps toward its goal of being carbon negative by 2030, and it further aims to power all of its facilities with 100% renewable energy by 2025.
Amazon’s AWS cloud division is also moving toward renewable sources, switching the backup power generators at its data centers in Europe from diesel to hydrotreated vegetable oil (HVO). The advantage of HVO as a fuel is that it requires no modification to the generators and remains stable in cold winter temperatures: no operational changes needed, and it can be used across regions.
With AI drawing attention to the way computing happens globally, now is a good time to consider the energy usage of data centers, and how to bring them into the cloud-based future.
29 November 2023