Sustainable software engineering – IT pros write green code
We’ve already seen how software engineers are starting to shop for greener cloud hosting providers. And this is just the beginning of a movement that’s being dubbed ‘sustainable software engineering’. “Everyone has a part to play in the solution,” says Asim Hussain, Green Developer Relations Lead at Microsoft. “Sustainable software engineering is an emerging discipline at the intersection of climate science, software, hardware, electricity markets, and data center design.” And while this may sound like a novelty, such a mindset will need to go mainstream to curb emissions linked to everyday applications.
To equip developers with the basics, Hussain runs an online training course that does a great job of answering the question: what is sustainable software engineering? In just 33 minutes, participants receive a whistle-stop tour of all eight core principles that underpin this emerging discipline. Hussain is committed to the cause and is keen for others to endorse the principles of sustainable software engineering too. To understand what’s driving such initiatives, it’s worth considering how much energy software applications can consume. And an example on many people’s minds is artificial intelligence (AI).
Understanding AI emissions
In 2019, researchers in the US published energy and policy considerations for deep learning in natural language processing [PDF]. Deep learning is a powerful tool for generalizing data and has led to a step-change in AI model performance. Making the headlines recently has been impressive AI-rendered digital art generated from just a few keyword prompts. Voice technology is also coming on in leaps and bounds, and there’s the captivating topic of deepfakes to consider too. But the rewards of deep learning only come when you have many layers of neural networks – hence the name. And that adds up to a lot of processing. GPT-3, the latest in a series of natural language models, has 175 billion parameters.
The US study considered simpler natural language models, including GPT-2 (which has two orders of magnitude fewer parameters than its successor, GPT-3), and found that even these smaller versions still had large carbon footprints. The carbon dioxide emissions associated with training a natural language processing model can run to six figures in pounds, which – to put that figure into perspective – is similar to the lifetime emissions of five average American cars. And that comparison includes a lifetime’s worth of fuel as well as the emissions associated with building the cars in the first place.
Green power and ASICs
Datacenter providers may argue that their facilities are powered using renewables, but even green power sources still need to be topped up from the grid – for example, when there’s no wind or little sun in the sky. And there are some great visualizations, such as Electricity Maps, that provide a live feed of the energy mix country by country. Hardware improvements do make an impact on the researchers’ calculations, as the team acknowledges. Tensor processing units (TPUs) – custom-developed application-specific integrated circuits (ASICs) designed to run common machine learning operations – are becoming more common, and these chips are more cost-efficient than GPUs for accelerating certain models.
But even using TPUs, AI developers still run into the issue of diminishing returns. Pushing models toward higher performance scores takes more and more compute time for just a fraction of the rewards gained earlier in the training cycle. From a sustainable software engineering perspective, this scenario could be managed through so-called ‘demand shaping’. If the code were carbon-aware then the model training rate could be matched to the availability of renewable power. We’ve become used to online meetings that dial audio and video quality up and down based on available bandwidth, and there’s no reason why training AI models couldn’t follow suit.
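To make the demand-shaping idea concrete, here is a minimal sketch of what a carbon-aware training loop might look like. The `get_carbon_intensity()` helper and the threshold values are assumptions for illustration only; a real implementation would fetch live grid data from a provider such as Electricity Maps.

```python
import time

# Hypothetical helper: returns the grid's current carbon intensity
# in gCO2/kWh. A real version would query a grid-data API.
def get_carbon_intensity() -> float:
    return 220.0  # placeholder value for illustration

CLEAN_THRESHOLD = 200.0   # below this, train at full speed
DIRTY_THRESHOLD = 400.0   # above this, pause training entirely

def carbon_aware_batches(batches, poll_seconds=0):
    """Yield training batches, throttling or pausing when the grid is dirty."""
    for batch in batches:
        intensity = get_carbon_intensity()
        while intensity > DIRTY_THRESHOLD:
            time.sleep(poll_seconds)           # wait for a cleaner grid
            intensity = get_carbon_intensity()
        if intensity > CLEAN_THRESHOLD:
            time.sleep(poll_seconds)           # slow down rather than stop
        yield batch

# Usage: wrap an ordinary batch iterator around a training step.
for batch in carbon_aware_batches(range(3)):
    pass  # train_step(batch) would go here
```

The same pattern – degrade gracefully instead of failing or running flat-out – is exactly what videoconferencing apps already do with bandwidth.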
Timing is everything
Hussain emphasizes the merits of keeping an eye on carbon intensity – the impact of the current electricity mix on emissions. Numbers drop when renewables use is high and rise when coal and gas are topping up the grid. Getting the timing right means that users can reduce the environmental burden of their applications without changing a single line of code.
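The timing point can be sketched as a small scheduling problem: given a forecast of hourly carbon intensity, a job that needs a few contiguous hours simply picks the greenest window. The forecast values below are invented for illustration; real numbers would come from a grid-data provider such as Electricity Maps.

```python
def greenest_window(forecast, hours_needed):
    """Return the start index of the contiguous window with the
    lowest average carbon intensity (gCO2/kWh)."""
    best_start, best_avg = 0, float("inf")
    for start in range(len(forecast) - hours_needed + 1):
        avg = sum(forecast[start:start + hours_needed]) / hours_needed
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start

# Illustrative hourly forecast in gCO2/kWh (invented values).
forecast = [320, 300, 280, 150, 120, 130, 310, 400]
start = greenest_window(forecast, hours_needed=3)
print(start)  # -> 3: hours 3-5 have the lowest average intensity
```

Nothing about the job's code changes; only when it runs does – which is the whole appeal of the approach.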
This concept doesn’t just apply to computers, of course. Firms such as Equiwatt are working with energy companies to encourage consumers to be selective about when they run appliances. In Equiwatt’s case, the company offers a downloadable app that alerts users when electricity consumption peaks. The alerts are intended to encourage recipients to switch off any non-essential equipment. And to sweeten the deal, users earn points that they can exchange for various goods and services.
It’s an interesting business model and energy companies seem willing to pay tech firms to come up with solutions for burning less gas and coal. Certainly having information at your fingertips is a sign of progress and if there’s one thing that tech firms do well, it’s data.
20 February 2024