Data centers: innovation and the bottom line

Can we get a Kumbaya from the data center folks? No? Not quite yet?
25 April 2023

“Sure… but how much is a planet gonna cost me?”

In Part 1 of this article, we spoke to Tom Lawler, Executive Director of the Digital Climate Alliance, about the competing demands of a world running ever more on cloud services and data centers, incoming technologies that will only drive the need for more data centers and processing power, and the ecological concerns of at least a generation about the incredibly energy-intensive and resource-intensive way that data centers function – at least right now.

When faced with a situation where an unstoppable force meets a previously immovable object, Tom gave succinct examples of both the contest for resources in planning terms and relatively recent situations that could easily have spiraled out of control into significant social and ecological harm.

Bright ideas.

But as we were coming to the end of Part 1, he mentioned that, with a different way of thinking through the inherent ecological issues of data centers, new, innovative, relatively climate-friendly ways of building new data centers could be found and implemented, so that at the very least, the future didn’t have to repeat the mistakes of the past.

Tom talked about ways of building data centers so they worked like cooling towers, meaning water was used simply for its cooling properties, without being consumed by the process. And rather than using costly and polluting diesel generators as backup power sources to cut in whenever there was an unacceptable power drop, micro-grids could pull power from eco-friendly sources at the drop of an electron, without lowering local air quality or earning data centers a reputation as dirty, energy-guzzling industries.

With all that potential power for change out there, we had one key, irresistible, potentially cynical question.

The bottom line.


The drivers that make data center owners and operators adopt these changes – are they ecological, or economic? Do we feel like data center owners are ecology converts and want to do the right thing just because it is the right thing? Or does the right thing have to equate with what’s best for their bottom line?


Well, right now, the drivers are economic, because if you can’t build your data center at all because the power is not going to be there, or the water is not going to be there, you’re going to lose a lot of money. Or if you build it and the reliability is not going to be there, you’re going to lose a lot of money. The whole point of a data center is that it is always on and so any downtime costs a lot in opportunity cost, transaction cost, customer service cost, so it –


Costs a lot of money?


You’re catching on. So right now, the drivers to do this, as they act on the people who have to actually do it, are economic. It’s not about who gets to have the fluffiest sustainability report on the block, it’s about how businesses make these things work so they don’t lose a lot of money, and how we as a society make these things work with our lifestyles and the needs of the planet.


But any ecological benefits are at least… useful to have as well.



More, more, Moore?


There have been suggestions that power scaling and refinements in chip efficiency could at least help existing data centers towards greater resource efficiency. Would those ideas work?


Most people feel like we’ve kind of hit a limit on Moore’s law. And a lot of that has to do with just the heat and the material science involved. The more powerful the chip, the hotter it is, and then it ends up melting itself. That, I think, is one of the biggest challenges in terms of the next big leap – and a lot of people are trying to figure that out even as we speak. But these are chips that are being designed at the nanometer scale, right? There’s arguably not much smaller or more efficient they can get using current chip design technologies.

The University of Buffalo came out with a study just weeks ago where they took terabytes of actual usage data from all these data centers and started looking at utilization within the data center, which has a huge impact on efficiency and effectiveness.

And they were talking about the fact that if you’ve got one server talking to another server in the same data center, you end up creating something like Bitcoin’s proof of work inside the data center, and you’re almost doubling your heat rate within the data center.

They designed a model that can be used to guide how you deploy your data center, and that can increase efficiency, they said, by between 10 and 30%, depending on the applications – and you don’t have to make any other changes.

It’s just a question of which servers you’re using for a machine learning project or a heavy compute application. One would hope that those types of studies and that type of model will be useful in the future.

A whole new way of building data centers.

Human beings tend not to start worrying about efficiency until resources start becoming scarce, so that learning is happening now. And so hopefully, that’s another place where we can start squeezing the most out of what we have.

I think soon, we’ll be looking at an entirely different way of designing data centers and locating data centers if we’re going to achieve the sustainability goals we’re aiming for, and the cost benefits that come with them.


In Part 3 of this article, we’ll take a look at how data centers could be transformed from the resource-guzzling behemoths of their current reputation into not only green enablers of high-tech businesses, but also active benefits to the region in which they’re situated.