Experts on Camera

Dr. Andrew Chien: Data centers’ community, power grid, and environmental impacts

Reporters can request footage of SciLine’s interview with Dr. Andrew Chien for use in their stories.


Over 5,400 data centers—essential hubs for cloud computing and artificial intelligence (AI)—have been built in communities across the country, with many more on the way, as their resource needs are expected to double or triple in the next few years.

On June 24, 2025, SciLine interviewed Dr. Andrew Chien, a professor of computer science at the University of Chicago. See the footage and transcript from the interview below, or select ‘Contents’ on the left to skip to specific questions.

Journalists: video free for use in your stories

High definition (mp4, 1280x720)


Introduction

[00:19]

ANDREW CHIEN:

Hi, my name is Andrew Chien. I’m a professor at the University of Chicago and also a senior scientist at Argonne National Laboratory. I have, for many years, been doing research on large-scale computing systems—that means supercomputers and cloud-scale infrastructures—and recently have been working on the design of these gigawatt data centers—they’re used largely for AI and large-scale computing—and the interaction of the growth of those data centers with the power grid and all of the challenges that that causes.

Interview with SciLine


What are data centers, and what do they do?


[00:49]

ANDREW CHIEN: Well, data centers are really the home of the internet, right? All of those services that you use, all of the places where data products are created, where YouTube comes from, and all of those kinds of things—it’s the computing resources that store the data and, of course, serve the data to you. And increasingly, with the rise of AI, it’s where the intelligence and the services that serve the right things to you, show you what you’re interested in, and so on all come from.


How have data centers expanded over time?


[01:25]

ANDREW CHIEN: Data centers started out as simple buildings with a bunch of computers in them. But what happened over time is they got really big, and they got really dense. What does dense mean? Dense means that we’re going to put a lot of computers into a space. We’re going to put them as close together as possible to save space. And it also turns out it makes them more energy-efficient. So what happened, as more people used the internet, of course, is that data centers grew from, say, maybe 10 megawatts around 2000 to approaching gigawatt scale today. So they’re almost 100 times bigger. And now they’re talking about two- and four-gigawatt kinds of complexes. They’re growing very fast because of just increased use of the digital world—so-called digitalization of society, commerce, social interaction, and so on. But they’re really growing fast in the last three years or so because of this excitement around AI. And AI is making them grow fast because AI uses statistical techniques, right, to create intelligence. It’s different from, like, writing a Python program or something like that. And those techniques turn out to require a lot more computation for each interaction.

Now, there’s one other thing that’s also making data center power grow really fast, which is that for about 60 years we had this great technology scaling going on—silicon technology scaling—where we got more and more computation for less and less energy. That was great. But about five or 10 years ago—the exact starting point depends on which expert you ask—we started seeing decreasing benefits from the improvement of technology. So, while for a long time we got what’s called a free ride—we got more and more for less and less cost—now, pretty much, when we increase the amount of computing we use, we have to pay more for it, in terms of silicon, hardware, in terms of power used, and so on. And that’s why, when you see everyone using these chatbots, right—there’s literally billions of people using these chatbots—you see a huge number of data centers that have to be built just to support that use.


Why are data centers essential for AI?


[03:53]

ANDREW CHIEN: Well, we all want AI, right? It’s become a national imperative for nations. It’s become an economic imperative. Maybe it’s even a social imperative. We need these data centers for AI for two purposes: First, you’ve probably seen some of the press. People need really large data centers to train these foundation models for these chatbots, these large language models, right—these are all synonymous kinds of terms—and to train them, what you do is you take a large quantity of data—relevant data from the internet or from some internal scientific database or some other private supply—and you basically compute over it again and again and again and again, extracting statistical structure from that data into those models. That’s called training, or creating a model. That requires a huge amount of computation, and it requires these very, very large data centers that you hear about OpenAI building, that you hear about xAI building—Google and Amazon and others are building such data centers too.

But that’s just the beginning of it. That’s the so-called training data center, and it could require tens of thousands of GPUs, perhaps even 100,000. The next step after you’ve trained the model is to actually use it for something. So serving all those chatbot requests, serving what’s called agentic AI these days, requires lots of what’s called inference: that is, the application of the model to some inputs that you give it, to give you some intelligent output. And that inference actually grows with the number of users. So if you have a billion users, you have a billion times as much consumption of inference. And with increasing complexity of intelligent tasks, like agentic AI, strategic planning, reasoning, discovery, those kinds of things, it multiplies the number of inferences you have to do. So that also increases the number of data centers you have to build.


Why do AI and data centers require so much electricity?


[05:56]

ANDREW CHIEN: You know, fundamentally, AI and the cloud use a huge amount of electricity because it’s just a lot of computing. It’s way more than it was 10 years ago. It’s grown exponentially, and it continues to grow exponentially. And I think if you just think about the way you use computing in your life, the way it’s either intruding on or enhancing your life—however you want to think about it—you use more and more of it every day. And it’s almost unthinkable to use less. And even in something like video: if you’re on Zoom or something like this right now, they’re running AI in the background to analyze whether there’s a dog barking in the background, and filter that out, and all of that kind of stuff. So that growth translates into computing. And that computing translates into power use—more or less directly: more computing, more power use. And as I said before, the underlying technology which allowed us to use less power for more computing—that technology scaling, which served us really well for 60 years or so—has mostly stopped.


Why do AI and data centers require so much water?


[07:05]

ANDREW CHIEN: The water one is a little bit tricky because it’s indirect. So it turns out, when you pump a gigawatt of electricity into a data center, that’s electricity, right? We know what that is, right? That’s like big copper conductors, and so on. But what happens is, when it goes into the chips, that electricity gets turned into heat—just like in your hair dryer, actually. Your hair dryer heats up. And so you get a gigawatt of heat into that building, and now you have to get it out of that building. Because if you don’t—just like if you put a hair dryer inside of a box, it just gets hotter and hotter and hotter—you’re going to have a problem. So the way they get the heat out of these data centers is to evaporate water. Texas air conditioning, if you will. You’ve all experienced this idea of blowing a fan over a wet paper towel or something like that. Well, they literally use the phase change of the water to evacuate the heat from these data centers. And that has a side effect: The side effect is the water goes into the atmosphere and is consumed. So water consumption is a big problem for data centers, because the more power you consume, the more heat you generate, and the more water you consume.
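The scale of that water use can be sketched with a back-of-the-envelope calculation. This is an editorial illustration, not a figure from the interview, and it assumes the simplest possible case: every watt of heat is rejected by evaporation, using water's latent heat of vaporization of roughly 2.26 MJ per kilogram.

```python
# Rough estimate of evaporative cooling water use for a 1 GW heat load.
# Assumptions (not from the interview): all heat leaves via evaporation,
# latent heat of vaporization ~2.26 MJ/kg, and ~1 liter of water per kg.

HEAT_LOAD_W = 1e9                # 1 gigawatt of heat to reject
LATENT_HEAT_J_PER_KG = 2.26e6    # energy absorbed per kg of water evaporated

kg_per_second = HEAT_LOAD_W / LATENT_HEAT_J_PER_KG   # ~442 kg/s
liters_per_day = kg_per_second * 86_400              # seconds in a day

print(f"{kg_per_second:.0f} kg of water evaporated per second")
print(f"{liters_per_day / 1e6:.1f} million liters per day")
```

Real facilities evaporate only part of their heat load and recirculate cooling water, so actual consumption is lower, but the arithmetic shows why gigawatt-scale heat implies water use measured in millions of liters per day.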


How do data centers contribute to climate change?


[08:29]

ANDREW CHIEN: Data centers consume a lot of electricity. So one of the big concerns about emissions for data centers comes from the emissions that are generated from that power consumption. So if it’s really low-emission power, like renewable energy, maybe it’s low. But if it’s coal or it’s natural gas and so on, it could be higher. The difficulty, of course, is the data centers require continuous power, and all of those renewable sources are variable, and you have to find some way to mate these things together. For a long time, there was the dream that data center carbon emissions from power consumption could be neutralized. They could be made zero. But the evidence from the biggest data center operators over the last five years—and projected into the future—is that their emissions footprint continues to grow on an operational basis, and they’ve almost admitted that that’s going to be the case for the next five years at least. The other half of climate change concerns for data centers has to do with their construction—the materials that go into their construction. We all know buildings have these kinds of concerns for concrete and other kinds of materials—and particularly for the electronics, the computers, the networking, and all of that stuff that goes into the data centers, there are carbon emissions associated with the manufacturing of that material.


How do data centers benefit local communities?


[10:03]

ANDREW CHIEN: If we look at it in economic terms, primarily, there’s typically a significant increase in jobs around the construction of the data centers—the buildings themselves, the installation of the computers, the sites, the power infrastructure, and so on. So construction of a large industrial plant: that’s what people can imagine for that. What happens after the data center is in operation is there’s likely to be dramatically lower job creation. You don’t need that many people to actually run a thousand or a few thousand computers. It can be done very efficiently now. And then that leaves you with employment that’s associated with the physical maintenance of the building and the physical maintenance of the computers—some of them are going to fail, they’re going to need to be replaced over time, they’re going to be upgraded, and so on. So there’s some employment associated with that.


What are the downsides of having data centers in communities?


[11:05]

ANDREW CHIEN: The downsides? Well, if you’ve ever seen a data center, it kind of looks like a giant big-box-store kind of building—a giant, massive building, typically without a large parking lot because people don’t go in and out of it. And they can be campuses of just clusters of these really, really large buildings. So some people consider that to be unsightly. It’s certainly a big land use, and some people would consider that not a pleasant addition to their community environment. But they’re not particularly loud or particularly polluting or anything like that, these commercial, industrial data centers. The other thing that people will see is a large expansion of electric power infrastructure. That means power lines, transmission, and transformers. That’s typically not viewed as a plus in the community, if they build transmission lines through your neighborhood or through your area, or something like that. And then the third thing that people are concerned about is that a lot of these data centers use water for evaporative cooling. That means they actually take water and they consume it out of the local ecosystem, and they spray it into the air in order to remove the heat from the data center, and that means that that water is lost to the local ecosystem. So that water consumption can be important in areas that are water stressed or have shortages of water. But it’s important to realize that almost all industry consumes water in some form. Manufacturing industries typically consume vast quantities of water, so that is a tradeoff of some sort for the economic activity.


What does your research suggest about making data centers more environmentally sustainable?


[12:55]

ANDREW CHIEN: We’ve been doing a lot of work on trying to understand how data centers can be better citizens in the power grid. That is, how can they help the grid solve these difficult balancing problems between renewable energy and storage and load—varying load from other consumers. And it’s more and more important that data centers actually do their part—that is, become part of this balancing mix—as they become a larger and larger part of the power grid’s load, like I said: in some cases 25% already, in some cases projected to be over 50%. So the research we do has to do with power markets, and with the design of cooperative agreements between data centers and the grid, so that the data centers can reduce their negative impact on the power grid—that is, they stress the grid less, so your rates go up less or go down more, and they don’t have to build as much transmission or as much generation in order to support a certain number of data centers.

The core idea for it is really pretty simple. Right now, when you turn on an electrical switch—you turn on your hair dryer, you turn on your microwave—you’re just basically demanding power, and the power grid just supplies it. Well, because the power grid supply is varying up and down, what we proposed is that data centers could instead negotiate exactly how much power they consume from the power grid at any point in time, instead of just demanding however much they want. Then it helps the grid harmonize their use with other uses and generation in the grid. And we’ve been able to show a number of schemes that could do that. These are all novel, and they haven’t been widely deployed or adopted. But there are cases in which you can significantly reduce the total power cost in the grid—maybe 20%, maybe 30% in some cases—and significantly reduce the amount of power infrastructure you need to build, or enable more data centers for a given power grid infrastructure—again, with numbers like 20 and 30%. In the magical world of computing, 20 or 30% might not sound like a lot, but when you’re talking about gigawatts of energy, when you’re talking about billion-dollar data centers, 20 or 30%: Those are big numbers. And that could make a difference for how we more sustainably build AI and digital infrastructure going forward into 2030 and beyond.
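The negotiation idea described above can be illustrated with a toy sketch. Everything here is a hypothetical editorial illustration, not code from Dr. Chien's research: real schemes involve power markets, forecasts, and contracts, but the kernel is that the data center accepts a cap from the grid each interval and defers flexible load (such as batch AI training) to stay under it, rather than simply demanding whatever power it wants.

```python
# Toy sketch of grid-cooperative power capping (hypothetical illustration).
# The grid sends a negotiated cap for the interval; the data center defers
# its flexible load (e.g., deferrable training jobs) to respect the cap,
# while inflexible load (e.g., live inference) keeps running.

def schedule_power(desired_mw: float, grid_cap_mw: float,
                   flexible_mw: float) -> float:
    """Return the power (MW) the data center actually draws this interval.

    desired_mw:  load the site would draw if unconstrained
    grid_cap_mw: limit negotiated with the grid for this interval
    flexible_mw: portion of desired load that can be deferred
    """
    if desired_mw <= grid_cap_mw:
        return desired_mw                       # cap not binding
    shed = min(desired_mw - grid_cap_mw, flexible_mw)
    return desired_mw - shed                    # defer flexible load

# During a grid shortfall, a 900 MW site with 300 MW of deferrable
# training load is asked to stay under 700 MW:
print(schedule_power(desired_mw=900, grid_cap_mw=700, flexible_mw=300))
# prints 700
```

If the flexible portion is too small to meet the cap, the sketch draws as close to the cap as it can; a real agreement would specify what happens in that case (penalties, batteries, or on-site generation).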