Artificial intelligence has become vital in business and financial dealings, medical care, technology development, research, and much more. Without realising it, consumers rely on AI when they stream a video, do online banking, or perform an online search.
Behind these capabilities are more than 10,000 data centres globally, each one a huge warehouse containing thousands of computer servers and other infrastructure for storing, managing, and processing data. There are now more than 5,000 data centres in the United States, and new ones are being built every day – in the US and worldwide.
Often dozens are clustered together right near where people live, attracted by policies that provide tax breaks and other incentives, and by what looks like abundant electricity.
And data centres do consume huge amounts of electricity. US data centres consumed more than 4% of the country’s total electricity in 2023, and by 2030 that fraction could rise to 9%, according to the Electric Power Research Institute. A single large data centre can consume as much electricity as 50,000 homes.
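To get a rough sense of that scale, a back-of-the-envelope calculation helps. The sketch below assumes an average US household uses roughly 10,000 kWh of electricity per year – an assumption for illustration, not a figure from the research cited above:

```python
# Back-of-the-envelope scale check for the "50,000 homes" comparison.
# The household figure is an assumption (roughly 10,000 kWh per year
# for an average US home), not a number from the article.

HOME_KWH_PER_YEAR = 10_000      # assumed average US household consumption
HOMES = 50_000                  # comparison used in the article
HOURS_PER_YEAR = 8_760

annual_kwh = HOME_KWH_PER_YEAR * HOMES            # ~500 million kWh per year
average_mw = annual_kwh / HOURS_PER_YEAR / 1_000  # kWh per hour -> MW

print(f"Annual consumption: {annual_kwh / 1e6:.0f} GWh")
print(f"Average continuous draw: {average_mw:.0f} MW")
# -> roughly 500 GWh per year, or a continuous draw on the order of 60 MW
```

Under that assumption, a single large data centre represents a continuous draw of tens of megawatts, running around the clock.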
The sudden need for so many data centres presents a huge challenge to the technology and energy industries, government policymakers, and everyday consumers.
Research scientists and faculty members at the MIT Energy Initiative (MITEI) are exploring multiple facets of this problem – from sourcing power to grid improvement to analytical tools that increase efficiency, and more. Data centres have quickly become the energy issue of our day.
Unexpected demand brings unexpected solutions
Several companies that use data centres to provide cloud computing and data management services are announcing some surprising steps to deliver all that electricity.
Proposals include building their own small nuclear plants near their data centres and even restarting one of the undamaged nuclear reactors at Three Mile Island, which has been shuttered since 2019. (A different reactor at that plant partially melted down in 1979, causing the nation’s worst nuclear power accident.)
Already the need to power AI is causing delays in the planned shutdown of some coal-fired power plants and raising prices for residential consumers. Meeting the needs of data centres is not only stressing power grids, but also setting back the transition to clean energy needed to stop climate change.
There are many aspects to the data centre problem from a power perspective. Here are some that MIT researchers are focusing on, and why they’re important.
An unprecedented surge in the demand for electricity
“In the past, computing was not a significant user of electricity,” says William H Green, director of MITEI and the Hoyt C Hottel Professor in the MIT Department of Chemical Engineering.
“Electricity was used for running industrial processes and powering household devices such as air conditioners and lights, and more recently for powering heat pumps and charging electric cars. But now all of a sudden, electricity used for computing in general, and by data centres in particular, is becoming a gigantic new demand that no one anticipated.”
Why the lack of foresight? Usually, demand for electric power increases by roughly 0.5% per year, and utilities bring in new power generators and make other investments as needed to meet the expected new demand. But the data centres now coming online are creating unprecedented leaps in demand that operators didn’t see coming. In addition, the new demand is constant.
It is critical that a data centre provides its services all day, every day. There can be no interruptions in processing large datasets, accessing stored data, and running the cooling equipment needed to keep all the packed-together computers churning away without overheating.
Moreover, even if enough electricity is generated, getting it to where it is needed may be a problem, explains Deepjyoti Deka, a MITEI research scientist.
“A grid is a network-wide operation, and the grid operator may have sufficient generation at another location or even elsewhere in the country, but the wires may not have sufficient capacity to carry the electricity to where it’s wanted.” So transmission capacity must be expanded – and, says Deka, that is a slow process.
Then there is the 'interconnection queue'. Sometimes, adding either a new user (a 'load') or a new generator to an existing grid can cause instabilities or other problems for everybody else already on the grid.
In that situation, bringing a new data centre online may be delayed, and enough such delays mean that new loads and generators end up standing in line, waiting their turn.
Right now, much of the interconnection queue is already filled with new solar and wind projects, and the wait is about five years. Meeting the demand from newly installed data centres without degrading the quality of service for everyone else is a problem that needs to be addressed.
Finding clean electricity sources
To further complicate the challenge, many companies – including so-called 'hyperscalers' such as Google, Microsoft, and Amazon – have made public commitments to achieving net-zero carbon emissions within the next 10 years. Many have been making strides towards their clean-energy goals by entering into 'power purchase agreements'.
They sign a contract to buy electricity from, say, a solar or wind facility, sometimes providing funding for the facility to be built. But that approach to accessing clean energy has its limits when faced with the extreme electricity demand of a data centre.
Meanwhile, soaring power consumption is delaying coal plant closures in many states. There are simply not enough sources of renewable energy to serve both the hyperscalers and the existing users, including individual consumers. As a result, conventional plants fired by fossil fuels such as coal are needed more than ever.
As the hyperscalers look for sources of clean energy for their data centres, one option could be to build their own wind and solar installations. But such facilities would generate electricity only intermittently.
Given the need for uninterrupted power, the data centre would have to maintain energy storage units, which are expensive. They could instead rely on natural gas or diesel generators for backup power – but those devices would need to be coupled with equipment to capture the carbon emissions, plus a nearby site for permanently disposing of the captured carbon.
Because of such complications, several of the hyperscalers are turning to nuclear power. As Green notes: “Nuclear energy is well matched to the demand of data centres, because nuclear plants can generate lots of power reliably, without interruption.”
In a much-publicised move in September, Microsoft signed a 20-year agreement to buy power from Constellation Energy once it reopens one of the undamaged reactors at its now-shuttered nuclear plant at Three Mile Island, the site of the 1979 accident.
If regulators approve the plan, Constellation will bring that reactor online by 2028, with Microsoft buying all of the power it produces. Amazon has also reached a deal to purchase power from another nuclear plant that was threatened with closure due to financial troubles.
And in early December, Meta released a request for proposals to identify nuclear energy developers to help the company meet its AI needs and its sustainability goals.
Other nuclear news focuses on small modular reactors (SMRs) – factory-built power plants that could be installed near data centres, potentially without the cost overruns and delays often experienced in building large plants. Google recently ordered a fleet of SMRs to generate the power needed by its data centres; the first is to be completed by 2030 and the remainder by 2035.
Some hyperscalers are betting on new technologies. For example, Google is pursuing next-generation geothermal projects, and Microsoft has signed a contract to purchase electricity from a startup’s fusion power plant beginning in 2028 – even though the fusion technology hasn’t yet been demonstrated.
Reducing electricity demand
Other approaches to providing sufficient clean electricity focus on making the data centre and the operations it houses more energy efficient, so that the same computing tasks can be performed using less power. Faster computer chips and algorithms optimised to use less energy are already helping to reduce the load – and the heat generated along with it.
Another idea being tried involves shifting computing tasks to times and places where carbon-free energy is available on the grid. Deka explains: “If a task doesn’t have to be completed immediately, but rather by a certain deadline, can it be delayed or moved to a data centre elsewhere in the US or overseas where electricity is more abundant, cheaper, and/or cleaner? This approach is known as ‘carbon-aware computing’.”
"We’re not yet sure whether every task can be moved or delayed easily, says Deka. “If you think of a generative AI-based task, can it easily be separated into small tasks that can be taken to different parts of the country, solved using clean energy, and then be brought back together? What is the cost of doing this kind of division of tasks?”
That approach is, of course, limited by the problem of the interconnection queue. It is difficult to access clean energy in another region or state. But efforts are under way to ease the regulatory framework to make sure that critical interconnections can be developed more quickly and easily.
What about the neighbours?
A significant concern running through all the options for powering data centres is the impact on residential energy consumers. When a data centre comes into a neighbourhood, there are not only aesthetic concerns but also more practical worries.
Will the local electricity service become less reliable? Where will the new transmission lines be located? And who will pay for the new generators, upgrades to existing equipment, and so on?
When new manufacturing facilities or industrial plants go into a neighbourhood, the downsides are generally offset by the availability of new jobs. Not so with a data centre, which may require just a couple of dozen employees.
There are standard rules about how maintenance and upgrade costs are shared and allocated. But the situation is totally changed by the presence of a new data centre.
As a result, utilities now need to rethink their traditional rate structures so as not to place an undue burden on residents to pay for the infrastructure changes needed to host data centres.
Novel materials
At MIT, researchers are thinking about and exploring a range of options for tackling the problem of providing clean power to data centres. For example, they are investigating architectural designs that will use natural ventilation to facilitate cooling, equipment layouts that will permit better airflow and power distribution, and highly energy-efficient air conditioning systems based on novel materials.
They are creating new analytical tools for evaluating the impact of data centre deployments on the US power system and for finding the most efficient ways to provide the facilities with clean energy. Other work looks at how to match the output of small nuclear reactors to the needs of a data centre, and how to speed up the construction of such reactors.
MIT teams are also working to determine the best sources of backup power and long-duration storage, and to develop decision-support systems for siting proposed new data centres. Those systems take into account the availability of electric power and water, regulatory considerations, and even the potential for using the significant waste heat produced, for example to warm nearby buildings. Technology development projects include designing faster, more efficient computer chips and more energy-efficient computing algorithms.
In addition to providing leadership and funding for many research projects, MITEI is acting as a convenor, bringing together companies and stakeholders to address this issue.
At MITEI’s 2024 Annual Research Conference, a panel of representatives from two hyperscalers and two companies that design and construct data centres discussed their challenges, possible solutions, and where MIT research could be most beneficial.
As data centres continue to be built and computing continues to drive an unprecedented increase in electricity demand, says Green, scientists and engineers are in a race to provide the ideas, innovations, and technologies that can meet this need while continuing to advance the transition to a decarbonised energy system.