Thanks to the rise of artificial intelligence, new data centers are going up as fast as companies can build them. That means huge demand for the electricity needed to run and cool the servers inside. Now, concerns are growing about whether the United States can generate enough electricity for the widespread adoption of AI, and whether our aging power grid can handle the load.
“If we don’t start thinking differently about this energy problem now, we’re never going to see this dream that we have,” said Dipti Vachani, head of automotive at Arm. The chip company’s low-power processors have become increasingly popular with hyperscalers like Google, Microsoft, Oracle and Amazon, precisely because they can cut energy use in data centers by up to 15%.
Nvidia’s latest AI chip, Grace Blackwell, incorporates Arm-based processors that the company says can run generative AI models on 25 times less power than the previous generation.
“Saving every last ounce of energy is going to be a fundamentally different design than what you’re trying to implement when you’re trying to maximize performance,” Vachani said.
This strategy of reducing energy consumption by improving computing efficiency, often referred to as “more work per watt,” is one answer to the AI energy crisis. But it’s not enough.
According to a Goldman Sachs report, a ChatGPT query consumes nearly 10 times more energy than a typical Google search. Generating an AI image can consume as much energy as charging your smartphone.
This problem is not new. By one 2019 estimate, training a single large language model produced as much CO2 as five gasoline-powered cars emit over their entire lifetimes.
Hyperscalers building data centers to cope with this massive energy consumption are also seeing their emissions skyrocket. Google’s latest environmental report showed that greenhouse gas emissions increased by nearly 50% between 2019 and 2023, partly because of data center energy consumption, though the company also said its data centers are 1.8 times more energy efficient than a typical data center. Microsoft’s emissions increased by nearly 30% between 2020 and 2024, also partly because of data centers.
And in Kansas City, where Meta is building an AI-focused data center, energy needs are so high that plans to close a coal-fired power plant are on hold.
Hundreds of Ethernet cables connect server racks at a Vantage data center in Santa Clara, California, on July 8, 2024.
Katie Tarasov
In pursuit of power
There are more than 8,000 data centers in the world, with the highest concentration in the United States. And thanks to AI, there will be many more by the end of the decade. The Boston Consulting Group estimates that demand for data centers will grow by 15% to 20% each year through 2030, when they are expected to account for 16% of total U.S. electricity consumption. That’s up from 2.5% before OpenAI’s ChatGPT came out in 2022, and it’s equivalent to the electricity used by about two-thirds of U.S. homes.
CNBC visited a data center in Silicon Valley to find out how the industry can handle this rapid growth and where it will find enough energy to make it possible.
“We believe the demand we will see from AI-specific applications will be as significant or greater than what we have historically seen for cloud computing,” said Jeff Tench, Vantage Data Centers’ executive vice president for North America and APAC.
Many large tech companies rely on firms like Vantage to host their servers. According to Tench, Vantage’s data centers typically have the capacity to use upward of 64 megawatts of power, enough to supply tens of thousands of homes.
“A lot of these are single customers, who will have the entire space leased to them. And when we think about AI applications, those numbers can go way up beyond that, into the hundreds of megawatts,” Tench said.
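As a rough illustration of what those megawatt figures mean, here is a back-of-envelope sketch. The roughly 1.2-kilowatt average household draw is our assumption based on typical U.S. consumption, not a figure from Vantage.

```python
# Rough check: how many average U.S. homes could these data centers supply?
# Assumes an average household draw of ~1.2 kW
# (about 10,500 kWh per year / 8,760 hours) -- an assumption for
# illustration, not a number from Vantage.

avg_home_kw = 1.2

for capacity_mw in (64, 200):  # a single campus vs. a large AI deployment
    homes = capacity_mw * 1_000 / avg_home_kw
    print(f"{capacity_mw} MW ~ {homes:,.0f} average homes")
```

At 64 megawatts, that works out to roughly 50,000 homes, consistent with Tench’s “tens of thousands.”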
Santa Clara, California, where CNBC visited Vantage, has long been one of the country’s hot spots for data center clusters near data-hungry customers. Nvidia’s headquarters was visible from the roof. Tench said there was a “slowdown” in Northern California due to a “lack of utility power availability here in this area.”
Vantage is building new campuses in Ohio, Texas and Georgia.
“The industry itself is looking for places where there is close access to renewable energy, whether it’s wind or solar, and other infrastructure that can be leveraged, whether it’s through an incentive program to convert what would have been a coal plant to natural gas, or increasingly looking at ways to capture energy from nuclear facilities,” Tench said.
Vantage Data Centers is expanding a campus outside Phoenix, Arizona, to deliver 176 megawatts of capacity.
Vantage Data Centers
Hardening the grid
Even if enough electricity can be generated, the aging power grid is often ill-equipped to handle the load. The bottleneck is getting electricity from where it’s generated to where it’s consumed. One solution is to add hundreds or even thousands of miles of transmission lines.
“It’s very expensive and time-consuming, and sometimes the cost is just passed on to residents in the form of higher utility bills,” said Shaolei Ren, an associate professor of electrical and computer engineering at the University of California, Riverside.
A $5.2 billion project to extend lines to an area of Virginia known as “Data Center Alley” has faced opposition from local ratepayers who don’t want to see their bills rise to fund it.
Another solution is to use predictive software to reduce outages at one of the weakest points in the network: the transformer.
“All electricity produced has to go through a transformer,” said Rahul Chaturvedi, CEO of VIE Technologies, adding that there are between 60 million and 80 million of them in the United States.
The average transformer is also 38 years old, making transformers a common cause of power outages. Replacing them is expensive and slow. VIE makes a small sensor that attaches to transformers to predict when they will fail and to identify which units can handle more load, so that load can be shifted away from those at risk of failing.
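VIE hasn’t published details of how its software works, but the basic idea can be sketched. In the hypothetical Python example below, every name, score and threshold is invented for illustration: a health score derived from sensor data flags heavily loaded, at-risk transformers so load can be shifted toward healthier units.

```python
# Hypothetical illustration of sensor-driven load shifting.
# Nothing here reflects VIE Technologies' actual software; the health
# score, thresholds and transformer names are invented for the example.

from dataclasses import dataclass

@dataclass
class Transformer:
    name: str
    health_score: float  # 0.0 (failing) to 1.0 (healthy), from sensor data
    load_pct: float      # current load as a fraction of rated capacity

def needs_relief(t: Transformer, risk_threshold: float = 0.4) -> bool:
    """Flag units whose predicted health can't support their current load."""
    return t.health_score < risk_threshold and t.load_pct > 0.6

fleet = [
    Transformer("T-101", health_score=0.35, load_pct=0.85),  # aging unit
    Transformer("T-102", health_score=0.90, load_pct=0.40),
]

for t in fleet:
    if needs_relief(t):
        # An operator would reroute load toward healthier units like
        # T-102 and schedule T-101 for inspection or replacement.
        print(f"{t.name}: shift load away and schedule maintenance")
```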
Chaturvedi said business has tripled since ChatGPT launched in 2022 and is on track to double or triple again next year.
VIE Technologies CEO Rahul Chaturvedi holds a sensor on June 25, 2024, in San Diego. VIE installs them on aging transformers to help predict and reduce grid outages.
VIE Technologies
Cooling the servers
According to Ren’s research, generative AI data centers will also need 4.2 billion to 6.6 billion cubic meters of water by 2027 to stay cool. That’s more than half the total annual water withdrawal of the United Kingdom.
“Everyone is worried about the energy consumption of AI. We can solve that problem if we get off our butts and stop acting like idiots about nuclear, right? It can be done. Water is the fundamental limiting factor in what happens in terms of AI,” said Tom Ferguson, managing partner at Burnt Island Ventures.
Ren’s research team found that every 10 to 50 ChatGPT prompts consume roughly as much water as a standard 16-ounce bottle holds.
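Put in per-prompt terms (our arithmetic applied to Ren’s figures, not a number from the study itself), that works out to roughly 9 to 47 milliliters per prompt:

```python
# Per-prompt water use implied by the 10-to-50-prompts-per-bottle figure.
# A standard 16-ounce bottle holds about 473 milliliters.

bottle_ml = 16 * 29.5735  # U.S. fluid ounces to milliliters

for prompts in (10, 50):
    print(f"{prompts} prompts per bottle -> ~{bottle_ml / prompts:.0f} ml per prompt")
```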
Much of this water is used for evaporative cooling, but Vantage’s Santa Clara data center has large air conditioning units that cool the building without drawing any water.
Another solution is using liquid for direct-to-chip cooling.
“For many data centers, this requires a huge amount of renovation. In our case at Vantage, about six years ago, we rolled out a design that would allow us to leverage this cold water loop right here in the data room,” said Vantage’s Tench.
Companies like Apple, Samsung and Qualcomm have touted the benefits of on-device AI, which keeps power-hungry queries off the cloud and out of power-starved data centers.
“We’re going to have as much AI as these data centers can support. And it may be less than people want. But ultimately, a lot of people are working on ways to remove some of these supply constraints,” Tench said.