Most of the AI “hyperscalers”—big AI companies such as Meta, X, and others—are in a hurry to roll out data centers. They want huge amounts of electricity, as quickly as possible, and to get it they are mostly focusing on natural gas generation. If that generation had to be added to the grid, it could cause public backlash because of yet higher electricity costs and potential grid instability. And it would not happen quickly.
To avoid the delays associated with connecting to the electrical grid, the hyperscalers are planning to build lots of “behind the meter” generation (on-site generation that doesn’t interact with the grid). This is known in the industry by the acronym BYOE (“bring your own electricity”). Not only would the BYOE approach result in a huge increase in fossil fuel use (with very significant climate impacts); it is also likely to lead to early failure of the data centers.
That, at least, is the argument made by Jigar Shah, a prominent figure in the Department of Energy during the Biden Administration and one of my favorite energy analysts. He lays it out in a recent interview on the Volts podcast.
The power cycles at the heart of the problem. According to Shah, the biggest problem with BYOE is that natural gas generators are a very poor fit with the pattern of electricity use in AI data centers. The data centers have periods of maximum electricity use interspersed with periods of very light use. That results in repeated ramping up and down of generation by the gas generators.
Natural gas generators, particularly the simplest ones (based on jet engine technology), are not designed for that pattern of use. They are meant to run at full power for extended periods. If they are frequently ramped up and down, they typically fail within a few months.
You might think that batteries would solve that problem: just let the generators run, charging batteries. The batteries could then handle the start-and-stop demand of the data center. But today’s batteries (especially lithium-ion batteries) are also a poor fit. They can handle a few thousand charge-discharge cycles before they lose their storage capacity.
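The cycle-life constraint can be made concrete with back-of-the-envelope arithmetic. In the sketch below, the 3,000-cycle rating and the cycles-per-day figures are illustrative assumptions, not numbers from the interview:

```python
# Rough battery-lifetime estimate under heavy cycling.
# All numbers are illustrative assumptions, not figures from Shah's interview.

CYCLE_LIFE = 3000  # assumed charge-discharge cycles before capacity fades badly


def battery_lifetime_years(cycles_per_day: float,
                           cycle_life: int = CYCLE_LIFE) -> float:
    """Years until the battery exhausts its rated cycle life."""
    return cycle_life / (cycles_per_day * 365)


# A grid-storage battery cycled once a day lasts many years...
print(f"1 cycle/day:   {battery_lifetime_years(1):.1f} years")
# ...but a spiky AI training load that swings many times a day
# burns through the same cycle budget in well under a year.
print(f"12 cycles/day: {battery_lifetime_years(12):.1f} years")
```

Under those assumptions, a battery that would last roughly eight years in once-a-day grid service is used up in months when it has to absorb the data center's constant load swings.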
Shah argues that these approaches will provide only short-term power for data centers and will end up being a waste of capital.
Shah’s alternative. Shah does not argue that data centers should not be built, but he does have a different idea for how they should get their electricity. He notes that the current electrical grid is severely underutilized. Years ago, about 70% of the grid’s capacity was utilized; now that figure has fallen below 50%. That means there is a huge amount of grid transmission capacity available if we can make use of it.
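The utilization numbers imply a lot of headroom. A quick sketch (the total-capacity figure is an illustrative assumption; the 70% and 50% utilization figures are from Shah's argument above):

```python
# How much additional load the existing grid could absorb by returning to
# its historical utilization. The 1,200 GW capacity figure is an illustrative
# assumption; the utilization percentages come from Shah's argument.

GRID_CAPACITY_GW = 1200  # assumed total deliverable capacity


def headroom_gw(current_util: float, target_util: float,
                capacity_gw: float = GRID_CAPACITY_GW) -> float:
    """Extra load (GW) served by raising utilization without new wires."""
    return (target_util - current_util) * capacity_gw


# Moving from today's ~50% back toward the historical ~70% would free up:
print(f"{headroom_gw(0.50, 0.70):.0f} GW of headroom")
```

Even on conservative assumptions, recovering twenty percentage points of utilization represents hundreds of gigawatts of capacity that requires no new transmission construction.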
Shah has some ideas about how that could be done. Grid-enhancing technologies are available but not widely used. These include advanced conductors (capable of carrying more power while using the same towers and rights-of-way), batteries at strategic locations (capable of buffering power when there is too much and releasing it when the wires are available), and “virtual power plants” (consisting of groups of customers willing to cut back on use during peak periods).
Shah mentions Google as the only hyperscaler that is taking these ideas seriously. Google recently signed a contract with Xcel Energy in Minnesota, where Google will build a data center. Much of the electricity will come from renewables (solar, wind, and batteries) that Xcel will build at the same time. Similarly, Google has signed a deal with Fervo Energy in Nevada to develop deep geothermal power that can offset Google’s use of the grid.
What needs to change. Shah’s vision raises this question: why haven’t the ideas he proposes been implemented by other players in the industry? He lists a number of reasons. One of the biggest is the way power company incentives are structured. Regulated utilities make money by building new things: regulators permit them to earn a profit (typically 10%) on any capital investment they make, but generally offer no comparable incentive for making better use of existing equipment. Regulators (the state Public Utility Commissions and comparable agencies) need to change that incentive structure.
As the cost of electricity becomes an increasingly important political issue, politicians (particularly state governors) can increase the pressure to make better use of the existing grid rather than building new infrastructure.
States can pass legislation, as Virginia has, that requires power companies to publish data on their use of the grid. This could lead to utilization standards that would encourage companies to invest in efficiency.
Shah criticizes top tech executives (including those at AI companies): their technology people understand the issues and potential solutions, but the CEOs aren’t interested in hearing about it. Many of the “green” groups are so focused on pushing bipartisan environmental legislation that they neglect “simpler” regulatory changes. Many utility executives “get it,” but the “build more infrastructure” attitude is so deeply ingrained in their organizations that nothing changes.
Shah does think that the investment community, which is where the utilities ultimately get their capital, has a better understanding of the situation and is helping move both the utilities and the politicians in the right direction.
If Shah is right in his analysis—and I certainly hope he is—we will see the current focus on natural gas for data center power shift to one that focuses more on enhancing the existing grid for the near term. For the longer term, it is still likely that significant expansion of the grid will soon be necessary. But perhaps renewables and batteries will come along fast enough that more fossil fuel generation will not be required.
For more detail on how Shah would like to see the grid evolve in the near, medium, and long term (and the role that states could play), see his white paper on the subject: https://www.deploy-action.org/s/Grid-Affordability-State-Playbook-Final.pdf
