Written by Paul Wiseman for Industrial Info Resources
Summary
Experts discussed how the costs of data centers' heavy energy consumption should be allocated.
Anticipating and managing the U.S.'s growing energy challenges was the subject of the "Infrastructure, Reliability, and Resilience in Electricity Markets" session at the Ninth Annual Energy Summit in Houston, co-hosted by Baker Botts LLC and the Center for Energy Studies at Rice University's Baker Institute.
This article is the fourth and final in a series on the Rice Baker Institute Energy Summit. A previous story reported on the three panelists' views on investment in power infrastructure and generation, and on the load management challenges involved with wind and solar. For more information, see October 23, 2025, article - Energy Summit Explores Infrastructure Reliability, Resilience.
This installment covers proposals to require data centers to provide their own power, and how much of the financial load consumers should bear.
The panel was moderated by Michael Yuffee, partner at Baker Botts, and included Thad Hill, CEO of Calpine; Dave Berry, CEO of Cloverleaf Infrastructure; and Peter Hartley, PhD, the George A. Peterkin Professor of Economics at Rice University.
Behind the Meter?
Asking large data center builders to supply their own power with "islanded assets" has gotten plenty of press. It sounds good to consumers looking to avoid footing the bill for power plants used mainly by these power-hungry centers.
But to the panel, it's "a really bad idea," Berry said, citing cost and reliability issues. A stand-alone system is more expensive to build, and it contributes nothing to balancing the grid's demand cycles.
"Would I like to be part of a highly networked integrated grid that serves millions of customers and has thousands of resources and is operated by companies, as imperfect as they are, who've been doing it for 100 years? Or would I like to go build my own and hope for the best?"
Hill agreed: "Islanded operations are really, really hard because of the frequency and the voltage requirements that have to be almost perfect on a grid. When you have a big grid, you can balance those things out." Even on a large grid, resources must be fine-tuned constantly to match supply with demand; an islanded operation has to do that balancing on its own.
Whereas "islanded" assets are totally separate from the grid, connected only to a single end user, "co-located" assets are onsite and the data center is behind the meter. But the power from co-located assets still is available to the grid, creating the best of both worlds.
The Way of the Future?
Hill spoke about Texas's House Bill 6, which he believes has ideas that should spread nationally.
He said HB 6 does three things. "Number one, it says more or less that PUC has to kick off a rate proceeding because in ERCOT, ratepayers pay the transmission. And if you're going to start putting stuff behind the meter or in front of, there's an implication about who pays for that connection to the grid and how that cost gets shared.
"Secondly, the number of data center megawatts that are in the queue in ERCOT are twice the power demand that Nvidia is going to deliver in the next four years in chips." This makes forecasting difficult because planners, investors and regulators must be flexible in estimating what will happen.
Third is regulating who gets power in an emergency. "The bill says if ERCOT declares an emergency, that if it's behind the meter, you have to take a power plant like ours and put all the power back on the grid. And if it's in front of the meter, you have to be willing to disconnect."
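The provision boils down to a simple conditional. Below is a minimal sketch of that rule as described on the panel; the function name and return strings are illustrative, not statutory language from HB 6.

```python
# Illustrative sketch of the HB 6 emergency provision as Hill described it.
# A co-located plant (behind the meter) must put its output back on the grid;
# a grid-served (in-front-of-the-meter) data center must be willing to disconnect.

def emergency_action(behind_the_meter: bool) -> str:
    """Return the required action when ERCOT declares an emergency."""
    if behind_the_meter:
        return "return all generation to the grid"
    return "disconnect the data center load"

print(emergency_action(True))   # return all generation to the grid
print(emergency_action(False))  # disconnect the data center load
```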
"Who pays for what" is another issue, Berry said, believing that consumers will rebel at paying higher rates to help the Amazons of the world build new data centers. That resistance could inhibit the building of more data centers.
The payment question revolves around causality, "meaning if they cause transmission upgrades, they should pay their way," Berry said. Residential consumers, he added, will push back if asked to share that cost.
Among the options for dealing with this issue, Hill noted that one state is taking a different path: Ohio, where regulators over the summer said any new data center must pay for at least 85% of its highest forecasted electricity use for 12 years. That requirement is meant to keep the cost of powering the data center from falling on other ratepayers.
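As a rough illustration of how such a minimum-payment rule works, consider a facility that forecasts a 500-megawatt peak. The forecast, demand charge and usage figures below are hypothetical, not drawn from the Ohio order; the sketch only shows the arithmetic of an 85% payment floor.

```python
# Hypothetical illustration of an 85% minimum-payment rule for a new data center.
# All figures are invented for this sketch, not taken from the Ohio order.

FORECAST_PEAK_MW = 500        # data center's highest forecasted demand
MINIMUM_SHARE = 0.85          # must pay for at least 85% of that forecast
RATE_PER_MW_MONTH = 10_000    # assumed demand charge, dollars per MW per month

def monthly_bill(actual_demand_mw: float) -> float:
    """Bill the greater of actual demand or the 85% minimum commitment."""
    billed_mw = max(actual_demand_mw, MINIMUM_SHARE * FORECAST_PEAK_MW)
    return billed_mw * RATE_PER_MW_MONTH

# Even if the facility draws only 300 MW, it pays as if it drew 425 MW (85% of 500 MW).
print(monthly_bill(300.0))  # 4250000.0 -- billed at the 425 MW floor
print(monthly_bill(480.0))  # 4800000.0 -- above the floor, billed on actual demand
```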
As with everything, there is no simple answer. But grid operators across the U.S. and around the world are turning their full attention to the issue.
Key Takeaways:
- There's no easy answer to addressing data centers' heavy energy consumption
- It's not as simple as making data centers build their own power plants
- Recent actions in Texas and Ohio might prove to be guiding lights