Released March 17, 2025 | SUGAR LAND
Written by Paul Wiseman for Industrial Info Resources (Sugar Land, Texas)--Previously in this series, we examined the challenges posed by the rise of power-hungry data centers, which host everything from online shopping and bitcoin mining to artificial intelligence (AI), along with some suggested solutions. This final entry offers a peek at the future, and asks whether the same artificial intelligence that is straining the power grid can also help save it.
For starters, there are two kinds of AI: the traditional type, which includes speech recognition, predictive analytics and similar applications; and generative AI, which produces text and images but requires massive amounts of energy and hardware to run far more complex training and inference systems.
The latter is more recent, and many adopters have approached it cautiously, making sure its benefits outweigh its costs. As AI proves it can more than pay for itself, many expect this more energy-ravenous type to gain market share, which could push the grid, and the need for solutions, even further.
David Pickering, Industrial Info's vice president of research, industrial manufacturing, said, "We are tracking (US) $1.2 trillion globally (all active projects), with $823 billion of that in the U.S. We are also tracking 98 GW (gigawatts) of PEUC (planning, engineering, under construction) capacity in the U.S. alone."
In 2023, Industrial Info was tracking just $35 billion in data center projects in the U.S. So is there any way to mitigate the rise in energy demand? There are several ways, starting with the computer chips themselves.
More efficient chips are one option. A World Economic Forum (WEF) report from July 2024 quoted chipmaker NVIDIA (NASDAQ:NVDA) (Santa Clara, California) as saying a new superchip it was developing could deliver 30 times faster performance while using 25 times less energy.
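To get a sense of what those two figures imply together, consider a quick back-of-the-envelope calculation. This is a sketch only, and our reading of "25 times less energy" as one twenty-fifth the energy per task is an assumption:

```python
# Back-of-the-envelope check of the quoted NVIDIA claim. Our reading
# (an assumption): "25 times less energy" means 1/25 the energy for
# the same workload, so the implied gain in work done per unit of
# energy is 30 x 25 = 750x.
speedup = 30            # "30 times faster performance" (quoted)
energy_per_task = 1/25  # assumed reading of "25 times less energy"
work_per_energy_gain = speedup / energy_per_task
print(f"Implied work-per-energy gain: {work_per_energy_gain:.0f}x")  # 750x
```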
Because cooling the chips accounts for 40% of a typical data center's cost, many operators are looking for ways to reduce that load. One approach is liquid cooling, which covers the data-center equipment with a nonconductive, noncorrosive liquid that absorbs and removes heat. While such systems are energy efficient, they are expensive to operate and very difficult to retrofit into an existing facility.
Direct-to-chip cooling is another possibility: a cold plate, with a specially engineered dielectric fluid flowing inside it, is placed in direct contact with central processing units (CPUs). Because it cools only the heat-generating elements, it can do the job faster and with less energy.
The drawbacks: like liquid cooling, it is very expensive; heat can build up in uncooled areas; the fluid can leak; and typical fluid formulas are not environmentally friendly.
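To see why cooling upgrades draw so much attention despite those drawbacks, here is a rough illustration. The 40% figure comes from the article above; the 50% reduction in cooling load is a hypothetical placeholder, not a vendor claim:

```python
# Illustrative arithmetic only: how a cut in cooling load flows through
# to total facility cost, assuming cooling is 40% of the total (per the
# article) and a hypothetical 50% cooling-efficiency gain.
cooling_share = 0.40        # cooling's share of total cost (article figure)
cooling_reduction = 0.50    # hypothetical reduction from a cooling upgrade
facility_savings = cooling_share * cooling_reduction
print(f"Total cost reduced by about {facility_savings:.0%}")  # about 20%
```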
There is also the issue of "dark data": information that is used once and then never again, but that can hog storage space, and the energy that goes with it, indefinitely unless it is deleted. Cleaning it up, however, would only reduce the load tied to traditional AI, not the generative version that requires the most power.
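As a concrete illustration of how dark data might be hunted down, here is a minimal sketch that flags files untouched for a year as candidates for archival or deletion. The storage path and one-year threshold are hypothetical, and the filesystem must track access times for this to be meaningful:

```python
# A minimal sketch: flag files whose last-access time is older than a
# cutoff as "dark data" candidates. Path and threshold are hypothetical.
import os
import time

ROOT = "/data"           # hypothetical storage root
THRESHOLD_DAYS = 365     # hypothetical staleness threshold

cutoff = time.time() - THRESHOLD_DAYS * 86400
for dirpath, _, filenames in os.walk(ROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)
        try:
            if os.stat(path).st_atime < cutoff:  # last access time
                print("dark-data candidate:", path)
        except OSError:
            continue  # unreadable or vanished file; skip it
```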
How about AI itself? Can its ability to manage massive datasets be put to good use? The WEF report quoted research firm Boston Consulting Group (BCG) as suggesting that AI assistance could reduce greenhouse gas emissions 5-10% by 2030. And in a 2023 paper titled "Why AI and energy are the new power couple," the International Energy Agency (IEA) said AI is already helping by improving supply and demand predictions: it can build data models that reduce the unpredictability of wind power while also modeling demand.
For example, "Google and its AI subsidiary DeepMind developed a neural network in 2019 to increase the accuracy of forecasts for its 700-MW (megawatt) renewable fleet. Based on historical data, the network developed a model to predict future output up to 36 hours in advance with much greater accuracy than was previously possible," said the IEA paper, which added that similar predictive models can be used for the grid as a whole.
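For readers curious what such a forecaster looks like in code, here is a minimal sketch, not DeepMind's actual model: it fits a simple least-squares predictor on synthetic hourly output and forecasts 36 hours ahead. The 72-hour lookback window and all of the data are illustrative assumptions:

```python
# A minimal forecasting sketch (synthetic data; not DeepMind's model):
# learn a mapping from the last 72 hours of fleet output to the output
# 36 hours later, using plain least squares in place of a neural net.
import numpy as np

HOURS_AHEAD = 36   # horizon cited in the IEA example
LOOKBACK = 72      # hypothetical history window per prediction

rng = np.random.default_rng(0)
t = np.arange(24 * 365)
# Synthetic hourly output (MW) for a ~700-MW fleet: daily cycle + noise.
output = np.clip(350 + 250 * np.sin(2 * np.pi * t / 24)
                 + rng.normal(0, 40, t.size), 0, 700)

# Training pairs: (72-hour history window) -> (output 36 hours later).
X = np.array([output[i:i + LOOKBACK]
              for i in range(t.size - LOOKBACK - HOURS_AHEAD)])
y = output[LOOKBACK + HOURS_AHEAD:]

coef, *_ = np.linalg.lstsq(X, y, rcond=None)  # fit the linear model

print(f"Forecast {HOURS_AHEAD} h ahead: {output[-LOOKBACK:] @ coef:.0f} MW")
```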
Every day brings stories about AI improving oil production, manufacturing, design and some aspect of almost every phase of modern life, leading many to surmise that AI's strain on the grid is more than worth the challenges it presents. But that still leaves grid operators and governments with the challenge of keeping up with demand growth for the foreseeable future.
Industrial Info Resources (IIR) is the leading provider of industrial market intelligence. Since 1983, IIR has provided comprehensive research, news and analysis on the industrial process, manufacturing and energy-related industries. IIR's Global Market Intelligence (GMI) platform helps companies identify and pursue trends across multiple markets with access to real, qualified and validated plant and project opportunities. Across the world, IIR is tracking more than 200,000 current and future projects worth $17.8 trillion (USD).