
Taking on Generative AI’s Green Energy Dilemma

By Bo Yang, Ph.D., Vice President, Energy Solutions Lab, R&D Division, Hitachi America, Ltd.

The advent of GenAI is part of a compelling narrative of innovation and progress. However, the global proliferation of data centers driven by the advances in the AI industry also puts new demands on grid operation and capacity expansion. Consider the following:

  • AI data center racks are estimated to require seven times more power than traditional data center racks.1
  • Goldman Sachs estimates a 160% increase in power demand driven by AI applications by the end of this decade.2
  • At 2.9 watt-hours per ChatGPT request, AI queries are estimated to require roughly ten times the electricity of traditional Google queries, which use about 0.3 watt-hours apiece.3
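As a rough back-of-envelope check of the per-query figures above (the watt-hour numbers are the cited estimates, and the one-billion-queries-per-day volume is purely an illustrative assumption, not a figure from any source):

```python
# Back-of-envelope comparison using the cited estimates:
# ~2.9 Wh per ChatGPT request vs. ~0.3 Wh per Google query.
CHATGPT_WH_PER_QUERY = 2.9  # watt-hours (EPRI estimate)
GOOGLE_WH_PER_QUERY = 0.3   # watt-hours (cited estimate)

ratio = CHATGPT_WH_PER_QUERY / GOOGLE_WH_PER_QUERY
print(f"An AI query uses ~{ratio:.1f}x the energy of a traditional search")

# Scaled to a hypothetical 1 billion queries per day (illustrative only):
queries_per_day = 1_000_000_000
daily_mwh = CHATGPT_WH_PER_QUERY * queries_per_day / 1e6  # Wh -> MWh
print(f"1B AI queries/day ~ {daily_mwh:,.0f} MWh of electricity")
```

At these estimates, the ratio works out to about 9.7x, consistent with the "ten times" figure, and a billion AI queries a day would draw on the order of 2,900 MWh.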

As companies seek to make their GenAI models larger, they will consume even more energy for both training and inference. To meet that demand, the energy industry will need solutions that help it cope with these unprecedented challenges.

Challenges of Traditional Energy Resource Planning

This growth trajectory underscores the need for greater collaboration among regulators, utilities, and industry stakeholders to address future energy demands while ensuring that the energy we generate is clean, sustainable and affordable. It will require a framework that transcends the limitations of traditional resource planning methodologies.

Grid investment, driven by energy resource planning, is usually a drawn-out process subject to regulatory approval. When power companies map out how to handle expected load increases, they typically add wire and equipment capacity designed to serve customer needs decades into the future.

This regulated approach worked well in previous eras. However, it may not address energy equity and affordability effectively for today's end-use customers. Modern data centers developed by tech companies are often hyperscale, meaning their energy demand is highly concentrated, and they are integrated into the grid through transmission networks. Expansion projects on transmission grids are inherently more expensive. Under traditional resource planning methods, these costs driven by data center loads are passed down to end users, including disadvantaged communities.

Better digital decision support tools can help identify cheaper, better non-wire alternatives that avoid or defer the need for transmission line expansion. The tools grid planners currently use offer little help here. One of our customers recounted how their current tool took more than two weeks to run a single round of evaluation, and it typically takes several rounds of study to understand how new data centers will impact grids and what mitigation measures might help. This inevitably prolongs permitting time and delays grid projects; that's not a sustainable model of efficiency in this burgeoning era of AI.

New Power Demands

There’s no escaping the fact that GenAI is going to demand more energy, given the trend toward ever-larger training models. In a recent report, Morgan Stanley Research estimated that GenAI’s power demand will soar 70% annually.4
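To see what a 70% annual growth rate implies, a minimal compounding sketch (the five-year horizon is an illustrative assumption, not a figure from the Morgan Stanley report; today's demand is normalized to 1.0):

```python
# Compounding the estimated ~70% annual growth in GenAI power demand.
# Starting demand is normalized to 1.0; five years is an assumed horizon.
growth_rate = 0.70
demand = 1.0
for year in range(1, 6):
    demand *= 1 + growth_rate
    print(f"Year {year}: {demand:.1f}x today's GenAI power demand")
```

Compounded over five years, that rate would multiply GenAI power demand roughly fourteenfold, which is why even near-term efficiency gains matter so much.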

In the short run, this rapid surge in power demand will stress the grid, so the near-term challenge will be to drive greater energy efficiency. Data centers built by cloud-based and hyperscale operators for AI training fare somewhat better on energy efficiency, typically needing only about 20% more energy to run their operations, but their total energy consumption is higher because of the growing computational demands of GenAI.

At the same time, GenAI’s environmental impact adds further impetus to become more energy-efficient. Even before GenAI, data centers, which mostly rely on fossil fuels, accounted for 2.5 to 3.7 percent of global greenhouse gas emissions,5 higher than the aviation industry. The daily carbon footprint of running GPT-3 is estimated to add up to roughly 8.4 tons of CO2 over a year.6 Some industry estimates suggest that GPT-4, OpenAI's latest version of its Large Language Model, has 1.76 trillion parameters,7 which would make it ten times bigger than GPT-3 with its 175 billion parameters.8

Innovative Solutions for Resource Planning

Hitachi is at the forefront of integrating renewable energy sources into the grid, developing solutions for energy storage, grid stabilization, and smart grid technologies that accommodate the fluctuating nature of renewable energy sources. These innovations are crucial in meeting the growing energy demands of modern data centers and ensuring a sustainable energy future.

While it’s true that smaller data centers often require significantly more energy beyond just their IT infrastructure needs to support the entirety of their operations, we are revolutionizing energy efficiency in this space. Through cutting-edge innovations, we are not just optimizing energy use but significantly reducing overall power consumption, setting new standards for sustainable data center operations.

Our expertise in energy storage systems also plays a vital role in this effort. By developing and deploying efficient energy storage solutions, we enable the capture and storage of excess renewable energy. This stored energy can be utilized during peak demand periods, ensuring a consistent and reliable power supply. Additionally, our smart grid technologies facilitate the seamless integration of distributed energy resources, enhancing grid resilience and supporting the transition to a more sustainable energy infrastructure.

Hitachi America, Ltd. R&D also collaborated with Hitachi Energy, the California Energy Commission, and the SLAC National Accelerator Laboratory, a federally funded R&D facility operated by Stanford University, to design a digital tool called GLOW to answer that need. This digital platform simulates the interplay of the components of an electrical distribution system, presenting complex data in an easy-to-understand, highly visual format. GLOW’s intuitive user interface allows users, without extensive training, to understand how to plan for and manage new distributed energy resources. As a cloud-based decision support platform, GLOW makes it easy for grid planners and other stakeholders to evaluate the impacts of growing data center loads on distribution grids, identify beneficial mitigation measures, and design affordable rate tariffs for end users, especially disadvantaged communities.

Different stakeholders can now cross organizational boundaries and make data-backed decisions together. This fosters more effective collaboration, allowing data to be exchanged safely among organizations. The power companies, who do most of the heavy lifting in this process, can then gather the data they need to support more effective decision-making.

This can’t happen soon enough.

Bo Yang, Ph.D.
Vice President, Energy Solutions Lab, R&D Division, Hitachi America Ltd.

Bo Yang leads the energy solutions team at Hitachi America and is a pioneer in adopting AI/ML techniques for energy applications. Her team has developed several innovative AI and IoT platforms. Bo also represents Hitachi in projects funded by federal and state agencies. She has extensive academic and professional experience in Distributed Energy Resource integration and control, Distribution Automation, Smart Grid, AI/ML, and enterprise system architecture.

Bo received her Ph.D. in Electrical Engineering from Arizona State University. She is a Fellow of the Institution of Engineering and Technology (IET).

1. "AI and data center growth equal power demand," American Nuclear Society, https://www.ans.org/news/article-5872/ai-and-data-center-growth-equal-power-demand/
2. "AI/data centers' global power surge and the Sustainability impact," The Goldman Sachs Group, Inc., https://www.goldmansachs.com/insights/articles/AI-poised-to-drive-160-increase-in-power-demand
3. "Powering Intelligence: Analyzing Artificial Intelligence and Data Center Energy Consumption," EPRI, https://www.wpr.org/wp-content/uploads/2024/06/3002028905_Powering-Intelligence_-Analyzing-Artificial-Intelligence-and-Data-Center-Energy-Consumption.pdf
4. "Powering the AI Revolution," Morgan Stanley, https://www.morganstanley.com/ideas/ai-energy-demand-infrastructure
5. "Carbon Footprint of Data Centers & Data Storage Per Country (Calculator)," 8 Billion Trees, https://8billiontrees.com/carbon-offsets-credits/carbon-ecological-footprint-calculators/carbon-footprint-of-data-centers/#:~:text=Data%20centers%20account%20for%202.5,that%20fuel%20the%20global%20economy
6. "AI's Growing Carbon Footprint," Columbia Climate School, https://news.climate.columbia.edu/2023/06/09/ais-growing-carbon-footprint/
7. "GPT-4 has more than a trillion parameters - Report," The Decoder, https://the-decoder.com/gpt-4-has-a-trillion-parameters/#:~:text=Further%20details%20on%20GPT-4's,Mixture%20of%20Experts%20(MoE).
8. "AI's Growing Carbon Footprint," Columbia Climate School, https://news.climate.columbia.edu/2023/06/09/ais-growing-carbon-footprint/