In 2024, the San Francisco Fed and the Federal Reserve System Innovation Office launched the EmergingTech Economic Research Network (EERN) to support a better understanding of how new technologies like GenAI are shaping the economies of today and the future. As part of the EERN initiative, we regularly hold roundtable discussions with industries and sectors at the front lines of digital adoption to explore questions such as: How are artificial intelligence (AI) tools being used? Is AI adoption still in the early stages, or has it been fully integrated in certain areas? What are the expected or realized impacts on firm productivity and employment?
To address these and other questions about AI’s impact on the energy sector, we recently convened a roundtable of executives within the Twelfth District. These industry leaders from across the energy ecosystem gathered to discuss AI’s impact on energy demand and the associated infrastructure challenges. The group of executives, representing utilities, digital infrastructure providers, semiconductor manufacturing, and AI computing, shared valuable perspectives on the changing energy landscape.
The executives met with Mary C. Daly, President and CEO of the San Francisco Fed; Sunayna Tuteja, Senior Vice President and System Chief Innovation Officer for the Federal Reserve System; Louise Willard, Executive Vice President and San Francisco Fed Chief Information Officer; and Kevin Ortiz, San Francisco Fed Deputy Chief of Staff and co-head of EERN.
AI’s Transformative Impact on Energy Demand
The roundtable began with a discussion of AI’s early adoption patterns by organizations across geographies, industries, and sizes, and the corresponding need for data center infrastructure to meet this growing demand. With data center scaling, however, come both opportunities and challenges arising from energy availability.
Participants characterized AI and data center scaling as driving one of the most substantial growth opportunities for the electric utility sector. While acknowledging that some interest may be speculative, roundtable members referenced current projections suggesting that data centers use approximately 3% of available energy today and could reach 8% by 2030. They noted that some regions are already on this trajectory, with usage rising from less than 1% to a projected 5% by 2027.
A utility executive noted that data centers have a relatively flat energy consumption pattern, drawing consistent amounts of energy, unlike residential or business customers whose usage fluctuates. Data centers also require 24/7 electricity availability, which, they said, is causing electric utilities to rethink the energy sources and infrastructure needed to meet growing demand.
Another roundtable participant outlined how we are now in the third wave of AI development: first came perceptive AI, then generative AI, and now reasoning and agentic AI. They projected a fourth wave of physical AI applications, such as robotics at scale and self-driving cars, meaning every industry will eventually need its own “intelligence factory,” driving energy demand even higher.
The roundtable member noted that the rise of physical AI could push growth in energy needs beyond 8% to somewhere between 10% and 15% by 2030.
Infrastructure Challenges
Industry participants observed that utilities normally experience about 1% load growth annually, with some even seeing negative growth due to energy efficiencies. However, they shared that some utilities are now projecting 8% annual growth, essentially 40 years of typical growth compressed into just 5 years.
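To see why the two paths are roughly equivalent, a simple back-of-the-envelope compounding calculation (ours, not a figure cited at the roundtable) shows that load growing 1% per year for 40 years and load growing 8% per year for 5 years both end up roughly 50% above where they started:

$$(1 + 0.01)^{40} \approx 1.49 \qquad \text{versus} \qquad (1 + 0.08)^{5} \approx 1.47$$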
According to the roundtable participants, this acceleration creates multiple infrastructure challenges:
- Supply Chain Bottlenecks: Critical equipment like substations, transmission resources, and transformers face significant delays.
- Generation Capacity: Natural gas turbines, nuclear power plants, and other possible energy sources can take many years to build, creating a mismatch between immediate demand and available supply.
- Permitting Delays: Regulatory processes for new infrastructure can be lengthy.
- Cost Allocation: Who pays for infrastructure upgrades remains unsettled. Some mentioned that certain jurisdictions require data centers to pay their own way, especially when infrastructure costs are primarily for their benefit.
- Workforce Gaps: The shortage of electricians and technicians is becoming acute as demand for AI infrastructure grows. One participant stated, “We don’t have enough skilled workers to build these AI centers fast enough,” which makes workforce training a critical long-term issue.
Potential Solutions
The industry leaders discussed several approaches to address these challenges:
- Nuclear Power: While some saw nuclear energy as promising, particularly small modular reactor (SMR) technology, they acknowledged significant uncertainty around costs and timelines. One participant highlighted thermal spectrum molten salt breeder reactors (TS-MSBRs) fueled by thorium as a particularly promising option under investigation, with potential systems co-located with data centers. The participant noted this would eliminate transmission losses and grid dependency while providing the 24/7 base-load power AI operations require.
- Energy Efficiency: A storage provider highlighted their focus on reducing energy consumption, with recent products achieving 40% greater energy efficiency. However, another participant cautioned that “all efficiency gains slow the price increases, but don’t offset surging demand.” They noted that GPU (graphics processing unit) technologies are improving by at least 4 to 5 times each year, contributing to efficiency gains.
- Advanced Cooling: Experts mentioned that liquid cooling systems using materials like ethylene glycol (a coolant also used in cars) can help improve power usage effectiveness (PUE), the ratio of a data center’s total energy use to the energy delivered to its computing equipment.
- Collaborative Approaches: One executive shared a success story in which three regional utilities collaborated to bring new transmission capacity online within five years.
- Utilizing Existing Resources: A digital infrastructure representative suggested tapping into backup power sources that aren’t being fully utilized while new energy technologies develop.
- Vertical Integration: A digital infrastructure expert noted that projects are increasingly “going vertical,” with companies building “gigawatt campuses” that pack greater energy density onto the same acreage, potentially allowing more efficient resource utilization.
Looking Forward
The industry experts concluded by identifying the biggest constraints to innovation: energy costs, supply chain limitations, lengthy permitting processes, and regulatory hurdles. Participants emphasized the need for an energy strategy that bridges near-term demands with longer-term solutions. They noted this includes deploying and expanding existing energy sources, optimizing existing grid infrastructure to handle sustained AI loads rather than just peak cycles, and preparing for longer-term developments in nuclear power and fuel cycle maturity.
The roundtable made clear that while the AI revolution presents extraordinary opportunities, it also creates unprecedented demands on energy infrastructure. One participant projected demand for an additional 100 gigawatts in the United States by 2030, meaning the pressure to develop solutions quickly will only intensify.
The views expressed here do not necessarily reflect the views of the management of the Federal Reserve Bank of San Francisco or of the Board of Governors of the Federal Reserve System.



