Nick Loris, Senior Policy Fellow, National Taxpayers Union
With Contributions from:
Travis Fisher, Director of Energy and Environmental Policy Studies, Cato Institute
Josh Smith, Senior Fellow, Pacific Legal Foundation
Dr. Katherine Wright, Hillsdale College
and Pete Sepp, President, National Taxpayers Union
Executive Summary
I. Why Are Electricity Prices Going Up?
- Rising U.S. electricity rates, driven by several complex, interconnected factors including inefficient capital expenditure and flawed public policy, have heightened households’ perceived energy insecurity, even where actual energy insecurity is less widespread.
- While large-scale data center demand would seem to strain the grid, these developments can reduce residential rates by spreading fixed utility costs across a larger consumer base, and underwrite significant upgrades and investment to the shared system that benefit other users.
- To maximize benefits, policies should ensure accurate, neutral, and fair cost allocation so that data center developers pay for the infrastructure their projects require.
II. The Need for Permitting Reform
- The current permitting system causes significant delays and cancellations, resulting in higher energy costs and reduced reliability for consumers.
- Effective reform requires comprehensive, bipartisan action to streamline reviews and curb the abuse of legal challenges used to block projects.
- Reforms must provide the certainty that protects the viability of projects before, during, and after the permitting process.
- Reform must reflect the urgency of accelerating the planning and deployment of modernized transmission infrastructure to meet load growth, along with the economic and national security imperatives that accompany it.
III. Increasing Competition and Aligning Incentives
- State policy should prioritize wholesale and retail competition over monopoly structures to enhance grid reliability and affordability.
- States should pursue reforms to address regulatory bottlenecks, eliminate costly market distortions, and increase flexibility.
- Coordinated load forecasting can reduce the risk of costly overbuilds, prevent reliability shortfalls, and give investors the confidence to deploy capital where it is actually needed.
IV. Unlocking Flexibility: Data Centers as a Grid Resource
- Data centers can employ a diverse portfolio of operational and technical strategies to provide valuable services to the electric grid.
- There are significant technical and regulatory obstacles to unlocking the full potential of large-load flexibility.
- In the face of significant expansion of demand, flexibility options are an important tool in the toolbox, but not a silver bullet.
V. Thinking Outside the Grid: The Consumer Regulated Electricity Alternative
- Policymakers should come to terms with this stark reality: meeting the demands of the data center era requires thinking outside the legacy grid.
- Consumer Regulated Electricity would allow privately financed, physically islanded electric utilities to serve new, voluntary customers such as data centers or industrial facilities.
- Instead of competing through subsidies or narrow tax provisions, states can allow private infrastructure to form freely, letting developers finance, build, and operate their own islanded power systems without drawing on public funds.
VI. AI, Water Rights, and an Opportunity for Property Rights
- Building data centers in water-scarce areas may raise public concern, but there is no legal mechanism for diverting water without first obtaining a water right.
- The price of water in the West reflects the cost of delivery, not the scarcity of water. Stronger price signals would better incentivize consumers to reduce water use.
- By focusing on lowering the cost of water transfer, politicians can avoid writing policy carveouts that might be specific to a time and place but become irrelevant in the future.
Foreword
Pete Sepp
President, National Taxpayers Union
Should taxpayers love or loathe data centers? This question seems a loaded one in the first place. To ask it is to present a binary choice when, in reality, public opinion on the matter is more complex. Initially, a handful of fiscal conservatives branded the facilities that provide the infrastructure of the next Information Age as little better than taxpayer-funded sports stadiums or convention centers, which have proven to be economic and fiscal losers for their host communities relative to the government subsidies they receive.1 Yet, aside from the fact that data centers, sports stadiums, and convention centers are physical structures, they have little to nothing in common.
Sports stadiums and convention centers have always been flashy monuments to “team spirit” or “civic pride,” which only some thousands of fans or attendees may occupy. Data centers house the internet-based economy and society, serving hundreds of millions of Americans, as well as billions of others worldwide. The few local jobs that are created from sporting events and conventions, primarily low-paying service-sector positions, often come at the expense of established small businesses that are displaced by the new construction. Although measurements for data center jobs vary, and are often debated among reasonable economists, the trend is robust and positive.2 Data centers entail not only well-paying construction jobs, but support in-person and remote positions all over the country and throughout the supply chain. Stadiums and convention centers entail massive road building, sewer connections, land giveaways, and other concessions that cost taxpayers dearly, while, for the most part, data centers demand less from local governments. The net government revenue from stadiums and convention centers is generally offset by these outlays, whereas many data centers can boast of lifting the tax bases where they are located to healthy levels. And, as this paper will show, states with large electricity load growth in recent years have actually seen a reduction in total all-sector electricity prices.
Why, then, does public suspicion of data centers persist? One answer is that the “persistence” is not all that deep. The more people understand data centers, the less they tend to reflexively oppose them. When data centers are demystified and put in the context of their role in the overall economy as well as their potential for the local economy, large majorities feel more comfortable with these centers near their own neighborhoods.3 But polling is just one aspect of how the public reacts to data centers. Some states have begun reevaluating their tax treatment of data centers as if public officials’ initial policies were somehow “giveaways.”4 In many cases, however, their original instincts were correct, and they were merely extending commonsense provisions that already apply to a wide range of businesses.5 Within a given state’s boundaries, the reaction of communities can be as different as night and day. In Missouri, for example, various communities have either slowed the development of data centers or encouraged them through local policies.6
But the debate over data center development has escalated all the way up to the federal level as well. In his State of the Union address, President Trump stated that his Administration had been seeking commitments from technology companies to “pay their own way” in data center development. Whether the implication—that somehow such companies weren’t doing so prior to his speech—is true or not, subsequent events have shown the tech sector’s willingness to have a much more substantive fiscal policy conversation over data centers than sports team owners or convention promoters ever demonstrated over their respective projects. Companies such as Google, Amazon, and Meta have committed to the Administration’s five-part “Ratepayer Protection Pledge” that stresses building, connecting, or purchasing new power, underwriting infrastructure improvements, negotiating no-strings-attached electricity agreements with governments, investing in workforces, and making resilient grids for everyone a priority.7
A future series of NTU papers will discuss in depth the tax policies that can support the AI revolution and the data centers underlying it, without resorting to the narrowly-crafted deductions, exemptions, and outright subsidies that have characterized public officials’ responses to previous economic trends. Clearly, however, some principles are worth emulating from the outset:
- States should conform their corporate and small business tax treatment of investments to the federal model. The July 2025 federal tax law, which made full and immediate expensing as well as R&D cost write-offs permanent, along with new expensing provisions for structures, should be mirrored at the state level. The 20-plus states that don’t provide for this same treatment in their laws should make this easy change, which treats all business activity, including data centers, the same.8
- Existing, sensible state and local tax exemptions for business-to-business transactions and inputs can and should be expanded to data center development, which has often been described as analogous to putting up an office complex, warehouse, or industrial park.
- As data centers dramatically increase the business sector’s contributions to local property tax bases, care should be taken to ensure that local governments steward, rather than squander, the revenue windfalls they receive. NTU-backed taxpayer protections, such as revenue growth caps that can only be exceeded with voter approval, and transparency reforms such as Truth in Taxation laws, are essential elements of this new environment.9
This paper, however, focuses more on the energy and water policy aspects surrounding data centers, which, by themselves, require a great deal of thoughtful deliberation on the part of federal, state, and local officials. Responses have already arisen from voices on the Left and the Right, calling for various restrictions on data centers.10 Motivated by a lack of understanding about how the future Internet will work for all of us, or false narratives about data centers and energy prices, or hidden wealth-redistribution agendas, or even simple suspicion of anything “new,” the proposals proffered by politicians across the spectrum to take America out of the AI race are not only dangerous, they could be deadly. Losing our technological edge to foreign competitors could mean everything from missing out on lifesaving health care innovations to falling behind in cost-effective national defense systems.
The biggest, and most concerning, among these competitors is the People’s Republic of China. Although the U.S. and the rest of the free world continue to outpace China in terms of investment in AI,11 China is surging dramatically in the everyday use of AI and the development of AI-based applications. National and economic security deserve federal leadership and guidance, but also need the uniquely American “bottom-up” approach that has always characterized our private-sector-driven economy. Those qualities are already manifesting themselves in the freedom-based policy solutions outlined here that will allow all of us—community by community, idea by idea, innovator by innovator—to be winners in the next human age of discovery.
As the following sections of this paper suggest, there are better responses to how data centers evolve than backward-looking bans or “crackdowns.” As I write this, permitting reform, consumer-regulated electricity, privately-financed electricity generation, advanced transmission technologies, and new approaches to water rights are all constructive answers to legitimate questions about how the data center-driven AI revolution will fit in our economy going forward.
With this paper, a host of experts gathered here aim to provide public officials and the taxpayers they serve with the beginnings of a toolkit that can construct, maintain, and occasionally repair data center policy for the here and now as well as the future. The ingenuity of Americans, imbued with an entrepreneurial spirit and operating in a tax, budgetary, and regulatory climate that allows them to thrive, has always been our greatest national asset. Let us invest in this asset, and allow the dividends to grow for this generation and those to come!
Introduction
The global economy is in the early stages of a profound energy boom, driven by the immense computational demands of artificial intelligence (AI) and the rapid expansion of data centers. We are merely scratching the surface of how AI can make our lives easier, healthier, and happier. By accelerating breakthroughs in medicine, energy, transportation, and scientific discovery, AI has the potential to make the world a more prosperous and cleaner place in unimaginable ways.
Yet public support for data centers is decidedly mixed. A recent POLITICO poll found that more voters (37%) would support building a data center within three miles of their home than oppose it (28%).12 Another 28% neither support nor oppose. Unsurprisingly, jobs were the biggest reason for support, and rising electricity costs were the biggest reason for opposition. Concerns about water use are also being raised with growing frequency. Politicians at the federal and state levels are increasingly hearing from constituents about cost concerns, prompting legislatures to hold hearings and introduce legislation to protect ratepayers, including calls for data center moratoriums. Other states have proposed tax incentives to attract data centers.
There is another policy path forward that protects American taxpayers and ratepayers while enabling economic growth and increasing energy supply to meet rising demand: relieving regulatory bottlenecks that stifle energy infrastructure development and providing the certainty and flexibility for innovative technologies to help meet America’s energy needs. Tax laws that properly treat investment can likewise support innovation, whatever direction it may take. Policies that are sector- and company-neutral, and permit markets to function efficiently will best provide the affordable, dependable power that American families rely on and the U.S. economy runs on.
Section I. Why Are Electricity Prices Going Up?
Key Points
- Rising U.S. electricity rates, driven by several complex, interconnected factors including inefficient capital expenditure and flawed public policy, have heightened households’ perceived energy insecurity, even where actual energy insecurity is less widespread.
- While large-scale data center demand would seem to strain the grid, these developments can reduce residential rates by spreading fixed utility costs across a larger consumer base.
- To maximize benefits, policies should ensure accurate, neutral, and fair cost allocation so that data center developers pay for the infrastructure their projects require.
In many regions of the country, ratepayers are seeing sustained increases that outpace wage growth and add to broader cost‑of‑living pressures. Higher electric bills have increased the number of households facing overdue bills and power shutoffs.13 An estimated 14 million Americans had utility debt reported to collection agencies last year, and the average utility debt rose to nearly $800, an all-time high.14 U.S. Energy Information Administration data indicate that, in 2020, approximately one quarter of households experienced energy insecurity.15 Energy insecurity is characterized by reducing or forgoing essential needs, such as groceries or medical care, by keeping residences at uncomfortable or unsafe temperatures, or by receiving a utility disconnection notice.
The rate of increase in electricity prices varies by region, and the reasons for their escalation are numerous, complex, and sometimes difficult to isolate. Supply and demand affect wholesale and retail prices, albeit not always in the ways shifts and movements in supply-and-demand curves would suggest. Exploring how federal and state policies influence both supply and demand will inform what reforms are necessary to unlock a more resilient, responsive grid.
A Look at Demand
For the last two decades, electricity demand has been relatively flat, and efficiency improvements have kept prices in check.16 In recent years, the rapid expansion of electricity-intensive hyperscale data centers has significantly increased electricity demand. Growing electrification of the economy, which federal and state governments have nudged consumers and businesses toward through subsidies and regulations, has also boosted power consumption. S&P Global forecasts show electricity use increasing by roughly 5.7% per year over the next five years, driven in part by large industrial and commercial loads.17 The same forecast estimates that total electricity consumption will increase by 32% by 2030.18
The scale of this new demand could be staggering. In Texas’s market alone, requests from large consumers to connect to the grid totaled 226 gigawatts (GW) by the end of 2025, more than triple the 63 GW of requests in 2024. Of those requests, 7 out of every 10 are for data centers. Much of the Texas load is speculative and will not materialize. Even a more circumspect review still suggests that about 160 GW of new load is coming to the U.S.19
In some instances, demand from industrial and commercial sectors has put downward pressure on inflation-adjusted retail electricity rates. When more consumers pay for the grid’s fixed costs, those costs are spread across a larger rate base, which can lower average rates. As a recent Lawrence Berkeley National Laboratory study notes, from 2019–2025, the states with the largest load growth saw price declines.20 However, the report emphasizes that this effect will not always hold. If new demand outpaces constrained supply, forces greater capital expenditures, or allows large industrial users to avoid bearing their share of transmission or local distribution costs, prices will face upward pressure.
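The cost-spreading mechanism described above can be illustrated with a toy cost-of-service calculation. The dollar figures and sales volumes below are hypothetical, chosen only to show the direction of the effect, not to model any actual utility:

```python
# Toy cost-of-service sketch: if the grid's fixed costs are recovered
# across total sales, adding large new loads lowers the average rate,
# provided the fixed costs themselves do not grow in proportion.
def average_rate(fixed_costs, variable_cost_per_mwh, total_sales_mwh):
    """Average all-in rate in $/MWh under simple cost-of-service recovery."""
    return fixed_costs / total_sales_mwh + variable_cost_per_mwh

# Hypothetical numbers: $1 billion in fixed network costs, $30/MWh energy cost.
before = average_rate(1_000_000_000, 30, 20_000_000)  # 20 TWh of annual sales
after = average_rate(1_000_000_000, 30, 25_000_000)   # +5 TWh of data center load
print(before, after)  # 80.0 then 70.0 $/MWh
```

The same arithmetic cuts the other way: if serving the new load requires fixed investment that grows faster than sales, the average rate rises instead, which is why the cost-allocation questions discussed here matter.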
The accuracy of load growth forecasts will be essential to protecting ratepayers. Utilities have historically over-forecast load growth, and, even with expected electricity growth, Grid Strategies projects that over-forecast load attributable to large cloud providers, also called hyperscalers, could reach 25 gigawatts, enough power for 20 million Americans, roughly the demand of several New York Cities. The report emphasizes that “utility forecast practices need improvement to reflect better the probability of projects completing, their total loads, supply constraints, or timing of load growth.”21 While under-forecasting risks failing to have the power needed for data center expansion, over-forecasting risks saddling ratepayers with an overbuilt system.
A Look at Supply
Economics 101 suggests that increasing generation and distribution would supply more electrons to the grid, helping lower prices. However, current rules and regulations have pushed rate-base growth that inflates bills without meaningfully improving grid capacity. The current regulatory system often rewards inefficient spending and has kinked the hose at several points, impeding more efficient, cost-effective generation and transmission buildout. The result has been higher rates and a failure to address current supply constraints.
Because regulated utilities operate under traditional cost‑of‑service regulation, they recover costs plus a regulated return. Therefore, they have a “spend money, make money” business model.22 When regulators approve large rate‑base investments, such as system upgrades, new poles and wires, meeting renewable energy targets, and hardening the grid, ratepayers foot the bill. While spending is necessary to ensure grid reliability and meet rising demand, policy and regulations should ensure reliability at the lowest possible cost, which has been far from the case. Furthermore, policy and regulations often disincentivize the most efficient investments or cost-saving upgrades.
One of the most important drivers of higher rates in recent years is the surge in utility capital expenditures, particularly for transmission and distribution. Transmission delivers power from power plants to local substations via high-voltage transmission lines, and distribution is the local network of poles and wires that delivers electricity to homes and businesses through lower-voltage lines. An October 2025 study by Lawrence Berkeley National Laboratory (LBNL) and the Brattle Group analyzed LBNL’s work on retail electricity price and cost trends over the past five years. The study notes substantial growth in capital expenditures for transmission and distribution replacement, construction, upgrades, and resilience.23
Notably, hardening and resilience expenditures are more expensive in some regions than others. California and other Western utilities have spent tens of billions of dollars on wildfire mitigation and insurance and passed those costs onto ratepayers. California’s three largest investor-owned utilities alone spent $27 billion on wildfire mitigation from 2019 to 2023.24
While policy and regulatory changes will help instill more cost discipline for utilities, the takeaway is not “don’t build,” but build the necessary infrastructure that ensures reliability at the least cost and ensure that those causing new costs are also paying them and carrying the related risks.25
The Influence of Public Policy on Supply and Demand
One cannot ignore the role of public policy in distorting investment choices and constraining supply. Utilities cannot build new generation or connect it to the grid fast enough to meet growing demand. Lengthy permitting processes, interconnection backlogs, and siting battles constrain supply, forcing regions to rely on older, less efficient plants and higher-priced power during peak periods.
These challenges also lead to inefficient builds. For instance, instead of planning and permitting long-distance, high-voltage transmission that can move large amounts of power efficiently, the system often defaults to smaller, piecemeal transmission and distribution upgrades that are easier to approve but far less effective. These incremental projects raise costs, do little to reduce congestion, and lock in inefficiencies because they do not meaningfully expand the grid’s capacity or resilience.
Many federal, state, and local regulations affect generation, transmission, and distribution, with compounding effects that increase wholesale and retail prices. Notable policies that inflate the cost of electricity include:
- Siting and permitting delays. Permitting timelines for generation and transmission projects routinely stretch for years, and major projects can take a decade or longer.26 Federal, state, and local permitting challenges compound the problem, and projects face increasingly long interconnection queues to connect to the grid.27 Often, projects are held up in court for years by excessive litigation.28 In fact, the mere threat of litigation increases timelines because risk-averse agencies want to guard against lawsuits. Delays increase costs directly through financing and construction inflation and indirectly by prolonging transmission congestion and limiting access to low‑cost generation or more cost-efficient transmission. The National Petroleum Council’s December 2025 report outlined how permitting delays and project cancellations translate into increased consumer energy costs, diminished reliability, and escalated project costs due to protracted timelines and excessive litigation.29 For instance, the report notes that U.S. natural gas demand has surged by roughly 50% over the last decade-plus, while pipeline capacity has grown only about half as fast.30 A recent McKinsey analysis estimated that, as of July 2025, $1.1 to $1.5 trillion in capital investment was tied up in federal permitting processes.31 The longer that capital is stuck in regulatory limbo, the longer it takes for companies to deploy affordable, dependable energy.
- Interconnection queues. Today’s interconnection queues represent one of the most significant barriers to entry in the electricity sector. When new resources cannot connect to the grid promptly, consumers bear the cost of government-imposed scarcity. Interconnection queues exist because utilities and grid operators require that projects undergo several studies before they are built. The queue is intended to ensure safety, analyze costs, and protect the grid, but the current process has become an increasingly significant bottleneck to getting more electricity onto the grid. For projects built between 2000 and 2010, the average wait time was 2.1 years. That wait time increased to 3.7 years between 2011 and 2021.32 Outside of Texas, average queue wait times have now risen to over five years. Many projects wait years for approval, and a substantial share ultimately withdraw due to escalating costs and uncertainty. In 2024, for instance, 700 GW of capacity withdrew, while 500 GW submitted new requests.33 That year, 2,290 GW of capacity sought interconnection, with natural gas capacity in the queue increasing by 72%.34
- Power plant regulations. Several factors, including environmental regulations, dictate utilities’ integrated resource planning (IRP) process for power plant investments. The Obama and Biden Administrations’ power plant regulations contributed to premature power plant closures.35 Another component of those regulations was a mandate for carbon capture technology. In promulgating the rule, regulators argued that forcing carbon capture and sequestration on the electric sector would decrease costs; however, this contradicts conventional economic wisdom, which holds that a regulation imposing higher capital costs on the industry will raise overall electricity costs. Yet another problematic regulation is New Source Review, which disincentivizes efficiency upgrades of existing power plants because such upgrades would trigger years-long environmental reviews and open the plants up to increased litigation risk.36
- Policy mandates from states. Technology‑specific requirements for power generation force procurement of higher‑cost resources, often on rigid timelines, rather than enabling competition among all reliable, low‑cost options. LBNL’s research on the drivers of recent retail price trends emphasizes that policy choices and state‑specific conditions help explain divergent outcomes. Furthermore, net metering policies shift costs to consumers who do not participate. In effect, net metering policies force utilities to credit rooftop solar customers at the full retail rate, without requiring those customers to pay for transmission and distribution costs. As a result, the grid’s fixed costs are shifted onto non-solar customers. The price impact of net metering policies varies and can be relatively small (less than 1% of the bill), but they have been particularly expensive in California and Hawaii and serve as a cautionary tale.37
- Tariffs. Whether it is for generation, transmission, or local distribution, the electricity grid is very materials-intensive and equipment-heavy. That includes a great deal of steel and aluminum for natural gas and nuclear plants, steam generators, high-voltage towers, poles, racking, and hardware, as well as globally-traded components such as transformers, switchgear, and power electronics. Tariffs are, in effect, a tax on imports that businesses pass through to consumers as higher prices for goods. While overall inflationary pressures have increased grid infrastructure costs, tariffs on steel, aluminum, and solar panels further increase project costs. Even small tariff-driven price increases can translate into significantly higher costs across multi-billion-dollar transmission and generation projects.38 Although it depends on the tariff rate and duration, a PwC analysis estimates that tariffs could impose annual costs on the resources, energy, and utilities industries ranging from $400 million to $53 billion.39 According to the U.S. Federal Reserve Bank of New York, roughly 90% of tariff costs were passed on to consumers and U.S. businesses in 2025.40
Stifling Opportunities for Innovation and Efficiency
Too often, the status quo benefits entrenched interests at the expense of new market participants. Promising, innovative technologies that can deliver power faster, cleaner, and more affordably frequently incur disproportionately higher regulatory hurdles.
One way to cost-effectively increase energy supply is to get more out of the existing system. For instance, natural gas and nuclear power producers have improved efficiency and reduced time spent offline for refueling, thereby increasing energy output. In addition to boosting power plant efficiency, there are ample opportunities to make more efficient use of transmission infrastructure. Grid-enhancing technologies (GETs), advanced transmission technologies (ATTs), and high-performance conductors (HPCs) offer potential cost-effective ways to expand transfer capability, reduce congestion, and improve reliability without the need for new lines.
- High-performance conductors (HPCs) are wires made with advanced conductor materials (such as composite-core or advanced alloys), which can carry more current and operate at higher temperatures than traditional steel-core aluminum wires. HPCs enable existing power lines to carry significantly more electricity with less sag and lower losses. Reconductoring existing rights-of-way can significantly increase line capacity without new permitting battles.
- Dynamic line ratings allow grid operators to safely move more electricity over existing transmission lines by adjusting capacity in real time based on actual weather and operating conditions, rather than relying on overly conservative static assumptions. Dynamic ratings alone can increase line capacity by 10% to 40% under favorable conditions, depending on geography and weather patterns.41
- Power flow control devices use advanced electronics to direct electricity along less-congested pathways on the grid, reducing bottlenecks and improving reliability.
- Topology optimization is a software-driven approach that reconfigures how power flows across the transmission network by strategically opening and closing existing switches and circuits.
- High Voltage Direct Current (HVDC) is especially suited for connecting remote, fixed power sources such as hydro and offshore wind, because it is more efficient in transmitting energy over long distances than traditional alternating current methods. The U.S. has tended to lag behind its industrialized peers in deploying this technology, in part due to poor interregional planning.42
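To make the dynamic line rating concept above concrete, here is a deliberately simplified sketch. All of the numbers and the square-root scaling rule are illustrative assumptions; actual ratings are computed from detailed conductor heat-balance standards such as IEEE 738, which account for solar heating, wind angle, and conductor properties:

```python
import math

# Toy dynamic line rating sketch (illustrative only). It assumes ampacity
# scales with the square root of available thermal headroom and wind cooling,
# relative to a conservative static assumption of hot, near-calm conditions.
def dynamic_rating(static_rating_amps, t_conductor_max, t_ambient,
                   t_ambient_static=40.0, wind_factor=1.0):
    """Scale a static rating using the actual ambient temperature (deg C)
    and a wind-cooling multiplier (1.0 = the static near-calm assumption)."""
    headroom_actual = t_conductor_max - t_ambient
    headroom_static = t_conductor_max - t_ambient_static
    return static_rating_amps * math.sqrt(
        wind_factor * headroom_actual / headroom_static)

# A cool, breezy day can unlock substantial extra capacity on the same line,
# consistent with the 10% to 40% gains reported under favorable conditions.
print(round(dynamic_rating(1000, 75, 25, wind_factor=1.3)))
```

Under the static assumptions themselves (40 deg C ambient, calm air), the sketch reproduces the static rating exactly, which is the sanity check a real implementation would also need to pass.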
Despite their promise, GETs, ATTs, and HPCs remain underutilized due to misaligned utility incentives, outdated planning assumptions, and regulatory uncertainty. While these technologies are not a full substitute for building new lines, traditional cost-of-service regulation, planning biases, and siting and permitting challenges leave substantial capacity gains on the table.
Antiquated rules and regulations can also stunt the development and deployment of innovative generation technologies. While solar, storage, wind, and natural gas represent the bulk of the resource capacity in interconnection queues, promising new technologies could help meet the economy’s energy demands at speed and scale. A few notable examples include:
- Small modular reactors are advanced light-water nuclear plants designed to be built in factories and deployed in smaller increments, offering dependable, emissions-free power with enhanced safety features and lower upfront capital costs than traditional large light-water reactors.
- Advanced (Generation III+ and IV) nuclear reactors are non-light water reactor technologies with passive safety systems, higher fuel efficiency, and greater flexibility.
- Enhanced geothermal systems use advanced drilling and subsurface engineering to access heat deep underground, providing dispatchable power that can scale beyond traditional geothermal, which has been constrained by geography.
- Distributed energy resources (DERs) are decentralized assets like batteries, demand response, rooftop solar, or backup generators that produce, store, or manage power locally.
- Long-duration energy storage, such as flow batteries, compressed air, or thermal storage, stores electricity for many hours or days, helping the grid manage variability and extend reliability beyond what short-duration lithium-ion batteries can provide.
No silver bullet exists for meeting rising electricity demand. Some regions may expand solar and storage, while others may build a high-efficiency natural gas plant. In other instances, hyperscalers could finance the construction of, and pay for the electricity from, an advanced reactor, as Amazon has done with X-energy43 and Google with Kairos Power and the Tennessee Valley Authority.44 The most promising energy future is one that removes policy barriers and lets proven and emerging resources alike compete for customers.
Threats to Reliability
Rapidly rising energy bills are not the only concern for American households and businesses. With demand projected to outpace supply for the next several years, grid reliability could also be a serious challenge. A recent assessment from the North American Electric Reliability Corporation (NERC), a non-profit regulatory authority that works with policymakers in North America to ensure the reliability and security of the bulk power system, is instructive.
In effect, NERC is the referee that helps keep the lights on across the United States, Canada, and part of Mexico. Its job is to help set and enforce reliability standards for the grid, ensuring that power plants, transmission lines, and grid operators plan for extreme weather, cyber threats, equipment failures, and growing electricity demand. As a technology-neutral grid watchdog of sorts, NERC identifies risks, sets commonsense rules, and holds operators accountable so that electricity remains reliable and resilient.
Recent warnings and reports from NERC should be a resounding wake-up call for policymakers to act. At a reliability conference held by the Federal Energy Regulatory Commission last fall, NERC President and CEO Jim Robb said, “The reliability of the power grid remains extremely high, but, paradoxically, the risks to reliability continue to mount. We’re seeing an increasing number of small-scale events and near misses that continue to reinforce what we can’t call anything but a five-alarm fire when it comes to reliability (emphasis added).”45
NERC’s January 2026 long-term reliability assessment includes:
- Of the 23 regions assessed, 13 face resource adequacy challenges over the next decade, with many facing them over the next five years.
- By 2030, NERC identified five areas as high risk: grids in the Mid-Atlantic, the Midwest, Texas, and the Northwest (the Midcontinent Independent System Operator, or MISO; PJM Interconnection, or PJM; the Electric Reliability Council of Texas, or ERCOT; and the Basin and Northwest subregions of the Western Electricity Coordinating Council, or WECC).
- In addition to parts of Canada, NERC identified the Northeast, the Carolinas, and the Great Plains as areas of elevated risk over the next five years.
- In certain regions (the Pacific Northwest, the Mountain West, and the Great Basin states), overreliance on weather-dependent resources could strain grid reliability, particularly in winter months. Backup generation, long-duration battery storage, and grid-forming inverter technologies will help ensure resource adequacy.
To be sure, increased electricity rates and reliability risks are real concerns. However, it is important to underscore that NERC’s assessment, and its aggregation of peak load growth forecasts across regions, could very well be off base. Even so, reliability concerns should not be misconstrued as a reason to forgo economic growth; rather, they should light a fire under policymakers to fix a broken regulatory and permitting system.
While that fire is kindled, private companies are already launching initiatives of their own in conjunction with public officials. In March 2026, Google, Amazon, Meta, and several other prominent actors in the tech sector signed onto the White House’s “Ratepayer Protection Pledge” designed to provide some assurance that data center development would proceed with a high degree of transparency and financial accountability. Google’s approach is one vivid illustration of how evolving company practices can, effectively, be “hyperscaled.” The five elements of the company’s Capacity Commitment Framework (CCF), first promulgated in early 2025, have been put into practice in nine states (and counting) across the country:
1. The rules surrounding grid impact should be based on power consumption amounts for all industries; “all customers requesting power supply above a certain amount” should be treated the same from a policy standpoint “because of the significant infrastructure required.”
2. Energy infrastructure, like roads, bridges, and waterways, involves long-term investments; contracts should be drafted with the aim of “providing sufficient revenue to cover the utility’s investments made on their behalf.”
3. Guaranteed minimum payments from hyperscalers for infrastructure upgrades should be established, to prevent “the cost of . . . unused equipment” from “unfairly fall[ing] on the public if large customer demands fall short.”
4. Hyperscalers should be able to provide transparent guarantees of long-term financial stability to utilities, thereby “protecting the utility and public if things go wrong.”
5. Also in the interests of stability and predictability, “appropriate notice periods and fees for contract cancellation or capacity reduction should be established for large energy customers.”46
Notably, these policies are readily adaptable to any large energy customer, whether inside or outside the tech sector. The CCF has been put into practice alongside a number of other Google initiatives, including:
- Partnering with NextEra Energy to restart the Iowa-based Duane Arnold Energy Center’s nuclear reactor;
- Working with CTC Global and Intersect Power to bring advanced conductors and colocated power generation online;
- Funding the electrical training ALLIANCE program to expand the U.S. electrical workforce pipeline as well as residential weatherization upgrades (a logical alternative to the plethora of federal tax deductions and credits now being phased out); and
- Investing in projects like Tapestry’s partnership with PJM to use AI to build a stronger, more resilient electricity system.
These and other initiatives help Google minimize the Power Usage Effectiveness (PUE) of its data centers, which is lower (i.e., more efficient) than the industry average.47
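Because PUE is simply the ratio of total facility energy to IT equipment energy, the arithmetic is easy to sketch. The figures below are hypothetical and purely illustrative; they are not Google’s reported numbers.

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy divided by IT
    equipment energy. 1.0 is the theoretical ideal; lower is better."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment load must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical annual energy use for two facilities (kWh):
typical = pue(16_000_000, 10_000_000)    # 1.6: 0.6 kWh of overhead
efficient = pue(11_000_000, 10_000_000)  # 1.1: 0.1 kWh of overhead
                                         # per kWh of computing
```

In this stylized comparison, the facility at 1.1 spends a sixth as much energy on cooling and other overhead per unit of computing as the one at 1.6, which is why hyperscale operators treat PUE as a headline efficiency metric.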
America needs a dynamic, dependable grid that can respond quickly to new growth opportunities, providing the operational reliability and resource adequacy necessary for a flourishing economy at the lowest cost to consumers. Many actors in the private sector are doing their part. Public policy should create an environment that allows supply to keep up with demand. Doing so requires wholesale reform of the U.S. permitting and regulatory apparatus.
Section II. The Need for Permitting Reform
Key Points
- The current permitting system causes significant delays and cancellations, resulting in higher energy costs and reduced reliability for consumers.
- Effective reform requires comprehensive, bipartisan action to streamline reviews and reform the abuse of legal challenges to block projects.
- Certainty must protect the viability of projects before, during, and after the permitting process.
- Reform must respond to the urgency of accelerated planning and deployment of modernized transmission infrastructure to meet load growth, and with it economic and national security imperatives.
The need for additional energy infrastructure is clear; however, outdated federal laws and regulations have hindered developers’ ability to build projects on time. Modernizing permitting should maintain rigorous environmental safeguards to protect air and water quality and public safety. In fact, permitting challenges are among the biggest barriers to deploying technologies that would yield environmental improvements.
Examples of a Broken Process
America’s energy and environmental policy is unclear, overlapping, and unnecessarily complex. The result is layers upon layers of requirements that delay project development, hinder investment, and drive uncertainty. Examples abound of broken processes that have stalled or outright cancelled much-needed energy infrastructure.
For instance, National Environmental Policy Act (NEPA) litigation has served as a de facto veto on the approval and construction of natural gas pipelines. The Mountain Valley Pipeline was repeatedly halted when courts vacated federal permits due to technical NEPA flaws, forcing agencies to redo analyses that had already been completed. It was not until the pipeline became a political bargaining chip in the Fiscal Responsibility Act of 2023 that it received approval. Of course, horse trading to secure approval for one project does not fix the systemic permitting problems in the U.S. In another instance, Duke Energy and Dominion Energy abandoned the Atlantic Coast Pipeline project after years of NEPA lawsuits and regulatory uncertainty drove costs from $4.5 billion to over $8 billion.48
New England is one of the clearest and most frustrating examples of how years of anti-development policy delayed or cancelled necessary infrastructure. Despite strong regional demand, pipeline obstruction, enabled by federal and state environmental statutes and a litigious environment, has translated directly into higher energy prices and increased reliability risk for the region. The Constitution Pipeline shows how regulatory overreach can choke off needed energy infrastructure. Despite securing a certificate from the Federal Energy Regulatory Commission, the project was ultimately derailed when New York denied its Clean Water Act Section 401 certification, stretching the statute well beyond its intended purpose of protecting water quality.49 After years of litigation and mounting uncertainty, developers pulled the plug. The result has been higher winter price volatility, imports of LNG from Trinidad (and, in some instances, Russia50) despite the region sitting just a few hundred miles from abundant Appalachian natural gas, and greater reliance on pricier, less clean home heating oil.51
Many transmission line projects have faced similarly frustrating hurdles. The SunZia Transmission Project, a 3,000-megawatt line from New Mexico to Arizona, was first proposed in 2006 and approved in 2015. It then faced protracted litigation over permitting disputes and contradictory approvals from state and federal authorities. It spent nearly two decades mired in environmental reviews, supplemental impact statements, and interagency disputes before construction finally began in 2023. TransWest Express took roughly 15 years to navigate NEPA and federal land approvals to deliver Wyoming wind to Western load centers. Similarly, the Northern Pass would have delivered more than 1 GW of clean, dispatchable hydropower from Quebec to New England. Even after years of review and hundreds of millions in sunk costs, a single state decision ultimately ended the project.52
Unilateral executive actions have also stalled or cancelled energy infrastructure of all stripes, injecting more uncertainty into the process. The Obama Administration slow-walked oil and gas permits on federal lands, cancelled the Keystone XL pipeline, and issued preemptive and retroactive vetoes on mining projects in Alaska and West Virginia, respectively. The Biden Administration paused liquefied natural gas (LNG) export permits and cancelled the Keystone XL pipeline again. More recently, the Trump Administration slow-walked wind and solar permits on federal lands and paused the development of offshore wind projects that had already been permitted.
Policymakers should approach permitting reform as a “rising-tide-lifts-all-boats” opportunity that will provide greater certainty for project developers and better results for U.S. taxpayers and ratepayers. Meaningful permitting reform must be comprehensive and durable, and must include both federal and state reforms.
Principles and Actions for Permitting Reform
Energy policy should not work for any specific resource or technology but for the consumer. A modernized federal energy and environmental policy framework should meet America’s energy needs in a timely and predictable manner while preserving core environmental protections. Legislative reform should shorten timelines, fix judicial review, and provide certainty for investors and project developers. As policymakers deliberate on legislative action, comprehensive permitting reform should include the following:
- Provide comprehensive, broad-based environmental reforms. Permitting and regulatory reform should be comprehensive first and foremost. Fixing problems created by a single major environmental statute will help remove unnecessary roadblocks on the margins. Still, it may fail to change project timelines if other substantial roadblocks remain. Several legislative efforts in the 119th Congress have already been proposed and advanced.53 Policymakers should modernize:
- The National Environmental Policy Act by codifying a narrowed scope of review, and excluding remote, disparate, and separate-in-time-and-place considerations—such as upstream production, downstream consumption, or generalized climate effects—when reviewing projects. If an agency is considering the environmental effects of a rail line, it should not have to consider the environmental effects of the product it carries or its end use. Doing so adds substantial time to the analysis and forces agencies into rabbit holes of “butterfly effects” for projects and actions. A narrowed scope of review was a critical aspect of the recent Supreme Court decision in Seven County Infrastructure Coalition v. Eagle County, Colorado, in which an 88-mile rail line was challenged because the environmental review failed to consider the environmental impacts of upstream oil production and downstream oil refining. In an 8–0 decision, the Court clarified and narrowed the scope of NEPA reviews, holding that agencies did not have to assess upstream and downstream impacts. Reviews should assess the project’s environmental impacts within the agency’s areas of expertise and jurisdiction. Congress should also limit courts’ power to invalidate and vacate agency actions, requiring higher thresholds for legal standing and shortening the statute of limitations.54
- The Clean Water Act by preventing the Environmental Protection Agency from issuing preemptive and retroactive vetoes of Section 404 dredge-and-fill permits issued by the Army Corps of Engineers. Congress should also reform the CWA’s Section 401 certification process by granting states a consultative role in issuing federal CWA permits. States have occasionally used this authority to impede energy projects, such as natural gas pipelines and liquefied natural gas export terminals. Often, the state opposing a project bases its objection not on water quality but on issues, such as climate change or train noise, that fall outside the purview of the 401 certification process. Congress can also extend nationwide permits and provide greater clarity in defining the Waters of the United States.55
- The Clean Air Act by clarifying that the Environmental Protection Agency (EPA) is precluded from imposing highly aggressive emissions standards absent a rigorous and transparent cost-benefit analysis, extending the time between environmental reviews from five years to ten years, and providing greater latitude to states, communities, and local authorities to develop air-quality solutions commensurate with their specific circumstances.56 Congress should also eliminate superfluous “second-review” requirements imposed by the EPA under NEPA when another agency has previously finalized an environmental impact statement, thereby reducing redundant assessments and accelerating project authorizations. Further, Congress should specify that emissions originating outside the jurisdiction of the United States or naturally (wildfires or international sources of pollution) are not factored into a state’s attainment status, thereby preventing the imposition of penalties upon states and industries for pollution for which they are not responsible.57 Congress should also reform, if not repeal, New Source Review to incentivize more efficiency upgrades to keep plants online, yield greater output, and reduce emissions.58
- The Endangered Species Act by focusing on cooperation rather than conflict. ESA has been overly focused on procedural compliance rather than the statute’s core goal of recovering species. Specifically, Section 7 consultations are too often used to broadly regulate states and project developers rather than to narrowly evaluate the actual effects of a permitted action on listed species. Refocusing consultations on biological outcomes, improving predictability for landowners, and eliminating duplicative or overly expansive regulatory demands would lead to better habitat protection and species recovery. The Property and Environment Research Center’s Field Guide for Wildlife Recovery offers reforms to restore the law’s focus on measurable recovery outcomes by improving recovery planning, fixing the “off-ramp” for delisting, and better aligning regulations with science-based goals. A central recommendation is to remove regulatory disincentives that currently discourage private landowners from improving habitat and instead treat wildlife as an asset rather than a liability. Empowering states and expanding voluntary conservation tools will make recovery proactive rather than reactive. Finally, the report argues for shifting from a penalties-first mindset toward incentive-driven conservation, rewarding agencies and landowners when species actually rebound.59
- The National Historic Preservation Act by modernizing Section 106 considerations, the Act’s core compliance process. Section 106 of the National Historic Preservation Act is integral to safeguarding cultural and historic resources; however, the current review process frequently imposes open-ended timelines and necessitates redundant consultations, thereby introducing undue risk into energy investment. Modernization efforts should prioritize establishing explicit timeframes, clearly delineating the scope of review to encompass only reasonably foreseeable impacts, and empowering lead agencies to coordinate consultations rather than sequentially layering multiple reviews. Furthermore, agencies should expand the utilization of programmatic agreements and categorical exclusions for routine, low-impact endeavors, enabling agencies to concentrate resources on matters involving genuine historic concerns. Providing more precise standards for tribal consultation and initiating early engagement would both strengthen cultural protections and mitigate unexpected issues that arise late in the project lifecycle. Finally, establishing a clear conclusion point once agencies have fulfilled their Section 106 obligations would both preserve America’s historic legacy and afford energy developers the requisite predictability for reliable, cost-effective construction. As with many federal government databases, cultural and historical data often persist in paper files and fragmented systems, resulting in redundant surveys and delayed reviews. Expanding digitization would modernize the process, improving speed and saving money.60
- Improve state siting and permitting. Permitting processes should be resource- and technology-neutral and provide the necessary protections to the environment. Too often, however, states and localities have erected regulatory barriers to block specific energy projects, whether wind farms or natural gas pipelines. State authorities, through federal permitting processes (Section 401 of the Clean Water Act) and new state and local ordinances, have made siting, permitting, and construction of new energy infrastructure increasingly difficult. Several solutions can help. Some states have adopted a compliance-based permit-by-rule system that allows projects to move forward so long as they meet pre-defined criteria, conditions, and environmental standards. Utah has strengthened and expanded its permit-by-rule system, saving time and compliance costs, providing efficient pathways to build energy infrastructure, and maintaining a healthy environment.61 Other state siting and permitting fixes would tie permitting to specific harms rather than speculative notions of harm.62 States could also preempt local bans and ordinances to minimize NIMBYism while ensuring projects meet the necessary criteria for a state permit.
- Assert more legislative authority over major environmental regulatory changes and regulatory ping-ponging. The executive branch has significant authority over energy and environmental policy. Regulatory pivots every four or eight years create massive fluctuations in policy outcomes, leading to significant shifts in planning and investment, and uncertainty for energy developers. Even with seemingly long lead times to comply with new environmental regulations, investment choices shift based on whether power plants require carbon capture and sequestration technology or whether the Supreme Court will stay the rule. Federal agencies can reshape entire segments of the energy sector while downplaying costs, trade-offs, or the diminishing environmental returns from new regulations. Asserting more legislative control over major environmental policy would place decisions in the hands of accountable elected officials and provide a welcome check on significant regulatory swings by unelected executive agencies. Policies like the Regulations from the Executive in Need of Scrutiny (REINS) Act would require congressional approval for rules with an economic impact of $100 million or more.63
- Provide certainty and technology neutrality. Permitting reform must be technology- and resource-neutral and as broad as possible. The objective of permitting is not to favor or disfavor particular energy sources but to create a durable framework that ensures projects based on need and merit move forward while meeting environmental standards. The exception to technology neutrality should apply only if the reform clearly addresses technology-specific barriers, such as a regulation affecting nuclear power plants.
Certainty must protect the viability of projects before, during, and after the permitting process. That means forestalling actions that prevent projects from even entering review, imposing discipline and timelines during permitting, and ensuring that once permits are issued, projects are not perpetually vulnerable to reversal. If legitimate environmental or national security concerns arise, agencies should work to remediate the issue without revoking the permit. Furthermore, policymakers should resist the temptation to legislate around individual projects or to grant permit approval as part of horse trading for other policy preferences. Durable reform requires fixing the system, not carving out exceptions for politically salient infrastructure.
- Fixes to linear infrastructure. Transmission policy should be grounded in a simple principle: transmission exists for the benefit of the consumer, not for any energy generation source, transmission developer, or provider. Investments should be justified by measurable improvements in affordability and reliability, not by social, environmental, or other superfluous public policy objectives that obscure costs and weaken accountability. The Federal Energy Regulatory Commission should adhere strictly to the beneficiary-pays principle and the courts’ associated “roughly commensurate” standard.64 Transmission costs should be allocated only to those who benefit from the investment, and only to the degree they benefit. Where federal involvement is present, policy should prioritize expanding and upgrading the existing transmission system when those investments are the most cost-effective for ratepayers.
Furthermore, consideration of pipeline impacts should be limited to those caused by the pipeline itself—not the upstream or downstream effects of the product it carries. Expanding the scope in this way turns permitting into an indirect ban and an endless invitation to litigation. Policymakers should also extend the terms of nationwide permits and streamline water-crossing approvals to provide longer-term certainty and maximize the effective use of government resources. Pipeline projects should be allowed to move forward based on need and merit rather than on subjective national-interest determinations that shift with political winds (e.g., Keystone XL and LNG exports). As with transmission, reforms must be robust enough to give developers a reasonable expectation that beneficial pipelines can be built. Without predictability, capital retreats—and consumers ultimately bear the cost through higher prices and reduced reliability.
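The beneficiary-pays principle discussed above reduces to pro rata arithmetic. The sketch below is illustrative only; the zone names, dollar figures, and single-number “benefit” estimates are hypothetical, and real FERC cost-allocation methods are far more involved.

```python
def allocate_cost(project_cost: float, benefits: dict[str, float]) -> dict[str, float]:
    """Split a transmission project's cost across zones in proportion to
    each zone's estimated benefit, so charges are roughly commensurate
    with benefits and non-beneficiaries pay nothing."""
    total = sum(benefits.values())
    if total <= 0:
        raise ValueError("total estimated benefits must be positive")
    return {zone: project_cost * b / total for zone, b in benefits.items()}

# A hypothetical $300M line whose modeled benefits accrue unevenly:
shares = allocate_cost(300.0, {"Zone A": 50.0, "Zone B": 30.0, "Zone C": 20.0})
# Zone A pays $150M, Zone B $90M, Zone C $60M; a zone with zero
# estimated benefit would pay nothing.
```

The hard part in practice is not the division but the benefit estimates themselves, which is why the text stresses measurable affordability and reliability improvements rather than diffuse policy objectives.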
Section III: Increasing Competition and Aligning Incentives
Key Points
- State policy should prioritize wholesale and retail competition over monopoly structures to enhance grid reliability and affordability.
- States should pursue reforms to address regulatory bottlenecks, eliminate costly market distortions, and increase flexibility.
- Coordinated load forecasting can reduce the risk of costly overbuilds, prevent reliability shortfalls, and give investors the confidence to deploy capital where it is actually needed.
Policy must also enable competition, accelerate infrastructure development, and provide flexibility. A system that better responds to price signals and provides alternative pathways to supply power will meet rising demand while keeping electricity affordable and reliable.
Enabling Wholesale and Retail Competition
Wholesale electricity markets exist to ensure that generation resources compete to meet demand at the lowest possible cost while maintaining system reliability. Organized wholesale markets administered by regional transmission organizations (RTOs) provide transparent price signals that guide investment decisions, retire uneconomic assets, and reward resources that perform during periods of system stress. States that fully participate in competitive wholesale markets benefit from these efficiencies.
States with organized markets and competitive generation structures generally exhibit stronger cost discipline, better risk management, greater openness to innovation, expanded trade, and more efficiency improvements than vertically integrated monopoly states.65 Wholesale competition alone, however, does not guarantee consumer benefits; those efficiencies must be transmitted through well-functioning retail markets.66 Retail competition empowers consumers to choose among suppliers, pricing structures, and service offerings, introducing discipline that no regulatory process can fully replicate.
The R Street Institute’s state-by-state electricity competition scorecard provides a useful benchmark for state policymakers. The analysis extends past whether a consumer can choose a retail supplier and grades states based on wholesale competition, demand-side flexibility, governance, and consumer engagement and education, among other metrics.67
Reforms in the States
In addition to increasing competition, states should pursue several reforms to address regulatory bottlenecks, eliminate costly market distortions, and increase flexibility. Even incremental expansion, such as beginning with large commercial and industrial customers, can provide consumer-centric pathways that benefit households and businesses alike. These reforms include:
- Provide flexibility. Flexibility is an essential tool that can help prevent brownouts and blackouts and lower consumer costs. Large customers—data centers, manufacturers, or campuses—should be able to sign clear contracts that specify when, how much, and at what price they will curtail or shift load, or buy capacity from nearby users who can reduce demand during tight conditions. When load flexibility is contractible, grid operators can plan around it with confidence, just as they do with generation, thereby lowering system costs and improving reliability. Allowing companies to bring their own power (BYOP) is another tool that can manage load growth efficiently. BYOP would allow new large loads to secure dedicated supply through on- or off-site generation, firm contracts, or aggregated resources at the load zone or multi-node level, rather than tying supply strictly to a single meter. Done right, this lets new demand come online faster without overbuilding the grid, while ensuring that new load actually shows up with real, locationally relevant power behind it. Section IV provides more specifics for how data centers can be a flexible grid resource and a source of improved affordability and reliability rather than a burden.
- Reform Construction Work in Progress (CWIP). CWIP is an accounting treatment that allows utilities to shift the cost of major infrastructure projects onto customers before those projects deliver electricity, effectively turning households into forced financiers of grid build-out. By including CWIP in the rate base, regulators guarantee utilities a return on capital during construction, thereby weakening cost discipline, incentivizing oversized projects, and rewarding delays and overruns. A recent Manhattan Institute report highlights that, from nuclear plants in the 1970s to recent offshore wind expansions, the use of CWIP has led to early cost recovery routinely resulting in higher bills, longer timelines, and more risk pushed onto ratepayers rather than investors.68 The report offers options such as limiting CWIP to strict budget caps, making returns performance-based, imposing clawbacks for cost overruns, or reverting to Allowance for Funds Used During Construction, where recovery occurs only after the project is in service.69 Without these changes, customers will continue to subsidize speculative spending while utilities face little pressure to control costs or finish projects on time.
- Repeal and reform state mandates, subsidies, and regulations. States should reconsider and repeal predetermined energy outcomes and let the market work. Nudging or forcing consumers and businesses to comply with regulations provides preferential treatment to favored resources and technologies. The result is higher bills, growing taxpayer-funded cross-subsidies, and a market that rewards political connectedness over innovation and competition. Repealing these policies would not mean banning or punishing renewables, electric vehicles, or building electrification. Instead, those technologies would compete on performance, cost, and consumer preferences rather than on regulatory and political preferences.
- Repeal state nuclear bans and align permitting with risk. Repealing outdated bans on nuclear power would expand supply options, improve grid reliability, and give utilities and private developers the flexibility to pursue advanced reactors where they make economic sense. States should also position themselves as proactive partners with the Nuclear Regulatory Commission by preparing and sharing key datasets that inform environmental reviews, including seismology, climate, hydrology, and sea-level rise information. The Breakthrough Institute’s Guide to NRC Application Data Requirements for State Implementers provides a useful roadmap for how states can support timely licensing.70 States can build internal proficiency in state-level licensing and permitting when the opportunity arises. States already possess the authority to regulate specific nuclear materials through the Agreement State Program, as outlined in the Atomic Energy Act of 1954.71 States can demonstrate to the Nuclear Regulatory Commission that they have acquired the requisite expertise to oversee activities such as construction, operation, and fuel cycle facilities. This initiative would encourage states to develop regulatory frameworks that facilitate the prompt construction of reactors, thereby allowing the NRC to focus its resources on essential responsibilities, including relicensing and restarting nuclear power plants in its current operational fleet.
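To make the CWIP-versus-AFUDC distinction in the list above concrete, here is a stylized sketch. The numbers are hypothetical, and real ratemaking involves depreciation, taxes, and compounding conventions this deliberately omits.

```python
def cwip_charges(spend_per_year: list[float], allowed_return: float) -> list[float]:
    """Under CWIP-in-rate-base, ratepayers pay the allowed return on the
    cumulative construction balance each year, before any power flows."""
    balance, charges = 0.0, []
    for spend in spend_per_year:
        balance += spend
        charges.append(balance * allowed_return)
    return charges

def afudc_balance(spend_per_year: list[float], rate: float) -> float:
    """Under AFUDC, carrying costs accrue to the project balance instead,
    and recovery begins only once the plant is in service."""
    balance = 0.0
    for spend in spend_per_year:
        balance = (balance + spend) * (1 + rate)
    return balance

# Hypothetical $100M/year, three-year build, 10% allowed return:
during = cwip_charges([100.0] * 3, 0.10)   # ratepayers pay during construction
after = afudc_balance([100.0] * 3, 0.10)   # enters rate base at roughly $364M
```

Either way the carrying cost exists; the policy question is who bears construction-period risk. CWIP shifts it to ratepayers up front, which is why the reforms above pair it with budget caps, performance-based returns, and clawbacks.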
Improving Federal, Regional, and State Coordination
One of the American federal system’s greatest strengths is that it allows policy experimentation and development that fit the unique characteristics of individual states and localities, while allowing them to be adapted to others’ circumstances. In some instances, however, this arrangement requires greater communication and coordination among jurisdictions where policies must interact within a larger structure. Such is often the case with the energy grid, and the following actions would optimize efficiency while minimizing coercive policy.
- Addressing interconnection queues. FERC Order No. 2023 fundamentally modernized the nation’s generator interconnection procedures by moving from a “first-come, first-served” queue to a cluster study process that groups multiple interconnection requests and studies them together to improve efficiency and reduce redundant work. It also established objective readiness criteria and clearer timelines, and increased transparency and certainty for developers seeking to connect to the grid. While these reforms mark a meaningful step toward unclogging backlogs and making interconnection studies more predictable and timely, they are only the beginning of what’s needed to solve the systemic delays in today’s queue process. Deeper reforms should include more consistent study assumptions across regions, broader use of advanced transmission technologies and cost-effective network upgrade policies, and streamlined studies that embrace modern computing and automation.72 Fast-track study pathways and proactive regional transmission planning would further reduce unnecessary barriers.
- Improve coordination and evaluation of load forecasting. Coordinated load forecasting can reduce the risk of costly overbuilds, prevent reliability shortfalls, and give investors the confidence to deploy capital where it is actually needed. That coordination will better inform assumptions across federal agencies, state regulators, utilities, and regional transmission organizations so forecasts reflect realistic expectations for economic growth, electrification, data centers, and industrial demand. Better coordination and knowledge sharing should not be misconstrued as standardization. While improved data sharing, best practices, and enhanced modeling capabilities will better inform expectations and planning, standardization may limit the tools and knowledge needed for more accurate load forecasting.
Price signals and competition will help discipline spending and empower markets to react more efficiently to the changing needs of the marketplace. Clear, predictable processes will provide opportunities for innovative generation and grid technologies to connect to the grid, while giving developers the certainty to make significant capital investments without having the rug pulled out from under them. Done right, these reforms will deliver energy affordability, greater energy security, and a cleaner environment. The alternative is managed scarcity and regulatory micromanagement, which will leave consumers paying more for less reliability.
Section IV. Unlocking Flexibility: Data Centers as a Grid Resource
Key Points
- Data centers can employ a diverse portfolio of operational and technical strategies to provide valuable services to the electric grid.
- There are significant technical and regulatory obstacles to unlocking the full potential of large-load flexibility.
- In the face of significant expansion of demand, flexibility options are an important tool in the toolbox, but not a silver bullet.
Opportunities for data centers to “flex” their grid use have attracted significant public attention and industry investment. At the simplest level, flexibility is the ability for users to change their use of grid power in response to grid operator direction or user preferences. For example, charging a phone when the grid’s power is cleaner is a small-scale example of flexibility that aligns with a user’s environmental preferences. It is also only a tiny portion of the ways flexibility can be incorporated into the grid.
At a large scale, industrial users, including data centers, have participated in traditional demand response programs for years. These involve the user reducing energy use from the grid in response to direction and often receiving payments from the grid operator. A major example, related to data centers, has been Bitcoin miners in Texas powering down when electricity prices spike.73 This led to the establishment of a technical task force within the state’s grid operator to address “large flexible loads,” which posed new regulatory and market questions.
The parallels between Bitcoin mining and the average data center are useful guides, but imperfect. Bitcoin is a simple product; participants run mining rigs to turn electricity into Bitcoin. This makes it a straightforward business question when miners should sell the power they have paid for to other users rather than use it for mining.74 A data center, in contrast, faces a harder question, and most have trended toward desiring electricity service for every hour of every year: 100% uptime. A data center is, in effect, a computer the average person uses every day but never physically touches. Data centers run essential services for critical industries, such as health care and public safety. This is why they are often fully backed up on-site, in addition to being contracted with generators for the power they expect to use each day.75
Still, there are key attributes that enable data center flexibility in enough cases that it is worth considering.76
- Software-defined, time-insensitive workloads: Some computational tasks, such as AI model training and data processing, are not time-sensitive and can be paused or deferred with little impact.
- Rapid and precise control: Data centers are sophisticated, digitally native facilities. Their electricity consumption can be adjusted with precision almost instantaneously in response to automated grid signals or economic price signals, making them an ideal resource for providing fast-acting grid services. Industrial users may not be able to power down in the same way—an ice cream factory would face lost product, and a steel mill would have its equipment damaged.
- Geographic and workload portability: In many cases, computational workloads are not tied to a single physical location. They can be shifted between interconnected data centers in different geographic regions to take advantage of lower energy prices or to alleviate stress on a constrained local grid.
- Modular architecture: The modular design of server racks and cooling systems allows for partial curtailment. A data center does not need to shut down completely; it can scale down its operations incrementally to provide the exact amount of load reduction required by the grid.
- Colocated energy assets: It is common practice for data centers to have on-site energy resources, primarily for reliability purposes. These assets, including large-scale backup generators and battery storage systems, can be repurposed to provide valuable services back to the grid.
Electricity System Characteristics That Make Flexibility Valuable
In addition to these five factors about data centers, there are also two elements of electricity service costs that matter. First, most electricity costs are concentrated in a few hours each year. In Texas, for example, electricity prices are $100 per megawatt-hour or less for more than 9 out of 10 hours a year. A small number of critical hours—often just a few dozen per year—drive an outsized share of total system costs. Understanding this nonlinear, power-law distribution of costs is key to unlocking the economic value of flexible resources like data centers.
This phenomenon occurs because the grid must be built to meet demand at its absolute peak, which might occur only during an extreme summer heatwave or a winter cold snap.77 The generation, transmission, and distribution infrastructure required to serve these few peak hours sits idle or underutilized for much of the year, and all customers bear its capital and operational costs. During these moments of system stress, the grid relies on expensive “peaker” power plants, and wholesale electricity prices can skyrocket. Texas has a price cap of about $5,000 per megawatt-hour. These extreme price events are infrequent but drive costs up. The price signal demonstrates the immense value of flexibility designed to avoid consumption during these costly intervals. And, of course, the price signal shows that being able to provide electricity in those hours can be incredibly valuable, so it encourages entrepreneurs to bring an entire suite of solutions to bear on the problem, not just flexibility.
The need to build the entire system to meet peak demand is the second reason flexibility is valuable. If data centers are not adding to those peak-hour costs but are using the system more often in other hours, then the per-unit burden of the infrastructure’s fixed costs falls. Using more electricity spreads out the costs of that infrastructure and can delay or avoid upgrades whose costs are almost always shared. In effect, today’s infrastructure sits idle most of the time, but must be paid for regardless of that idleness; it is available when needed. Flexible data centers and other grid users can lower the per-unit cost of electricity service by using existing infrastructure more often without raising the “peaks.” To be clear, lowering rates is a potential outcome of additional data center growth, not a guaranteed one.78
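A stylized calculation illustrates both points. The figures below (number of scarcity hours, prices, annual fixed costs, consumption levels) are illustrative assumptions chosen for arithmetic clarity, not data from this report:

```python
# Illustrative sketch only: why a few peak hours dominate wholesale costs,
# and why off-peak utilization lowers per-unit fixed costs.
# All numbers are hypothetical assumptions.

HOURS_PER_YEAR = 8760

# Assume ~40 scarcity hours near a $5,000/MWh cap, the rest at $50/MWh.
peak_hours, peak_price = 40, 5000.0
offpeak_hours, offpeak_price = HOURS_PER_YEAR - peak_hours, 50.0

peak_cost = peak_hours * peak_price              # energy cost per MW of flat load
offpeak_cost = offpeak_hours * offpeak_price
peak_share = peak_cost / (peak_cost + offpeak_cost)
print(f"Share of annual energy cost from {peak_hours} peak hours: {peak_share:.0%}")

# Fixed-cost spreading: a hypothetical $1M/year of shared infrastructure cost
# divided over more off-peak MWh lowers the per-MWh burden for everyone.
fixed_cost = 1_000_000.0
base_mwh = 40_000.0          # assumed existing annual consumption
new_offpeak_mwh = 10_000.0   # assumed new flexible load, off-peak hours only
before = fixed_cost / base_mwh
after = fixed_cost / (base_mwh + new_offpeak_mwh)
print(f"Fixed cost per MWh: ${before:.2f} before, ${after:.2f} after")
```

Under these assumed numbers, roughly 40 hours account for about a third of annual energy costs, and the added off-peak load cuts the per-MWh fixed-cost burden from $25 to $20 without touching the peak.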
In total, these traits of data centers and the nature of the electricity system’s cost drivers open three broad pathways for data centers to provide grid flexibility.
Three Pathways to Flexibility
Data centers can employ a diverse portfolio of operational and technical strategies to provide valuable services to the electric grid. This portfolio represents a technology-neutral, performance-based set of options that can be tailored to specific market rules and grid needs. These strategies can be grouped into three primary categories.
1. On-site energy assets and microgrids
- This category leverages the significant on-site infrastructure that data centers often possess for reliability. These assets can be integrated with the grid to provide services beyond simple backup power.
- Thermal storage: Using thermal storage systems to pre-cool facilities during off-peak hours, reducing the need for power-intensive air conditioning during peak demand.
- Battery storage: Deploying on-site batteries to store low-cost energy and discharge it during peak hours, either for self-consumption or for injection back into the grid.
- Colocated generation: Pairing data centers directly with energy generation, like natural gas, nuclear, or a combination of storage and solar, to reduce their net grid impact. Each combination of energy technologies has its own trade-offs. No authoritative survey exists, but anecdotal evidence suggests that most colocation projects online by 2030 will rely on natural gas.
2. Demand-side management
- This category includes strategies that actively alter energy consumption patterns in response to grid conditions or prices. By moving electricity use from times of scarcity to times of plenty, these approaches help balance the grid. Key tactics include:
- Load shifting: Postponing or advancing non-urgent computational tasks to off-peak hours when electricity is cheaper and more abundant.
- Time-flexible computing: Pausing batch-processing jobs, such as large-scale data analytics or AI model training, during high-price or emergency events.
- AI workload orchestration: Utilizing sophisticated software to automatically route and schedule computational tasks based on real-time energy prices and grid carbon intensity across a network of data centers.
- Partial curtailment: Incrementally reducing power consumption by turning off a portion of servers or adjusting cooling system setpoints.
3. Market participation and aggregation
- This category involves participating directly in organized electricity markets to provide paid grid services. Data centers can act individually or as part of an aggregated portfolio.
- Virtual power plants (VPPs): A VPP is a collection of distributed energy resources orchestrated by a central software platform. As will be discussed later, distributed energy assets can provide system flexibility even if data centers or other large loads are inflexible. Already, average residential consumers participate in markets where they are paid for aggregating energy assets at home (such as electric vehicles, home batteries, or rooftop solar). In practice, this requires extensive distribution system upgrades and analysis to ensure that the assets deliver in the correct areas to relieve grid stress. A data center, or other large user, could be a cornerstone entity for VPPs by providing funding to deploy distributed energy resources within an electrically relevant area to balance the data center’s impact on the grid or shift costs away from expensive peaking hours.
- Ancillary services: In markets like ERCOT in Texas, data centers can participate in established programs that help manage grid stability or as a last-resort resource to prevent blackouts during grid emergencies.79
This wide array of strategies translates directly into tangible, system-wide benefits that enhance grid performance and lower costs for all electricity customers.
Case Study: Microsoft and Black Hills
A notable precedent for flexible arrangements is now almost a decade old.80 In 2016, Microsoft wanted to open a new data center in Cheyenne, Wyoming. The local utility, Black Hills Energy, would have needed to build a new generator, essentially only for Microsoft to use.81 In addition to contracting for wind generation in the state, Microsoft offered to build a gas generator to turn on when the grid was stressed. The data center effectively built a generator and handed the keys to the utility. The utility’s grid was strengthened by the addition of the data center, not weakened.
Case Study: Enchanted Rock’s Bridge-to-Grid Business
Flexibility is most often thought of as occurring only when users turn on or off in response to grid conditions. Yet there are many kinds of flexibility, as well as commercial providers of flexibility services. Enchanted Rock, for example, offers a “bridge-to-grid” option. The company contracts with consumers awaiting a grid connection, powering them until they gain access to the grid. Once connected, Enchanted Rock serves as an on-site backup system for the energy consumer.82 In addition, Enchanted Rock can provide grid services to help balance the system or relieve system stress, for example, by reducing a facility’s draw from the grid on those peak days when power is most expensive. A notable benefit is that, because Enchanted Rock generates revenue from using its backup system in these markets, it reduces the total costs its customers must pay.83 Solutions like Enchanted Rock’s likely reduce costs for the entire system and all users by providing an alternative to new peaking assets.
Enchanted Rock is already playing a role at some data center sites. At Meta’s El Paso data center, for example, the first years of power will be provided entirely by Enchanted Rock’s bridge services.84 This reflects a broader reality: grid updates and expansion take years, while data centers often need power today. In the proceedings to approve the data center, developers were clear that they chose Enchanted Rock because of the “shorter lead time” and other benefits.85 Bridge-to-grid solutions enable large users to obtain power sooner while the system expands.
Most data centers employ similar backup systems that can be leveraged, just like Microsoft’s gas generation system or Enchanted Rock’s bridge-to-grid business. These are often powered by diesel, however, so they entail environmental trade-offs and often strict runtime limits. In a future where energy technologies evolve, these colocated energy assets will likely evolve as well. There are no fundamental reasons advanced nuclear, solar, and storage cannot serve as the basis for similar businesses in the future.86
Case Study: Base Power and Distributed Dispatchability
Base Power is advancing a distributed-energy model that turns networked home batteries into a flexible grid resource capable of responding to real-time system needs. The company installs residential batteries for Texas homeowners that provide backup power for up to 48 hours. When the batteries are not needed for backup, Base can orchestrate them to collectively discharge power back to the grid during peak demand. Because these batteries are quick to install, their orchestration creates fast-response capacity without waiting years for new generation or transmission to come online.87
Though a Base battery is only at one user’s home, the entire system benefits. By charging when electricity is inexpensive and exporting power when prices spike, Base helps smooth demand volatility and reduce system stress. For some systems, this could accommodate load growth. Importantly, this enables even inflexible demand to approximate flexibility by creating a market. That is, a data center that is always on can pay for the flexibility of other users. These kinds of voluntary exchanges are uncommon only in electricity policy.
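The economics behind this model can be sketched with a stylized single-battery arbitrage calculation. The battery size, round-trip efficiency, and prices below are hypothetical assumptions for illustration, not figures from Base Power:

```python
# Hypothetical single-battery arbitrage sketch. Battery size, efficiency,
# and prices are illustrative assumptions only.
battery_kwh = 25.0          # assumed usable home battery capacity
round_trip_efficiency = 0.90
charge_price = 20.0         # $/MWh in an off-peak hour (assumed)
discharge_price = 1000.0    # $/MWh in a scarcity hour (assumed)

mwh = battery_kwh / 1000.0
cost_to_charge = (mwh / round_trip_efficiency) * charge_price
revenue = mwh * discharge_price
margin = revenue - cost_to_charge
print(f"One full charge/discharge cycle earns roughly ${margin:.2f} per battery")
```

Aggregated across thousands of homes, margins of this kind are what fund the flexibility that inflexible loads, such as an always-on data center, can then pay for.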
Two primary barriers to testing the ability of distributed assets to enable load growth are worth mentioning. First, most electricity markets are not as friendly to Base’s business model. An open, competitive electricity market like Texas’s allows Base to bid back into the power market. Policymakers outside of Texas can consider updates to better enable distributed systems to compete and can learn from Texas’s institutional design.88 Second, red tape slows the deployment of the distributed energy assets that would then need to be knit together to create space for new loads. For rooftop solar paired with storage, roughly half of an installation’s costs are soft costs related to permitting and inspections that have nothing to do with the hardware itself.89 Solutions exist to give cities and states clear, simple processes for verifying proper installations; with the right improvements, permits can be issued in hours rather than weeks.90 Policies allowing inspections by qualified engineers and contractors in addition to city staff are promising and can build on common third-party programs across the country.91
Case Study: Emerald AI and Software-Driven Flexibility
Emerald AI is a flexibility management platform that orchestrates data center workloads in real time. This enables data centers to dial electricity use up or down in response to grid conditions, effectively turning a traditionally rigid load into a responsive grid resource. In a field demonstration conducted with Oracle in Arizona, Emerald’s platform orchestrated data centers to reduce their power consumption by 25% during a three-hour grid-stress event, while maintaining AI compute quality.92 The company is now scaling that concept through the 96-MW Aurora AI Factory in Virginia, a collaboration with NVIDIA, Digital Realty, and PJM designed to prove that large AI facilities can dynamically align with grid conditions.93 Emerald’s model, which is just one of many demand response commitments,94 can help unlock existing transmission capacity and potentially accelerate interconnection timelines for new digital infrastructure.
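As a back-of-envelope sense of scale, assuming (purely for illustration) that a 96-MW facility were drawing its full nameplate load, the arithmetic of a 25% curtailment over a three-hour event looks like this:

```python
# Back-of-envelope scale of a 25% curtailment at an assumed 96 MW draw.
# Assumes the facility consumes its full nameplate capacity; real demand varies.
facility_mw = 96.0
reduction_fraction = 0.25
event_hours = 3.0

relieved_mw = facility_mw * reduction_fraction   # instantaneous grid relief
relieved_mwh = relieved_mw * event_hours         # energy relief over the event
print(f"Grid relief: {relieved_mw:.0f} MW for {event_hours:.0f} h = {relieved_mwh:.0f} MWh")
```

Even a single facility of this assumed size would free roughly two dozen megawatts of capacity during a stress event, which is the kind of headroom that can speed interconnection for new loads.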
Flexibility Is Promising, But Not a Silver Bullet
In the face of potentially adding 166 gigawatts of new demand, flexibility options are more a tool in the toolbox than a silver bullet. Demand growth of this scale must be met by growing supply as well. Most of flexibility’s value comes from exploiting the headroom that exists in the system outside the rare peak days. The authors of a well-publicized early analysis of flexibility made this clear in the summary of their report:
This analysis should not be interpreted to suggest the United States can fully meet its near and medium-term electricity demands without building new peaking capacity or expanding the grid. Rather, it highlights that flexible load strategies can help tap existing headroom to more quickly integrate new loads, reduce the cost of capacity expansion, and enable greater focus on the highest-value investments in the electric power system.95
There are other limits on flexibility as well. First, many flexible solutions are not widely adopted. Second, as more users seek the capacity enabled by flexible operations, the underused capacity that enables them may dry up or be available for fewer hours. Third, certain flexibility resources, such as storage or DER aggregation, cannot sustain output indefinitely: a home battery will be drained within a few hours. Even gas supply must be firm and resilient to freezes or supply disruptions, and a similar challenge applies to diesel backup generators.
Fundamentally, flexibility is not always of interest to data centers themselves, just as many users do not value phone settings that time charging based on the grid’s supply. Even when it is of interest to data centers, it is not a free service, nor one they are expected to provide without compensation. Proposed flexibility rules in the PJM grid were roundly rejected because there seemed to be no upside for data centers. Grid operators and flexibility advocates should remember that there is no such thing as a free lunch.
Tools that leverage flexibility to add load faster to the existing system are clearly of interest to industry. They should be voluntary, and deploying them will require retooling existing rules and embracing innovation.
How to Enable Voluntary Flexible Demand
There are significant technical and regulatory obstacles to unlocking the full potential of large-load flexibility. Existing market rules and utility frameworks were designed for a 20th-century grid where power flowed in only one direction—from central power plants to passive consumers. Traditional flat electricity rates fail to communicate the real-time value of energy and the moments when the grid is most stressed. These flat-rate structures were logical in a system of centralized, dispatchable generation, but they are fundamentally misaligned with a modern grid characterized by variable renewables and dynamic demand. Without sharp price signals, there is no financial incentive for grid users to be flexible. Industrial users and data centers likely already face at least time-varying rates. Broadening these offerings to more consumers, including residential consumers, thickens the market for potential flexibility providers in addition to improving economic efficiency.96 Modernizing these structures is essential to enabling a more dynamic, multi-directional energy future.
Key barriers include:
- Restrictive interconnection rules. Current interconnection processes, designed for static loads, are a critical bottleneck that fails to recognize or reward the grid-stabilizing potential of flexible loads.97 These processes are often slow and costly and do not account for the grid-supporting capabilities of flexible loads.
- Outdated rate design. In the past, loads had few options when connecting to the grid. They were treated as firm and passive. This fails to capture how technological advances have expanded the range of services that loads can provide. In recognition of this, the Federal Energy Regulatory Commission ordered PJM operators to create new rules recognizing both opportunities for non-firm service and bridge-service, and to better account for behind-the-meter generation colocated with users.98 Similarly, the Department of Energy is requiring consideration and rulemaking around flexibility’s role in speed to power.99 FERC’s recent actions and the Department of Energy’s Advanced Notice of Proposed Rulemaking emphasize better price formation and demand-side participation, but must be done in a way that maintains the balance between state and federal oversight and reduces litigation risk.100
- Barriers to market participation. Rules governing participation in wholesale electricity markets can be complex, limiting or preventing flexible loads from offering their services. In many places, especially outside regional transmission operators, the barrier is a missing market. Where a monopoly utility is the only provider, it is a gatekeeper incentivized to build its own solutions rather than pay others for services.
- Restrictions on backup generation. Environmental permits and utility rules often strictly limit the number of hours that backup generators can operate. These rules are important for local air quality. At the same time, they can prevent on-site capacity from being used as a critical grid resource during system-wide emergencies. If backup generators run only in emergency situations, it is hard to object to greater reliance on them. Environmental rules should be set to balance these factors, and well-designed rules incentivize the development of backup technologies without environmental trade-offs.
Targeted policy reforms can effectively dismantle these barriers, paving the way for data centers and other large flexible loads to become a cornerstone of a more reliable and affordable grid. The rapid growth of data centers is one of the defining energy trends of the decade. Rather than treating this new load growth as an insurmountable problem, policymakers have a clear opportunity to harness the value of voluntary flexibility. Far from liabilities, data centers can be assets to grid reliability and affordability.101 To achieve this, state and federal policymakers should prioritize the following reforms:
- Expedite interconnection for flexible loads. Create expedited, preferential interconnection pathways for loads that commit to providing grid services or to reducing their grid power use during certain hours. This approach properly recognizes these facilities as a valuable resource rather than simply a burden on the system. It creates a carrot for data centers to experiment with flexibility. Texas, in 2025, considered proposals like this in House Bill 3970 and Senate Bill 1942.102 Similar proposals have been made in major grids across the country.103
- Modernize market rules. Enact reforms that allow flexible resources, including data centers, to seamlessly participate in all energy and ancillary service markets. Compensation should be based on performance, ensuring that resources are paid the full value of the reliability and economic benefits they provide to the grid.
- Reform utility rate structures. Promote the adoption of non-firm service options for loads, and dynamic rate designs (such as real-time pricing), which provide strong financial incentives for consumers to reduce consumption during peak hours. Such price signals are a fundamental element in every market.
- Update reliability and permitting rules. Review and update regulations to enable the dual use of on-site and backup generation as both facility reliability tools and grid assets. This requires updating North American Electric Reliability Corporation (NERC) rules and developing new tools as part of interconnection studies.
The staggering growth in power demand from data centers need not be a crisis for the American grid. While flexibility is not a silver bullet—and cannot replace the fundamental need for new generation and infrastructure—it is a promising tool in our toolbox. By moving beyond the 20th-century model of static loads from passive consumers, we can unlock a more dynamic system where data centers act as shock absorbers for the grid. This new system may reduce overall system costs. With the right updates, policymakers can turn data centers into assets for grid reliability.
Section V. Thinking Outside the Grid: The Consumer Regulated Electricity Alternative
Key Points
- Policymakers should come to terms with this stark reality: meeting the demands of the data center era requires thinking outside the legacy grid.
- Consumer Regulated Electricity would allow privately financed, physically islanded electric utilities to serve new, voluntary customers such as data centers or industrial facilities.
- Instead of competing through subsidies or narrow tax provisions, states can allow private infrastructure to form freely, letting developers finance, build, and operate their own islanded power systems without drawing on public funds.
For the first time in a generation, the defining characteristic in American energy policy is not fuel, technology, or capital. It is speed, and everyone is focused on speed to power. America cannot lead in AI, industrial revival, or economic growth if our electricity supply remains sluggish.
But what if America’s power grid is destined to be slow? For the past 20 years, overall electricity demand growth has been essentially flat (0.1% annual growth between 2005 and 2020)104, which masked the grid’s inability to move quickly. No one knew how lethargic the grid had become until we asked it to pick up the pace to accommodate the quick rise in demand from the data center industry.
The era of low and predictable growth has ended. By some estimates, electricity demand in the U.S. will increase by 50% between now and 2050105, and policymakers are desperate to find ways to meet the challenge. What this new era calls for is an electricity industry that (1) moves quickly enough to meet the needs of new customers and (2) avoids burdening American families and businesses with skyrocketing electricity rates.
Meeting these two goals simultaneously may not be possible under today’s electric utility paradigm. Changes to retail rate design106 and interconnection procedures107 are no doubt necessary, but they could prove insufficient to meet the moment, or the solutions may come too late. Policymakers should come to terms with this stark reality: meeting the demands of the data center era requires thinking outside the legacy grid.
Today’s Time Crunch
Something has to give. The electricity demand driven by AI and hyperscale data centers is arriving on timelines measured in months. State and federal regulators operate on timelines measured in years if not decades. Gigawatt-scale projects are being announced faster than traditional grid planners can study and permit them, not to mention the additional time it takes to build the new infrastructure once it’s been planned and permitted.
The legacy electricity grid—the interconnected system built piecemeal over the past century or more—moves slowly because it must. The traditional electricity model was designed to provide universal service, maintain high levels of reliability, and expand gradually under centralized planning and tight economic regulation. Protecting existing customers is important, which is why regulators are reluctant to make drastic changes to the shared grid.
Unfortunately, the institutional machinery created over the past century has become a massive obstacle to progress. Incremental policy improvements will help, but they cannot eliminate the fundamental mismatch between the slow-moving legacy power grid and the fast-paced customers who need massive amounts of electricity as soon as possible. America’s data center industry should look beyond reforming the old system and allow space for new systems to emerge.
Permission to Build Something New
One policy concept gaining traction in some quarters of the policy community is Consumer Regulated Electricity (CRE). It begins with a simple but powerful insight: the fastest way to accelerate investment in electricity may be to allow entrepreneurs to build outside the legacy regulatory structure—without imposing any risk on existing customers.
CRE would allow privately financed, physically islanded electric utilities to serve new, voluntary customers such as data centers or industrial facilities. Crucially, these systems would not: (1) interconnect to the incumbent grid, (2) rely on regulated rates, or (3) socialize costs to the public or increase blackout risks. They would operate through a purely voluntary and private structure—private contracts, private capital, and private risk.
This approach changes the political economy of electricity. CRE is not another attempt at deregulating the old grid. It is a permissionless approach to building completely new networks. CRE mirrors how America built nearly every other major network industry—including railroads, pipelines, fiber-optic broadband, and cloud computing. A CRE utility could move quickly because it wouldn’t have to stop and ask a regulator for permission to build.
The finer points of CRE are laid out in a recent Cato briefing paper.108 However, the gist is straightforward: if sophisticated customers want to move quickly by purchasing from private networks (and a potential CRE utility wants to provide them service), why should anyone hold them back? More importantly, what could we learn from these emerging systems, and what kinds of innovation might they bring? If there’s one thing the electricity sector needs to keep up with the hyperscalers, it’s a fresh dose of innovation.
Case Study: Is Utah a Glimpse into the Future?
The CRE concept may sound theoretical, but it’s already becoming a reality. In central Utah’s Millard County, a new model of data center infrastructure is taking shape—one that deliberately sidesteps the traditional electric grid by generating power on-site and scaling independently of utility interconnection constraints. The developer behind this effort is building a power-and-compute campus on 4,000 acres designed to grow from roughly 455 megawatts of self-generated capacity by late 2026 to as much as 12 gigawatts by 2032.109
By internalizing all of the infrastructure, the Utah campus can offer customers reliable power without shifting costs to local utility customers, introducing new reliability problems110 on the existing grid, or sacrificing a decade waiting for grid upgrades. As demand for AI computing grows, the Utah model could serve as a template for future off-grid data centers that enable rapid deployment while relieving pressure to expand the existing grid.111
From State Experiments to Federal Policy
State-level developments such as the Utah campus are unfolding alongside a broader federal conversation about speed to power.112 National policymakers increasingly recognize the need to serve data centers quickly and protect consumers.113 Recent efforts by the Department of Energy and the Federal Energy Regulatory Commission (FERC) are commendable, but federal pressure alone cannot rejuvenate a sclerotic grid.
Federal lawmakers’ approaches vary. Senator Cotton has embraced CRE and introduced a bill that would exempt electrically islanded utilities from the onerous regulations designed for the interconnected grid.114 In contrast, Senators Hawley and Blumenthal want to mandate that new data centers go off-grid.115 Across the ideological spectrum, data center skepticism is growing.
The big unknown is how the next presidential administration will treat the industry. Under the bill proposed by Senator Cotton, a CRE utility could function as a hedge against the political uncertainty in the federal executive branch.116 Investments in new infrastructure could proceed even if the next president or the next chairman of FERC wants to pull the plug on every new data center.
Taxpayer-Friendly Economic Development
In recent years, states have pursued data center development through targeted tax incentives, abatements, and subsidies designed to lure hyperscale facilities across state lines. By one count, 37 states offer tax incentives to data centers, which shift the state’s financial burden onto other taxpayers and compete with other uses of public funds.117
Hosting a CRE utility would reverse that logic. Instead of competing through public giveaways, states can compete by allowing private infrastructure to form freely, letting developers finance, build, and operate their own islanded power systems without drawing on public funds. CRE utilities would transform data-center recruitment from a fiscal liability into a market-driven opportunity that safeguards public budgets while enabling growth.
This shift matters because the scale of AI-era investment in both data centers and electricity supply is simply too large for traditional subsidy models to handle. Gigawatt-scale campuses can require billions of dollars in generation, transmission, and cooling infrastructure—costs that, under the conventional framework, are frequently socialized through tax policy or utility rate structures. CRE keeps the costs private while retaining the state’s tax revenues.
Instead of a race to offer the largest subsidy package, states would compete on more durable factors: regulatory openness, deployment speed, and certainty of power supply. Utah’s emerging off-grid model hints at this possibility—a future in which attracting next-generation industry depends less on public incentives and more on allowing entrepreneurs to deliver electricity abundance directly.
Empowering the Market
Policymakers in the United States now face a choice. We can attempt to force a twentieth-century system to meet twenty-first-century timelines. Or we can do what America has done at every great turning point in its history: allow innovators to build something new—faster, freer, and sufficiently abundant to power the next industrial revolution.
Section VI. AI, Water Rights, and an Opportunity for Property Rights
Key Points
- Building data centers in water-scarce areas may raise public concern, but there is no legal mechanism for diverting water without first obtaining a water right.
- The price of water in the West reflects the cost of delivery, not the scarcity of water. Stronger price signals would better incentivize consumers to reduce water use.
- By focusing on lowering the cost of water transfer, politicians can avoid writing policy carveouts that might be specific to a time and place but become irrelevant in the future.
According to a 2025 estimate from Axios, the number of data centers in the United States is projected to grow from 4,000 to 7,000—a 75% increase.118 This growth has not come without public concern. Tweets, Instagram posts, and news articles warn of the impending environmental consequences AI will bring. Among the foremost of these environmental concerns is water use. An article published by the Washington Post, for example, claims that a 100-word email penned by ChatGPT requires an entire bottle of water to produce.119
With growing concern about water scarcity, environmentalists are right to pay attention to new water demands.120 But water use, particularly its definition and quantification, is not as intuitive and straightforward as headlines make it seem. To understand the environmental impact of new data centers, specifically their water demand, we must examine three topics: how water is allocated, how water use is defined, and who uses water. As policymakers consider the expansion of hyperscalers in their respective states and districts, they should recognize the following principles:
- Legal systems already restrict data centers’ ability to divert water.
- Data centers account for only a small share of water use compared to other industries.
- Allowing water to be priced through transfers can resolve concerns about emerging water demands and incentivize water conservation.
New Demand, Old Demand, and Obtaining Water Rights
Much of the current policy debate around water allocation centers on how to meet new, increasing demand given expected supply variability.121 When new demands, such as AI, arise in water-scarce regions, they must obtain a water right before construction begins. This water right, issued under the Doctrine of Prior Appropriation, details strict rules about water diversion.
Water in the West cannot simply be diverted; rather, individuals who want to use water must file for an appropriative right, which will be granted only if there is water available within the region to be allocated. Once a right has been obtained, it is assigned a priority date that determines the water right’s “place in line.” In a year when the water supply is low, those with the most recent dates cannot divert or use their water. This allocation system ensures that those who were first in line to file for water are also first in line to receive water.
This system means that new demand, such as a data center, cannot immediately divert water upon entering a market. Data centers must go through an administrative process to obtain a water right, which an administrative agency will approve only if there is water to allocate; because of its junior priority, that new right will not affect existing water users. And even when there is water to allocate, a low priority date (for rights filed later) means new rightsholders may not be able to divert all the water they need.
Building data centers in water-scarce areas may raise public concern, but there is no legal mechanism for diverting water without first obtaining a water right. In fact, as the water rights system stands, there is an incentive against establishing water-intensive businesses. Data centers, no matter how in demand, must still follow these rules.
Understanding Water Use
When applying for a water right, data centers must specify the amount of water they anticipate using. While bottles of water are intuitive to the general public, those who use water as an input in production, such as farmers or city planners, think more in terms of acre-feet or rates. This difference is largely due to the quantity of water industries use; for example, one acre-foot of water is equivalent to roughly 326,000 gallons, or about 2,466,000 water bottles.122
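These conversions are easy to verify. The sketch below assumes the standard figure of 325,851 gallons per acre-foot and a 16.9-ounce single-serve bottle; both constants are assumptions of this illustration rather than figures taken from the report:

```python
# Unit conversions behind the acre-foot comparison.
# Assumptions of this sketch: the standard 325,851-gallon acre-foot
# and a 16.9-oz single-serve bottle.

GALLONS_PER_ACRE_FOOT = 325_851
OUNCES_PER_GALLON = 128
OUNCES_PER_BOTTLE = 16.9

def acre_feet_to_gallons(acre_feet: float) -> float:
    """Convert acre-feet to U.S. gallons."""
    return acre_feet * GALLONS_PER_ACRE_FOOT

def acre_feet_to_bottles(acre_feet: float) -> float:
    """Convert acre-feet to 16.9-oz bottle equivalents."""
    return acre_feet_to_gallons(acre_feet) * OUNCES_PER_GALLON / OUNCES_PER_BOTTLE

print(f"{acre_feet_to_gallons(1):,.0f} gallons")   # ~326,000 gallons
print(f"{acre_feet_to_bottles(1):,.0f} bottles")   # ~2.47 million bottles
```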
Many industries use water as an input. Cities use water for municipal demand, agriculture uses it for irrigation, and many recreational activities—like boating, fishing, and swimming—require water. However, even from this example, it is clear that not all three of these industries use water in the same way.
Water use is often discussed in three parts: diversion, consumptive use, and return flow. Diversion refers to the amount of water taken from a stream or well for a specific activity. As an example, a farmer may divert an acre-foot of water from a stream to irrigate his fields. Consumptive use is a portion of the diverted amount; it refers to the quantity of water consumed by an activity. This portion will vary based on the intensity of a particular activity. Water diverted to fill a pond is consumed only through evaporation, while a farmer irrigating his field consumes both the evaporated water and the water his crops absorb. Finally, the return flow is the amount of water that, once diverted and applied, is returned to the stream. It is directly related to consumptive use; in fact, it can be roughly described as return flow = diversion – consumptive use.
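The identity above takes only a few lines to sketch. The diversion amounts and consumptive-use fractions below are illustrative assumptions, not figures from the report:

```python
# Water-budget identity from the text: return_flow = diversion - consumptive_use.
# All numbers here are illustrative, not data from the report.

def water_budget(diversion_af: float, consumptive_fraction: float) -> tuple[float, float]:
    """Split a diversion (in acre-feet) into (consumptive use, return flow)."""
    consumed = diversion_af * consumptive_fraction
    return consumed, diversion_af - consumed

# Two activities can divert the same acre-foot yet have very different footprints:
crop_consumed, crop_returned = water_budget(1.0, 0.90)  # irrigation: crops absorb most of it
pond_consumed, pond_returned = water_budget(1.0, 0.05)  # pond: loses water only to evaporation

print(crop_consumed, crop_returned)
print(pond_consumed, pond_returned)
```

Two diversions of identical size can thus carry very different environmental footprints, which is why consumptive use, not diversion, is the more informative statistic.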
It is important to note that many water-use estimates describe diversion rather than consumptive use, even though consumptive use is one of the more relevant (but not the only) environmental factors. For instance, two activities could divert the same quantity of water; if the consumptive use is near zero, one activity has a negligible water footprint while the other has a higher footprint. This is particularly important when discussing data center water use. Recent technological advancements enable quantification of consumptive use. The most notable example of a measurement advancement is OpenET, which provides field-level estimates of water use based on satellite data.123
Comparing Water Use
According to a report from Lawrence Berkeley National Laboratory, data centers consumed 21.2 billion liters—about 17,000 acre-feet—of water in 2022.124 By 2024, that number had grown to 66.6 billion liters, or roughly 53,000 acre-feet. Projected growth implies that data centers could demand up to 124 billion liters of water—about 100,000 acre-feet. While these numbers appear large, they are relatively meaningless without industry context. Who else uses water, and how much do they use?
Agriculture uses 74% of water in the American West, making it a useful comparison.125 Many farms in the West grow alfalfa or corn, which require between 3 and 5 acre-feet of diverted water per acre planted.126 Once applied, consumptive use for these crops can reach 97%, meaning that, for every four acre-feet of applied water, nearly all is consumed.127
In California, for example, the average farm is 380 acres, and there are roughly 63,000 farms.128 If consumptive use were conservatively estimated at two acre-feet per acre planted, a back-of-the-envelope total for consumptive water use during the California growing season would be about 47.8 million acre-feet.129 This number is 478 times larger than the largest predicted water use for data centers. Other researchers, among them Brian Potter at the Institute for Progress, have used different measurements to suggest that industries such as mining and forest products have greater water use than the data center industry.130
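The back-of-the-envelope arithmetic above is easy to reproduce. The farm figures and the 124-billion-liter projection come from the text; the liters-per-acre-foot constant (roughly 1.233 million) is a standard conversion this sketch assumes:

```python
# Back-of-the-envelope comparison from the text: California agricultural
# consumptive use vs. projected data center water demand.
# The ~1.233-million-liter acre-foot is a standard conversion assumed here.

LITERS_PER_ACRE_FOOT = 1_233_482

# California agriculture (figures from the text)
farms = 63_000
acres_per_farm = 380
consumptive_af_per_acre = 2            # conservative estimate from the text
ag_use_af = farms * acres_per_farm * consumptive_af_per_acre
print(f"Agriculture: {ag_use_af:,} acre-feet")      # 47,880,000 acre-feet

# Projected data center demand: 124 billion liters
dc_use_af = 124e9 / LITERS_PER_ACRE_FOOT
print(f"Data centers: {dc_use_af:,.0f} acre-feet")  # ~100,000 acre-feet

# Agriculture's consumptive use dwarfs even the high-end data center projection
print(f"Ratio: ~{ag_use_af / dc_use_af:.0f}x")
```

Using the exact liter conversion rather than the rounded 100,000 acre-feet yields a ratio slightly under the 478x cited in the text, but the order-of-magnitude conclusion is unchanged.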
Data centers can range from zero to 80% consumptive use, depending on the cooling system adopted.131 Some cooling systems rely heavily on evaporation, meaning much of the water diverted is released into the air and never returned to the source. On the other hand, systems that use an air chiller or are closed-loop divert water initially, then keep it within the system, allowing it to be returned to the stream.132 Unlike agriculture, data centers can sharply reduce water use but require an incentive to do so. If water use is a large portion of an operating budget, data centers have an incentive to adopt technologies that use less water or to invest in developing those technologies. But the price of water in the West reflects the cost of delivery, not the scarcity of water. Moreover, data centers that reduce their water use over time would be penalized by the Doctrine of Prior Appropriation. Stronger price signals, on the other hand, would better incentivize consumers to reduce water use.
Identifying the Source: The Need for Price Signals
Prices incentivize conservation and innovation across industries. Companies routinely find ways to make their products more material-efficient or invest in energy-efficient technologies because it costs less. Water markets, on the other hand, can be trickier due to the existing legal system.
First, water prices have not emerged because water is difficult to transfer.133 Prior appropriation requires that water be delivered in order of seniority—the place in line—not by spatial location. This condition means that any individual transferring water must prove that no one else’s place in line is affected by that transfer. This non-injury condition often triggers court proceedings and substantial administrative burdens for those seeking to transfer water.
Second, prior appropriation also requires that water be put to a consistent and beneficial use; rightsholders who fail to use their water will have their water rights reduced without compensation. In many states, water conservation is not considered a beneficial use. Water rightsholders often refer to this condition as the “use-it-or-lose-it” clause, which sharply disincentivizes water conservation.134
The problem of new water demands is not specific to data centers, though. In fact, this debate has circulated for decades: articles have critiqued farmers for growing alfalfa in the desert, cities for being in desert regions, and golf courses for irrigating during droughts.135 In essence, all of these complaints are about why water is being used for things that do not appear socially valuable. But blaming data centers for “draining the West” is similar to blaming farmers for growing alfalfa in the West; both groups are responding to trade-offs. As one article noted, water in the West is cheaper than electricity. Data centers face a trade-off: use less water and more electricity, or use less electricity and more water.136
Trade-offs, though, require prices. And prices emerge only through transfer. Water markets would address many of the concerns about water quantity and quality surrounding data centers.137 Instead of filing for new diversion rights, data centers would be required to purchase existing water rights to use for their operations.
This exchange means that, if they require large quantities of water, they would have to pay the market price. If they purchase water in a water-scarce region, it will necessarily cost more due to low supply. Additionally, an active water market would also incentivize water conservation efforts among those who stand to gain the most. Instead of increasing the total diversions allowed and cutting users off when there is not enough, transfers encourage new demanders to purchase or lease existing water rights.
Water markets offer a further benefit: they address concerns about the quality of the water data centers return to streams. As water rights are traded, there is a price incentive to distinguish the quality of available rights among sellers. This incentive means that both water rights sellers and existing rightsholders could view protecting water quality and temperature as an investment. Low-quality or warmer return flow could make those water rights less valuable on the market.
Policy Fixes to Establish Better Water Markets
Data centers and the AI boom are not draining the West.138 In water-scarce regions, data centers must still apply for water rights, meaning water cannot be “stolen.” When comparing the water use of data centers to agriculture, which accounts for 74% of water use in the West, data centers are not only more efficient but also have greater potential for efficiency gains. To reduce AI’s water footprint, businesses need the right incentives.
Public outcry over water use is nothing new, and the West has an opportunity to address an age-old concern. The existing system of prior appropriation is not flexible enough to address the dynamics of new demand or climate variability, but it can be reformed.
Establish water conservation as a beneficial use. Industries can be incentivized to implement water conservation by removing the penalty for reducing use. Water rightsholders who conserve water over time would retain ownership of their rights and would not risk losing them for non-use. Utah, in its efforts to restore the Great Salt Lake, has expanded its definition of beneficial use to include conservation efforts.139
Expedite consumptive use transfers. It is not enough to permit water conservation. Conservation needs an incentive, and prices create it. While the non-injury requirement is unlikely to change, consumptive use transfers, by definition, have no impact on return flow and thus do not trigger injury to other users. Expediting consumptive use transfers allows companies that conserve water to profit from it.140
By focusing on lowering the cost of water transfer, politicians can avoid writing policy carveouts that might be specific to a time and place but become irrelevant in the future. While data centers may be socially controversial today, that doesn’t mean they will be in a decade. The West’s water challenges predate AI and will outlast any single industry. Reforming prior appropriation to allow conservation and transfers is not a concession to the technology sector—it is long overdue. Data centers are simply the latest reminder.
Conclusion
The United States stands at an inflection point. Electricity demand from AI, advanced manufacturing, and broad-based electrification is accelerating. Yet, the policies governing how the private sector builds and connects energy infrastructure remain rooted in a slower, centralized, and bureaucratic system. As this paper demonstrates, the core problem is not a lack of resources, capability, motivated capital, or potentially game-changing technologies. Instead, it is a system that rewards delay, discourages innovation, and constrains supply. Lengthy permitting timelines, interconnection bottlenecks, and misaligned incentives are raising costs and heightening reliability risks at precisely the moment the country needs speed, scale, and flexibility. The solution is not to micromanage which technologies win, but to remove the policy barriers that prevent markets from delivering affordable, dependable power to American households and businesses.
Policymakers should pursue comprehensive, durable reforms that restore competition, align incentives, and allow both established and emerging energy solutions to compete on their merits. That means modernizing environmental reviews, fixing interconnection processes, embracing grid-enhancing technologies, and enabling large flexible loads like data centers to become assets rather than liabilities for the grid. Implemented properly, these reforms can unlock faster infrastructure deployment, strengthen reliability, and put downward pressure on electricity costs while supporting economic growth. The alternative is continued regulatory whiplash, managed scarcity, and rising consumer bills. America has the resources, the capital, and the ingenuity to meet the moment. What it needs now is a policy framework that lets the energy system move at the speed of power.
1 See, for example, Andrew Wilford, “Youngkin Shouldn’t Buy a Bad Stadium Deal with a Bad Budget Deal,” Daily Caller, February 29, 2024, WILFORD: Youngkin Shouldn’t Buy A Bad Stadium Deal With A Bad Budget Deal | The Daily Caller; David McGarry, “Stadium Subsidies are the New York Mets of Public Policy,” Taxpayers Protection Alliance, June 15, 2023, Stadium Subsidies are the New York Mets of Public Policy - Taxpayers Protection Alliance; and Heywood Sanders, “Space Available: The Realities of Convention Centers as Economic Development Strategy,” Brookings Institution, January 2005, 20050117_conventioncenters.pdf.
2 One study from PwC, for instance, asserts that 6 indirect or supply chain jobs are supported by each direct job connected to data centers. While this methodology has critics and defensible concerns, more important (and less deniable) is the overall trend PwC notes: direct data center employment has grown by roughly 50% between 2017 and 2023, compared to just 10% employment growth in the overall U.S. economy. See the PwC study from February 2025 entitled “Economic Contributions of Data Centers in the United States,” commissioned by several firms in the data center space, at: https://www.datacentercoalition.org/reports-and-publications.
3 See, for example, Narrative Strategies Poll, “Nexus Pulse: Voter Perceptions on Data Centers,” released March 26, 2026, Nexus Pulse - March 2026 | Voter Perceptions on Data Centers. Among the findings: “Most Americans view data centers through the lens of economic development, IT infrastructure, and community planning, not politics, indicating the narrative is still forming. Voters are four times more likely to say data centers are a technology and economic issue (39%) than a political one (9%);” and “82% of Americans report a more positive attitude toward data centers when informed about job creation and community investment.”
4 See, for example, Maria Koklanaris, “States Eye Repeal of Costly Data Center Tax Breaks,” Law 360, March 18, 2026, States Eye Repeal Of Costly Data Center Tax Breaks - Law360 Tax Authority;
5 See, for example, Jared Walczak, “State Taxation of Data Centers,” Tax Foundation, December 2025, FF871-1.pdf.
6 See, for example, Eric Schmid, “Franklin County Hits Pause on Data Centers after Tense Meeting with Overflow Crowd,” St. Louis Magazine, January 21, 2026, Franklin County hits pause on data centers after tense meeting| St. Louis Magazine; John Gerding, “What’s next for Franklin County Data Center Proposals,” Spectrum News, March 20, 2026, What’s next for Franklin County Data Center proposals; and William Carroll, “Significant tax revenues totaling over $77 million projected for data center project,” Warrenton County Record, January 30, 2026, Significant tax revenues totaling over $77 million projected for data center project - Warren County Record.
7 See, the White House: “President Donald J. Trump Advances Energy Affordability with the Ratepayer Protection Pledge,” March 4, 2026, Fact Sheet: President Donald J. Trump Advances Energy Affordability with the Ratepayer Protection Pledge – The White House.
8 See Joe Bishop-Henchman, Demian Brady, Debbie Jennings, Leah Vukmir, Andrew Wilford, Jess Ward, Matthew Putnam, and Mattias Gugel, “What Happens If the 2017 Tax Cuts Expire? A State-by-State Analysis,” National Taxpayers Union Foundation Issue Brief, May 1. 2025, What Happens If the 2017 Tax Cuts Expire? A State-by-State Analysis - Foundation - National Taxpayers Union.
9 See, for example, Matthew Putnam, “Truth in Taxation: A Solution to the Growing Property Tax Problem,” National Taxpayers Union Foundation Issue Brief, July 2, 2025, Truth in Taxation: A Solution to the Growing Property Tax Problem - Foundation - National Taxpayers Union.
10 See, for example, Julia Shapero, “Sanders, Ocasio-Cortez unveil bill to halt data center construction, The Hill, March 25, 2026, Bernie Sanders and AOC propose AI data center moratorium bill; and Amelia Davidson, “Trump wants to move on data centers. Not so much Congress,” Politico, February 27, 2026, Trump wants to move on data centers. Not so much Congress. - POLITICO.
11 See, for example, Stanford University’s Human-Centered Artificial Intelligence 2025 AI Index at: https://hai.stanford.edu/ai-index/2025-ai-index-report/economy.
12 Brendan Bordelon and Gabby Miller, “The Tech Industry’s Political Maze on AI Data Centers,” POLITICO, February 6, 2026, https://www.politico.com/news/2026/02/06/tech-industry-ai-data-centers-politics-00762348
13 The Washington Post, “Power shutoffs surge, electric bills rise,” November 24, 2025, https://www.washingtonpost.com/business/2025/11/24/power-shutoffs-surge-electric-bills/
14 Ibid.
15 U.S. Energy Information Administration, “In 2020, 27% of U.S. households had difficulty meeting their energy needs,” Today in Energy, April 11, 2022, https://www.eia.gov/todayinenergy/detail.php?id=51979
16 U.S. Energy Information Administration, “Electricity explained,” accessed 2025, https://www.eia.gov/todayinenergy/detail.php?id=65264.
17 Grid Strategies, *National Load Growth Report 2025*, 2025, https://gridstrategiesllc.com/wp-content/uploads/Grid-Strategies-National-Load-Growth-Report-2025.pdf.
18 Ibid.
19 Paul Ciampoli, “U.S. Utility Large Load Commitments Reach 160 GW Amid Unprecedented PJM Demand Surge: Report,” American Public Power Association, October 29, 2025, Public Power, https://www.publicpower.org/periodical/article/us-utility-large-load-commitments-reach-160-gw-amid-unprecedented-pjm-demand-surge-report
20 Ryan Wiser et al., Factors Influencing Recent Trends in Retail Electricity Prices in the United States Lawrence Berkeley National Laboratory & The Brattle Group, October 2025, https://eta-publications.lbl.gov/sites/default/files/2025-10/full_summary_retail_price_trends_drivers.pdf. An update released in April 2026 confirms the trend has continued, to include last year: https://eta-publications.lbl.gov/sites/default/files/2026-03/retail_price_trends_2026_edition.pdf.
21 Ibid.
22 NRG Energy, “High power bills got you down?,” policy briefing, 2025, https://www.nrg.com/assets/documents/energy-policy/dlcc-briefing-high-power-bills-got-you-down-121225.pdf.
23 Lawrence Berkeley National Laboratory, “Retail Electricity Price Trends and Drivers,” October 2025, https://eta-publications.lbl.gov/sites/default/files/2025-10/full_summary_retail_price_trends_drivers.pdf.
24 Ibid.
25 Roland, Samuel, and Daniel King. Grid Policy for the AI Demand Surge. Foundation for American Innovation, 2026, pg. 26-27, https://www.thefai.org/posts/grid-policy-for-the-ai-demand-surge.
26 Lawrence Berkeley National Laboratory, *Queued Up: Interconnection Queue Trends 2025*, December 15, 2025, https://eta-publications.lbl.gov/sites/default/files/2025-12/queued_up_2025_edition_12.15.2025.pdf.
27 Ibid.
28 C3 Solutions, Testimony of Nick Loris before the U.S. House of Representatives, September 2025, https://c3solutions.org/wp-content/uploads/2025/09/Loris_HR4776_Testimony_FINAL.pdf.
29 National Petroleum Council, Permitting Reform Report, 2025, https://permitting.npc.org/files/2025_Permitting_Report.pdf.
30 Ibid.
31 McKinsey & Company, “Unlocking US Federal Permitting: A Sustainable Growth Imperative,” July 28, 2025, https://www.mckinsey.com/industries/public-sector/our-insights/unlocking-us-federal-permitting-a-sustainable-growth-imperative
32 Devin Hartman et al., Comments by the R Street Institute on Improvements to Generator Interconnection Procedures and Agreements, R Street Institute, October 13, 2022, https://www.rstreet.org/wp-content/uploads/2022/10/Comments-by-the-R-Street-Institute-on-Improvements-to-Generator-Interconnection-Procedures-and-Agreements.pdf
33 National Petroleum Council, Permitting Reform Report, 2025, https://permitting.npc.org/files/2025_Permitting_Report.pdf.
34 Ibid.
35 U.S. Energy Information Administration, “Electric power sector emissions,” accessed 2025, https://www.eia.gov/todayinenergy/detail.php?id=65744.
36 Hammad, Omar M. Air Quality: EPA’s 2023 Proposed Changes to the Particulate Matter (PM) Standard. Congressional Research Service Report R47652, August 16, 2023, https://www.congress.gov/crs-product/R47652.
37 Lawrence Berkeley National Laboratory, Queued Up: Interconnection Queue Trends 2025, December 15, 2025, and Ryan Wiser et al., “Factors Influencing Recent Trends in Retail Electricity Prices in the United States,” The Electricity Journal 38, no. 4 (October 2025), https://emp.lbl.gov/publications/factors-influencing-recent-trends.
38 Tax Foundation, “Section 232 Tariffs on Steel and Aluminum,” 2024, https://taxfoundation.org/research/all/federal/section-232-tariffs-steel-aluminum-2024/.
39 PwC, “US Reciprocal and Current Tariffs: Potential Impact for the Energy, Utilities, and Resources Industry,” May 9, 2025, https://www.pwc.com/us/en/services/tax/library/pwc-us-tariff-industry-analysis-energy-utilities-resources.html
40 Mary Amiti, Chris Flanagan, Sebastian Heise, and David E. Weinstein, “Who Is Paying for the 2025 U.S. Tariffs?” Federal Reserve Bank of New York, Liberty Street Economics, February 12, 2026, https://libertystreeteconomics.newyorkfed.org/2026/02/who-is-paying-for-the-2025-u-s-tariffs/
41 Grid Strategies, National Load Growth Report 2025, https://gridstrategiesllc.com/wp-content/uploads/Grid-Strategies-National-Load-Growth-Report-2025.pdf.
42 See, for example, Johannes Pfeifenberger, Interregional Transmission Planning with HVDC, The Brattle Group, March 2024, https://www.brattle.com/wp-content/uploads/2024/03/Interregional-Transmission-Planning-with-HVDC.pdf
43 Nick Loris, “From Commitment to Investment: Amazon Bets Big on Advanced Nuclear,” C3 Newsmag, October 18, 2025, https://c3newsmag.com/from-commitment-to-investment-amazon-bets-big-on-advanced-nuclear/.
44 Jonathan Mattise, “Utility to Buy Power from Advanced Nuclear Plant to Fuel Tennessee and Alabama Google Data Centers,” Associated Press, August 18, 2025, TVA to power Google data centers with advanced nuclear plant | AP News.
45 Utility Dive, “Data centers raise grid reliability concerns,” 2025, https://www.utilitydive.com/news/data-center-grid-reliability-ferc-nerc/803467/.
46 See “The Capacity Commitment Framework, the-capacity-commitment-framework-ccf-jan.pdf.
47 See, Amanda Peterson Corio, “Supporting the White House Ratepayer Protection Pledge: Google’s approach for responsible energy growth,” March 4, 2026, Google’s approach for responsible energy growth. Google was also a pioneer in expanding the concept of demand response as a reaction to high energy prices in Europe during late 2022 and 2023. See, for example, Varun Mehra and Raiden Hasegawa, “Supporting power grids with demand response at Google data centers,” October 3, 2023, Using demand response to reduce data center power consumption | Google Cloud Blog.
48 Dominion Energy and Duke Energy, “Dominion Energy and Duke Energy Cancel the Atlantic Coast Pipeline,” press release, July 5, 2020; Federal Energy Regulatory Commission, Atlantic Coast Pipeline, LLC, Certificate Order, 161 FERC ¶ 61,042 (October 13, 2017); Russell Gold and Dan Molinski, “Atlantic Coast Pipeline Is Canceled as Costs Mount,” Wall Street Journal, July 5, 2020
49 Federal Energy Regulatory Commission, Constitution Pipeline Company, LLC, Order Issuing Certificate and Approving Abandonment, 154 FERC 61,046 (January 28, 2016); New York State Department of Environmental Conservation, “DEC Denies Water Quality Certification for Constitution Pipeline,” April 22, 2016; Williams Companies, “Williams Announces Cancellation of Constitution Pipeline Project,” press release, February 24, 2020.
50 “Jones Act Forces New England to Import Gas From Russia,” Global Trade Magazine, January 24, 2018, https://www.globaltrademag.com/jones-act-forces-new-england-import-gas-russia/
51 U.S. Energy Information Administration, “Petroleum Electricity Generation Surpassed Natural Gas in New England During Winter Storm,” Today in Energy, January 29, 2026, https://www.eia.gov/todayinenergy/detail.php?id=67104
52 New Hampshire Site Evaluation Committee, Order Denying Application for Certificate of Site and Facility for Northern Pass Transmission LLC, February 1, 2018; Appeal of Northern Pass Transmission LLC, 173 N.H. 141 (New Hampshire Supreme Court, July 19, 2019); Eversource Energy, “Eversource Comments on New Hampshire Supreme Court Decision Regarding Northern Pass,” investor statement, July 25, 2019.
53 H.R. 4776, SPEED Act, 119th Cong. (2025), https://www.congress.gov/bill/119th-congress/house-bill/4776; H.R. 3898, Promoting Efficient Review for Modern Infrastructure Today Act (PERMIT Act), 119th Cong. (2025), https://www.congress.gov/bill/119th-congress/house-bill/3898; and U.S. House Committee on Energy and Commerce, “Full Committee Markup Recap: E&C Advances 11 Bills to the Full House of Representatives,” January 21, 2026, https://energycommerce.house.gov/posts/full-committee-markup-recap-e-and-c-advances-11-bills-to-the-full-house-of-representatives
54 Nick Loris, Testimony before the House Committee on Natural Resources, Subcommittee on Energy and Mineral Resources, hearing on H.R. 4776, the “Standardizing Permitting and Expediting Economic Development (SPEED) Act,” September 10, 2025, C3 Solutions, https://c3solutions.org/wp-content/uploads/2025/09/Loris_HR4776_Testimony_FINAL.pdf
55 Promoting Efficient Review for Modern Infrastructure Today Act (PERMIT Act), H.R. 3898, 119th Cong. (2025), https://www.congress.gov/bill/119th-congress/house-bill/3898; Samuel Roland and Daniel King, Grid Policy for the AI Demand Surge, Foundation for American Innovation, 2026, pp. 31–32, https://www.thefai.org/posts/grid-policy-for-the-ai-demand-surge.
56 Daren Bakst, “Modernizing Air Regulation,” Competitive Enterprise Institute, March 6, 2025, https://cei.org/publication/epa-modernizing-air-regulation/
57 Nick Loris, “The Clean Air Act Needs a Regulatory Face-Lift,” The National Interest, December 9, 2025, https://nationalinterest.org/blog/energy-world/the-clean-air-act-needs-a-regulatory-face-lift
58 Arthur G. Fraas, John D. Graham, and Jeffrey Holmstead, “EPA’s New Source Review Program: Time for Reform?” Environmental Law Reporter 47, no. 1 (January 2017), https://www.rff.org/publications/journal-articles/epas-new-source-review-program-time-for-reform/
59 Property and Environment Research Center (PERC), A Field Guide for Wildlife Recovery (Bozeman, MT: PERC, September 2023), https://www.perc.org/wp-content/uploads/2023/09/PERC_Field-Guide-for-Wildlife-Recovery.pdf
60 Cecilia Fassett, Modernizing Section 106 of the National Historic Preservation Act: Restoring Clarity, Predictability, and Purpose to the Process (Washington, DC: C3 Solutions, April 2026), https://c3solutions.org/wp-content/uploads/2026/04/NHPA-WhitePaper-v4.pdf
61 “Gov. Cox Signs Executive Order to Streamline Permitting and Empower Utahns to Build a Future of Abundance,” Office of the Governor, State of Utah, press release, December 16, 2024, https://governor.utah.gov/press/gov-cox-signs-executive-order-to-streamline-permitting-and-empower-utahns-to-build-a-future-of-abundance/
62 Devin Hartman, “The Case for Easing Regulations on Electricity Generation,” R Street Institute, July 30, 2025, originally published in Governing, https://www.rstreet.org/commentary/the-case-for-easing-regulations-on-electricity-generation/; Devin Hartman, “How to Liberate Electric Power.” National Affairs, 2025. https://www.nationalaffairs.com/publications/detail/how-to-liberate-electric-power.
63 Carlos Martinez, “Who Signs Off on Major Rules? New Push in Congress Could Shift Power from Bureaucrats,” National Taxpayers Union Foundation, July 10, 2025, https://www.ntu.org/foundation/detail/who-signs-off-on-major-rules-new-push-in-congress-could-shift-power-from-bureaucrats.
64 Vinson & Elkins LLP, “Data Centers and the Grid,” 2024, https://www.vnf.com/1103.
65 Jennifer Chen and Devin Hartman, “Why Wholesale Market Benefits Are Not Always Apparent in Customer Bills,” R Street Institute, November 10, 2021, https://www.rstreet.org/commentary/why-wholesale-market-benefits-are-not-always-apparent-in-customer-bills/
66 Michael Giberson and Devin Hartman, Electric Paradigms: Competitive Structures Benefit Consumers, R Street Policy Study No. 293: R Street Institute, September 2023, https://www.rstreet.org/wp-content/uploads/2023/09/FINAL_r-street-policy-study-no-293.pdf
67 Chris Villarreal, Kent Chandler, and Michael Giberson, State-By-State Scorecard on Electricity Competition, R Street Institute, May 22, 2025, https://www.rstreet.org/research/state-by-state-scorecard-on-electricity-competition/
68 Eric Olson, Jack Dorminey, and Jason M. Walter, “The Hidden Tax on Your Power Bill: Construction Work in Progress,” Manhattan Institute, December 4, 2025, https://manhattan.institute/article/the-hidden-tax-on-your-power-bill-construction-work-in-progress
69 Ibid.
70 Spencer Toohill, Guide to NRC Application Data Requirements for State Implementers (Oakland, CA: Breakthrough Institute, October 2025), https://thebreakthrough.imgix.net/pdfs/BTI-Guide-to-NRC-Application-Data-Requirements-for-State-Implementers.pdf
71 U.S. Nuclear Regulatory Commission, “Agreement State Program,” last modified February 15, 2023, https://www.nrc.gov/about-nrc/state-tribal/agreement-states
72 Devin Hartman, Kent Chandler, and Beth Garza, Twelve Policy Priorities to Secure Bulk Electric Reliability, R Street Policy Study No. 322, R Street Institute, May 13, 2025, https://www.rstreet.org/research/twelve-policy-priorities-to-secure-bulk-electric-reliability
73 IEEE, “High-Power Data Center Integration,” 2022, https://ieeexplore.ieee.org/document/9999040/; U.S. Energy Information Administration, “Electricity explained,” https://www.eia.gov/todayinenergy/detail.php?id=63344.
74 This “strike price” varies by miner, but is often informally referenced as between $75 and $150 per megawatt-hour.
75 Turner Loesel and Josh T. Smith. Digital Foundations: The Essential Guide to Data Centers and Their Growth. James Madison Institute, 2025, https://jamesmadison.org/digital-foundations-the-essential-guide-to-data-centers-and-their-growth/
76 Tyler H. Norris, Tim Profeta, Dalia Patino-Echeverri, and Adam Cowie-Haskell. 2025. Rethinking Load Growth: Assessing the Potential for Integration of Large Flexible Loads in US Power Systems. Duke University Nicholas Institute for Energy, Environment, and Sustainability. https://nicholasinstitute.duke.edu/publications/rethinking-load-growth.
77 Seasonal concerns matter, but can be deceptive. Unexpected peaks can also occur during mild “shoulder seasons” if generators are down for maintenance on an unexpectedly hot or cold day. If reserves drop too low, grid operators may need to issue conservation requests or even enter emergency operations; both raise costs for consumers. One example is the September 6, 2023, event in Texas, when an unusually hot day with low wind production and a setting sun left evening reserves low, triggering a conservation request and emergency operations. The real lesson is that many forms of flexibility have value because grid operators must prepare for many failure cases to keep the lights on.
78 In fact, a recent Lawrence Berkeley National Laboratory paper shows that areas with load growth saw falling retail rates, rather than rising retail rates: https://emp.lbl.gov/publications/factors-influencing-recent-trends.
79 For example, Texas Bitcoin miners participate in a controllable load resource program. Grid operators treat them as competitors with generation and can dispatch them to serve grid needs more economically. These updates were supported by miners and other industries interested in using loads as resources. Lee Bratcher, “Texas Blockchain Council Announces Alignment with Grid Reliability Standards,” Texas Blockchain Council, 2025, https://texasblockchaincouncil.org/blog/texas-blockchain-council-announces-alignment-with-grid-reliability-standards; “Load Resource Participation in the ERCOT Market,” https://www.ercot.com/services/programs/load/laar
80 Brad Smith, “With Our Latest Energy Deal, Microsoft’s Cheyenne Datacenter Will Now Be Powered Entirely by Wind Energy, Keeping Us on Course to Build a Greener, More Responsible Cloud,” Microsoft on the Issues (blog), November 14, 2016, https://blogs.microsoft.com/on-the-issues/2016/11/14/latest-energy-deal-microsofts-cheyenne-datacenter-will-now-powered-entirely-wind-energy-keeping-us-course-build-greener-responsible-cloud/
81 Josh T. Smith (@smithtjosh), “This is a great @CatalystPod episode on the kinds of regulatory innovations needed to unlock flexible load . . .,” X (formerly Twitter), July 17, 2024, https://x.com/smithtjosh/status/1813977990155682139
82 Bridge to Grid: What Is Happening to Grid Power for America’s New Large-Load Customers? Enchanted Rock, 2025, https://enchantedrock.com/bridge-to-grid/
83 Customers also benefit because the systems are kept in working order at all times to participate in these markets. Unlike, say, your snowblower that hasn’t been touched since last season and doesn’t start.
84 Sara Sanchez, “Gas plant proposed to power data center,” El Paso Inc., January 25, 2026, https://www.elpasoinc.com/news/local_news/gas-plant-proposed-to-power-data-center/article_f112081b-a884-4a8d-a1e9-199db4b00fbf.html.
85 “El Paso Electric Company’s Response to City of El Paso’s First Requests for Information Question Nos. CEP1-1 Through CEP 1-32,” February 19, 2026, https://interchange.puc.texas.gov/Documents/59076_46_1589383.PDF. Public details of Meta’s project with El Paso Electric are available via the Public Utility Commission of Texas’s docket 59076: https://interchange.puc.texas.gov/search/filings/?UtilityType=E&ControlNumber=59076&ItemMatch=Equal&DocumentType=ALL&SortBy=FilingParty&SortOrder=Descending
86 Some see off-grid and colocated solar, storage, and gas data centers as already economical when compared to other strategies. See: https://www.offgridai.us/.
87 Base Power, “About,” accessed February 2026, https://www.basepowercompany.com/about.
88 Laura Lynne Kiesling and Andrew N. Kleit, eds., Electricity Restructuring: The Texas Story (AEI Press, 2009). The Texas grid operator also facilitates working groups around important topics, like leveraging demand-side resources. See: https://www.ercot.com/committees.
89 See figure ES-5: David Feldman, Vignesh Ramasamy, Ran Fu, Ashwin Ramdas, Jal Desai, and Robert Margolis, “U.S. Solar Photovoltaic System and Energy Storage Cost Benchmark: Q1 2020,” National Renewable Energy Laboratory, https://docs.nrel.gov/docs/fy21osti/77324.pdf
90 “Solar Permitting, Inspection, and Interconnection Timelines,” National Renewable Energy Laboratory, https://www.nrel.gov/solar/market-research-analysis/permitting-inspection-interconnection-timelines
91 Florida, Texas, New Hampshire, and Tennessee all have such policies. As an example, the Manhattan Institute’s model bill is one option: “A Model Bill to Allow Independent Permitting and Inspections,” https://manhattan.institute/article/a-model-bill-to-allow-independent-permitting-and-inspections
92 Maeve Allsup, “Nvidia and Oracle Tapped This Startup to Flex a Phoenix Data Center,” Latitude Media, July 1, 2025, https://www.latitudemedia.com/news/nvidia-and-oracle-tapped-this-startup-to-flex-a-phoenix-data-center/
93 Paul Ciampoli, “NVIDIA, Emerald AI, EPRI, PJM and Others to Develop Power-Flexible AI Factory,” Public Power, October 30, 2025, https://www.publicpower.org/periodical/article/nvidia-emerald-ai-epri-pjm-and-others-develop-power-flexible-ai-factory.
94 For example, in March 2026, Google announced that its long-term contracts with Entergy Arkansas, Minnesota Power, Indiana Michigan Power, and other utilities had reached a combined 1 gigawatt of data center demand response, “helping to stabilize the grid during certain hours or times of the year.” See Michael Terrell, “A new milestone for smart, affordable electricity growth,” March 19, 2026, https://blog.google/innovation-and-ai/infrastructure-and-cloud/global-network/demand-response-data-center-milestone/.
95 Tyler H. Norris, Tim Profeta, Dalia Patino-Echeverri, and Adam Cowie-Haskell. 2025. Rethinking Load Growth: Assessing the Potential for Integration of Large Flexible Loads in US Power Systems. Duke University Nicholas Institute for Energy, Environment, and Sustainability, page 3, https://nicholasinstitute.duke.edu/publications/rethinking-load-growth.
96 Kiesling, Lynne. Innovations and Decentralized Energy Markets. Policy Paper. The Center for Growth and Opportunity at Utah State University, 2020. https://www.thecgo.org/research/innovations-and-decentralized-energy-markets/.
97 Eric Gimon, Mark Ahlstrom, and Mike O’Boyle, “Energy Parks: A New Strategy to Meet Rising Electricity Demand,” Energy Innovation, 2024, https://energyinnovation.org/wp-content/uploads/Energy-Parks-Report.pdf.
98 “FERC Directs Nation’s Largest Grid Operator to Create New Rules to Embrace Innovation and Protect Consumers,” Federal Energy Regulatory Commission, December 18, 2025, https://www.ferc.gov/news-events/news/fact-sheet-ferc-directs-nations-largest-grid-operator-create-new-rules-embrace?new=
99 “Secretary of Energy’s letter re the Interconnection of Large Loads Pursuant to the Secretary’s Authority under Section 403 of the Department of Energy Organization Act and Advance Notice of Proposed Rulemaking under RM26-4,” 2025, https://elibrary.ferc.gov/eLibrary/filelist?accession_number=20251027-4001&optimized=false&sid=4fd48919-3af9-4518-a67e-3d4d26486228
100 For instance, the R Street Institute’s initial comments on DOE’s ANOPR stressed that, before FERC expands federal authority over large-load interconnections, the Commission should focus first on core reliability and resource adequacy risks, warning that simply speeding up load hookups does not ensure there will be power available to serve them. The comments also cautioned that the ANOPR could trigger unnecessary federal “jurisdiction creep” and litigation risk if FERC moves too aggressively without demonstrating that existing tariffs are unjust or inadequate. R Street Institute, Initial Comments on the Interconnection of Large Loads to the Interstate Transmission System Filed Before the Federal Energy Regulatory Commission, November 21, 2025, https://www.rstreet.org/outreach/initial-comments-on-the-interconnection-of-large-loads-to-the-interstate-transmission-system-filed-before-the-federal-energy-regulatory-commission/
101 Gideon Powell, Josh T. Smith, “How AI data centers can support grid reliability in Texas and across the US,” Utility Dive, April 22, 2025, https://www.utilitydive.com/news/ai-data-centers-colocation-grid-reliability-interconnection-texas/745176/
102 Powell and Smith, “How AI data centers can support grid reliability,” Utility Dive, 2025; Arushi Sharma Frank, “Two Bills in Texas for Data Center Speed to Power,” April 7, 2025, https://www.linkedin.com/pulse/two-bills-texas-data-center-speed-power-guess-which-1zabe/?trackingId=LbhpzU1MQEusfhHB5u2wLQ%3D%3D
103 Andrew Levitt, Johannes Pfeifenberger, Aniruddh Mohan, and Serena Patel, “Proposed Options for Bilateral Integration of Generation Portfolios and Load (BIGPAL),” The Brattle Group, September 26, 2025, https://www.brattle.com/wp-content/uploads/2025/10/Proposed-Options-for-Bilateral-Integration-of-Generation-Portfolios-and-Load-BIGPAL.pdf
104 U.S. Energy Information Administration, “After more than a decade of little change, U.S. electricity consumption is rising again,” Today in Energy, May 13, 2025, https://www.eia.gov/todayinenergy/detail.php?id=65264
105 Robert Walton, “US electricity demand will grow 50% by 2050, electrical manufacturer study finds,” Utility Dive, April 7, 2025, https://www.utilitydive.com/news/us-electricity-demand-will-grow-50-by-2050-electrical-manufacturer-study/744575/
106 Andrew Satchwell et al., Electricity Rate Designs for Large Loads: Evolving Practices and Opportunities (Berkeley, CA: Lawrence Berkeley National Laboratory, January 2025), https://eta-publications.lbl.gov/sites/default/files/2025-01/electricity_rate_designs_for_large_loads_evolving_practices_and_opportunities_final.pdf
107 Federal Energy Regulatory Commission, “Interconnection of Large Loads to the Interstate Transmission System, Docket No. RM26-4-000,” accessed February 21, 2026, https://www.ferc.gov/rm26-4
108 Travis Fisher and Glen Lyons, The Case for Consumer-Regulated Electricity: Private Electricity Grids Offer a Parallel Path to Energy Abundance, Cato Institute Briefing Paper, February 3, 2026, https://www.cato.org/briefing-paper/case-consumer-regulated-electricity-private-electricity-grids-offer-parallel-path
109 Joule, “Powering the Future of AI and High-Performance Computing,” accessed February 21, 2026, https://joulepower.ai/
110 Ryan Quint, Jiecheng (Jeff) Zhao, and Kyle Thomas, An Assessment of Large Load Interconnection Risks in the Western Interconnection (Salt Lake City, UT: Western Electricity Coordinating Council, February 2025), https://www.wecc.org/sites/default/files/documents/products/2025/Report_WECC%20Large%20Loads%20Risk%20Assessment%204.pdf
111 Readers are encouraged to access additional case studies of policy “dos and don’ts” at the state level by accessing NTU’s AI and Data Center Toolkit archive at www.ntu.org.
112 U.S. Department of Energy, Grid Deployment Office, “Speed to Power Initiative,” November 4, 2025, https://www.energy.gov/gdo/articles/speed-power-initiative
113 Ethan Howland, “New FERC commissioners say connecting data centers is key priority,” Utility Dive, November 21, 2025, https://www.utilitydive.com/news/ferc-data-centers-swett-lacerte-lng/806145/
114 Senator Tom Cotton, “Cotton Introduces Bill to Lower Energy Costs for Arkansans,” press release, January 8, 2026, https://www.cotton.senate.gov/news/press-releases/cotton-introduces-bill-to-lower-energy-costs-for-arkansans
115 Senator Josh Hawley, “Hawley, Blumenthal Introduce Bill to Prevent Data Centers from Increasing Electricity Costs for Americans,” press release, February 12, 2026, https://www.hawley.senate.gov/hawley-blumenthal-introduce-bill-to-prevent-data-centers-from-increasing-electricity-costs-for-americans/
116 Michael Kuser, “Tech Companies Hedge Against Worrisome Grid Politics,” RTO Insider, December 9, 2025, https://www.rtoinsider.com/125218-tech-companies-hedge-against-worrisome-grid-politics/
117 National Conference of State Legislatures, “Policy Snapshot: Data Center Incentives,” updated November 17, 2025, https://www.ncsl.org/fiscal/policy-snapshot-data-center-incentives
118 Alex Fitzpatrick, “America’s Data Center Growth Hot Spots, Mapped,” Axios, December 18, 2025, https://www.axios.com/2025/12/18/data-center-growth-map-states
119 Pranshu Verma and Shelly Tan, “A Bottle of Water per Email: The Hidden Environmental Costs of Using AI Chatbots,” The Washington Post, September 18, 2024, https://www.washingtonpost.com/technology/2024/09/18/energy-ai-use-electricity-water-data-centers/
120 Felicia Chiang et al., “Evidence of Anthropogenic Impacts on Global Drought Frequency, Duration, and Intensity,” Nature Communications 12, no. 1 (2021): 1, https://doi.org/10.1038/s41467-021-22314-w
121 Junguo Liu et al., “Water Scarcity Assessments in the Past, Present, and Future: REVIEW ON WATER SCARCITY ASSESSMENT,” Earth’s Future 5, no. 6 (2017): 545–59, https://doi.org/10.1002/2016EF000518; Brian D. Richter et al., “Water Scarcity and Fish Imperilment Driven by Beef Production,” Nature Sustainability 3, no. 4 (2020): 319–28, https://doi.org/10.1038/s41893-020-0483-z
122 1 acre-foot (AF) = 1,233,000,000 ml; one water bottle = 500 ml, so 1 AF ≈ 2.47 million water bottles.
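The bottle-equivalent conversion in this note follows directly from the two constants given above; a minimal sketch of the arithmetic (variable names are illustrative only):

```python
# Convert one acre-foot (AF) of water into 500 ml bottle equivalents,
# using the constants stated in this endnote.
ML_PER_ACRE_FOOT = 1_233_000_000  # 1 AF expressed in milliliters
ML_PER_BOTTLE = 500               # one standard water bottle, in milliliters

bottles_per_af = ML_PER_ACRE_FOOT / ML_PER_BOTTLE
print(f"1 AF = {bottles_per_af:,.0f} water bottles")  # prints "1 AF = 2,466,000 water bottles"
```

That is, one acre-foot corresponds to roughly 2.47 million single-serving bottles.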
123 OpenET, “OpenET,” accessed February 24, 2026, https://etdata.org/.
124 Arman Shehabi et al., United States Data Center Energy Usage Report, LBNL-1005775 (Berkeley Lab, 2024), https://doi.org/10.2172/1372902
125 Brian D. Richter et al., “New Water Accounting Reveals Why the Colorado River No Longer Reaches the Sea,” Communications Earth & Environment 5, no. 1 (2024): 134, https://doi.org/10.1038/s43247-024-01291-0
126 Heather Cooley, California Agricultural Water Use: Key Background Information (Pacific Institute, 2015), https://pacinst.org/wp-content/uploads/2015/04/CA-Ag-Water-Use.pdf
127 Jerry L. Hatfield, “Water Requirements of an Alfalfa Crop in California,” University of California, Davis, 1975, https://alfalfasymposium.ucdavis.edu/+symposium/proceedings/1975/75-78.pdf
128 California Department of Food and Agriculture, California Agricultural Statistics Review 2023-2024, 2024.
129 This conservative estimate uses an average farm size and adjusts for the fact that not all crops being grown are alfalfa.
130 Brian Potter, “How Does the US Use Water?” Construction Physics, August 21, 2025, https://www.construction-physics.com/p/how-does-the-us-use-water.
131 Kevin Heslin, “Ignore Data Center Water Consumption at Your Own Peril,” Uptime Institute Blog, June 17, 2016, https://journal.uptimeinstitute.com/dont-ignore-water-consumption/
132 Heslin, “Ignore Data Center Water Consumption at Your Own Peril.”
133 Philip Womble and W. Michael Hanemann, “Legal Change and Water Market Transaction Costs in Colorado,” Water Resources Research 56, no. 4 (2020): e2019WR025508, https://doi.org/10.1029/2019WR025508; Philip Womble and W. Michael Hanemann, “Water Markets, Water Courts, and Transaction Costs in Colorado,” Water Resources Research 56, no. 4 (2020), https://doi.org/10.1029/2019WR025507
134 Gary Libecap, “Transaction Costs, Property Rights, and the Tools of the New Institutional Economics: Water Rights and Water Markets,” in New Institutional Economics, ed. Eric Brousseau and Jean-Michel Glachant (Cambridge University Press, 2008), https://doi.org/10.1017/CBO9780511754043.016; Gary Libecap, “Institutional Path Dependence in Climate Adaptation: Coman’s ‘Some Unsettled Problems of Irrigation,’” American Economic Review, no. 101 (February 2011): 64–80.
135 Nina B. Elkadi, “In the Arizona Desert, Where Your Neighbor Is an Alfalfa Farm,” Sentient Media, Climate, October 6, 2025, https://sentientmedia.org/arizona-desert-where-your-neighbor-is-an-alfalfa-farm/; Ian James, “Hay Grown for Cattle Consumes Nearly Half the Water Drawn from Colorado River, Study Finds,” Climate & Environment, Los Angeles Times, March 28, 2024, https://www.latimes.com/environment/story/2024-03-28/alfalfa-hay-beef-water-colorado-river
136 Michael Copley, “Data Centers, Backbone of the Digital Economy, Face Water Scarcity and Climate Risk,” Climate, NPR, August 30, 2022, https://www.npr.org/2022/08/30/1119938708/data-centers-backbone-of-the-digital-economy-face-water-scarcity-and-climate-ris
137 Peter Debaere and Tianshu Li, “Water Markets’ Promise: The Murray–Darling Basin,” Environmental Research Letters 17, no. 12 (2022): 125003, https://doi.org/10.1088/1748-9326/aca343; Sanchari Ghosh, “Droughts and Water Trading in the Western United States: Recent Economic Evidence,” International Journal of Water Resources Development 35, no. 1 (2019): 145–59, https://doi.org/10.1080/07900627.2017.1411252; Bryan Leonard et al., “Expanding Water Markets in the Western United States: Barriers and Lessons from Other Natural Resource Markets,” Review of Environmental Economics and Policy 13, no. 1 (2019): 43–61, https://doi.org/10.1093/reep/rey014
138 Leonardo Nicoletti et al., “The AI Boom Is Draining Water From the Areas That Need It Most,” Bloomberg, May 8, 2025, https://www.bloomberg.com/graphics/2025-ai-impacts-data-centers-water-data/; Felicity Barringer, “Thirsty for Power and Water, AI-Crunching Data Centers Sprout across the West,” Economic Development & the West, April 8, 2025, https://andthewest.stanford.edu/2025/thirsty-for-power-and-water-ai-crunching-data-centers-sprout-across-the-west/
139 Katherine Wright, “Utah’s Moonshot,” PERC, March 27, 2025, https://perc.org/2025/03/27/utahs-moonshot/
140 Peter W. Culp et al., Shopping for Water: How the Market Can Mitigate Water Shortages in the American West (Stanford Woods Institute for the Environment, 2014), 1–36.