Public backlash toward data centers is escalating. Activist groups gather to voice concerns, and in some cities, council members are being voted out of office for approving data center construction. Resistance has turned extreme: an Indianapolis councilman received 13 bullet holes through his front door, and OpenAI CEO Sam Altman’s San Francisco home was targeted with a Molotov cocktail and gunfire.
More data centers have been rejected or canceled in the first quarter of 2026 alone than in all of 2025. Communities express concerns regarding noise levels, potential tax breaks, water usage, the advancement of artificial intelligence (AI), and electricity demand. Each issue deserves its own conversation. But on the energy front, growing evidence undercuts the claim that data centers drive up electric bills; in some cases, data centers are lowering rates.
It is true that data centers—especially those powering AI—consume massive amounts of energy and are largely driving the rapid surge in electricity demand. Small and medium-sized centers draw 1-5 megawatts (MW) of power; hyperscale centers operated by Big Tech companies like Microsoft and Google generally draw 20-100 MW, enough to power an entire city.
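To put that 100 MW figure in rough perspective, a quick back-of-the-envelope calculation (the household figure is an assumption based on the commonly cited U.S. average of roughly 10,700 kWh per year, not a number from this article) suggests one hyperscale campus draws as much power as tens of thousands of homes:

```python
# Rough scale check. Assumption (not from the article): the average U.S.
# household uses about 10,700 kWh of electricity per year.
AVG_HOME_KWH_PER_YEAR = 10_700
HOURS_PER_YEAR = 8760

# Average continuous draw of one home, in kW (~1.2 kW).
avg_home_kw = AVG_HOME_KWH_PER_YEAR / HOURS_PER_YEAR

# A 100 MW hyperscale campus = 100,000 kW of continuous demand.
homes_powered = 100_000 / avg_home_kw

print(f"~{homes_powered:,.0f} homes")  # on the order of 80,000 homes
```

That is city-scale demand, which is why a single siting decision can dominate a region's load forecast.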
One might intuitively infer that adding large loads means higher electricity rates. Politicians and media outlets certainly push that narrative. But the assumption is misplaced and unfounded.
Virginia hosts an astounding 663 data centers, 13% of the world’s total, and carries 70% of global internet traffic. Yet electricity rates remain stable.
Roughly 25% of the Commonwealth’s electricity consumption comes from data center customers, but the 15.94 cents/kilowatt hour (kWh) average residential rate is below the nation’s average of 17.24 cents/kWh. Dominion Energy maintains that recent modest rate increases are “largely attributed to inflationary pressure, not the demand of data centers.”
Even a Virginia state-commissioned report found residential ratepayers were not subsidizing costs for larger users, though it warns that future demand will require significant new generation and transmission.
Texas, coming in second with 405 data centers, also maintains an average electricity rate below the national average: 16.05 cents/kWh.
A February Charles River Associates study found that U.S. retail electricity rates have generally tracked inflation and are driven primarily by operating expenses, not data center development, insulating most consumers from data-center-related cost increases.
Lawrence Berkeley National Lab (LBNL) published a study in December finding that data centers are not the primary cause of rising electricity rates. Instead, data centers can lower electricity rates by spreading high fixed grid infrastructure costs across more customers.
Indeed, several states are experiencing lower rates amidst bulging demand.
Data center growth has helped even California. The Golden State may be known for the highest electricity rates in the contiguous U.S., but one of its major power companies, Pacific Gas & Electric (PG&E), has reduced rates 11% since 2024, attributing the cuts to data center growth. Other factors keep rates high, but increased demand applies downward pressure. PG&E asserts it can lower electric bills by roughly one percent for each gigawatt of new load.
In North Dakota, utilities that saw the largest load growth over the last few years also experienced the largest average price reductions. LBNL attributes such gains to the spreading of fixed costs and an abundance of low-priced energy.
Ironically, northeastern states host far fewer data centers yet have experienced the largest electricity rate hikes. State policies and a lack of generation capacity are the main culprits, not accelerating load growth. Because utilities there do not own their generation, they are exposed to higher wholesale electricity prices. Ambitious net-zero policies, such as renewable portfolio standards, participation in the Regional Greenhouse Gas Initiative, and blocking natural gas pipelines, are also major contributors.
The primary mechanism behind falling electricity prices amid data center expansion, seen in multiple states, is utilities’ ability to distribute the high fixed costs of building and maintaining grid infrastructure across a higher total volume of electricity sold. As demand rises, those expenses spread across more kilowatt-hours, lowering the cost per unit.
Data centers add a large, steady, predictable load. Consistent round-the-clock consumption at high volumes sells more kilowatt-hours, bringing the utility more stable revenue and a lower cost per unit of electricity. Grids also operate more efficiently because demand spikes become smaller relative to overall demand, reducing prices for everyone.
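The fixed-cost-spreading arithmetic can be illustrated with a minimal sketch. All figures here are hypothetical, chosen only to show the mechanism, not drawn from any utility cited above:

```python
# Minimal sketch of fixed-cost spreading, using hypothetical numbers.
FIXED_COSTS = 1.0e9    # $/year for poles, wires, substations (assumed)
VARIABLE_COST = 0.06   # $/kWh for fuel and purchased power (assumed)

def avg_rate(annual_kwh: float) -> float:
    """Average rate = variable cost plus fixed costs spread over all sales."""
    return VARIABLE_COST + FIXED_COSTS / annual_kwh

# Existing sales: 10 billion kWh/year.
before = avg_rate(10e9)

# A 100 MW data center running around the clock adds
# 100,000 kW * 8760 h = 876 million kWh/year of steady new load.
after = avg_rate(10e9 + 100_000 * 8760)

print(f"before: {before * 100:.2f} cents/kWh")  # 16.00 cents/kWh
print(f"after:  {after * 100:.2f} cents/kWh")   # ~15.19 cents/kWh
```

The fixed costs do not change; they are simply divided over more kilowatt-hours, so the average rate falls for every customer. The effect reverses only if serving the new load requires proportionally large new fixed investment, which is why the Virginia report's warning about future generation and transmission needs matters.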
Some Big Tech companies are opting to go behind the meter and generate their own power, which prudently sidesteps ongoing energy controversies and brings their facilities online faster. However, more open and honest discussions are needed among state leaders, lawmakers, and their constituents about data centers, energy policy, and electricity rates.
Every major emerging technology has required more energy and infrastructure. Everything digital runs through data centers, making them foundational, not optional. Questions should center on how to undertake them responsibly rather than how to eliminate them.
Kristen Walker is Senior Policy Analyst and Manager for Energy and Transportation with the American Consumer Institute, a nonprofit education and research organization. For more information about the Institute, visit www.theamericanconsumer.org or follow us on X @ConsumerPal