AI's electricity problem is real — and could make average Americans hate AI
Oct 2, 2025
Key Points
- US household electricity bills rose roughly 19% between 2021 and 2024, with JP Morgan attributing 70% of last year's increases to data center demand for AI infrastructure.
- Data centers now consume roughly 40% of Virginia's electricity and are concentrated in a handful of states, creating a political vulnerability as homeowners subsidize AI infrastructure through higher rates.
- OpenAI's Sam Altman projects needing 250 gigawatts of capacity by 2033, roughly half of current total US generation; only an actual buildout of nuclear and solar capacity prevents the price spikes that could trigger a public backlash.
Summary
AI's electricity consumption is becoming a tangible threat to public sentiment toward the technology, and it is already showing up in the cultural conversation.
A September post on X from user Chipotle Rowan drew 300,000 likes by complaining that his electric bill had gone up 25% so other people could generate AI videos. The math is close: between 2021 and 2024, the average US household electricity bill rose from $1,452 to $1,728, an increase of roughly 19%. From 2010 to 2020, electricity prices held steady at roughly 12 cents per kilowatt-hour; after 2020, as AI investment ramped up, prices began climbing steadily. JP Morgan estimated that 70% of last year's electricity cost increases came from data center demand.
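For readers who want to check the arithmetic themselves, here is a minimal sketch in Python using only the bill figures cited above (the variable names are illustrative, not from any dataset):

```python
# Sketch: recompute the household bill increase from the figures cited in this piece.
bill_2021 = 1452  # average annual US household electricity bill, 2021 (USD)
bill_2024 = 1728  # average annual US household electricity bill, 2024 (USD)

increase = (bill_2024 - bill_2021) / bill_2021
print(f"2021-2024 increase: {increase:.1%}")  # -> "2021-2024 increase: 19.0%"
```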
Consumers have not yet weighed AI's utility against the cost it is imposing on their bills, and that could change quickly once they do. The social media industry survived a similar reckoning; critics still use social platforms even while opposing them. AI-generated content carries different baggage: AI video generation feels frivolous in a way that social feeds do not. If people perceive that they are subsidizing expensive AI output while their power bills spike, the backlash could accelerate faster than it did for social media. The closer comparison may be NFTs, where public tolerance of blockchain collapsed once celebrities started hawking million-dollar monkey pictures.
Data center capacity buildout
Data centers consumed 4.5% of US electricity in 2023 and are on track to hit 6% within a few years. By late 2024, 6.5 gigawatts of new capacity was underway, an addition of roughly 1.3% to the roughly 500 gigawatts the US generates on average. Plans now exist for up to 100 gigawatts of buildout over the next few years, which would be roughly a 20% increase in total US capacity, though most of it remains at the letter-of-intent stage.
Sam Altman has outlined plans for OpenAI to reach 250 gigawatts of capacity by 2033. US generation averaged about 485 gigawatts in 2023, so Altman is describing a need equal to roughly half of total current US power output. The only way to avoid that demand crushing existing grids and pushing prices skyward is to actually build the infrastructure he is describing, not borrow it from the grid.
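To make the scale concrete, a small sketch comparing the capacity figures quoted in this article against average US generation (all numbers and labels come from the text above, not an external dataset; "US generation" means the average rate of generation, not nameplate capacity):

```python
# Sketch: put the capacity figures in this piece side by side against average US generation.
US_AVG_GENERATION_GW = 485  # average US generation in 2023, per the text

projects = {
    "new capacity underway, late 2024": 6.5,
    "planned buildout, next few years": 100,
    "OpenAI target for 2033": 250,
}

for label, gw in projects.items():
    share = gw / US_AVG_GENERATION_GW
    print(f"{label}: {gw} GW ({share:.1%} of average US generation)")
# -> roughly 1.3%, 20.6%, and 51.5% respectively
```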
Utility companies are already pushing back. Georgia Power now requires $600 million upfront deposits, with monthly withdrawals, against 10-year power agreements for 500-megawatt requests. Alabama Power mandates that customers commit to using 90% of the power they request. These barriers are a response to companies requesting massive capacity they do not intend to use, treating access to power as a strategic asset.
Geographic concentration and political risk
Data centers went from consuming less than 5% of Virginia's electricity in 2010 to roughly 40% in 2025. The pattern holds in Texas, Mississippi, and Alabama. Nationally, data centers are still at 4-5%, but concentrated buildout means certain states are already feeling the pain. Single-family homeowners and small businesses in those states are effectively subsidizing data center infrastructure costs through higher electricity rates, and that becomes a political time bomb if the concentration continues.
Two paths forward exist. One is that the tech industry, the energy sector, and AI companies actually build the nuclear and solar capacity they are planning. If AI demand follows a sigmoid curve rather than exponential growth, America overbuilds capacity and electricity prices fall, a positive outcome echoing the post-dotcom internet overbuild that created cheap broadband for later tech companies. The other is that algorithmic efficiency improvements, akin to Ethereum's shift from proof-of-work to proof-of-stake, make AI cheaper to run. Ethereum's efficiency gains hurt Nvidia's GPU sales in 2022 because blockchain companies needed fewer chips.
What is missing from both scenarios is consumer awareness that households are already paying AI's electricity bill. Once that becomes mainstream knowledge, the margin for error shrinks.