Interview

Simon Eskildsen X timeline: energy costs near data centers up 267%, OpenAI's $2.5B cash burn, and AI bubble debate

Sep 30, 2025 with Simon Hørup Eskildsen

Key Points

  • Wholesale electricity prices near data centers have jumped 267% over five years, with JP Morgan attributing 70% of broader electricity cost increases to data center construction, creating real economic pressure on AI infrastructure expansion.
  • OpenAI burned $2.5 billion in cash on $4.3 billion in revenue in the first half of 2025, tracking toward $8.5 billion annual burn, but Eskildsen treats the rate as defensible given the scale of opportunity and real transactional revenue.
  • Nvidia is extending interest-free loans and guaranteeing token demand to smaller GPU buyers like Bitcoin miners and independent data center operators, diversifying revenue concentration among hyperscalers while betting others will operate the hardware if buyers can't.
Summary

Simon Eskildsen, founder of Turbopuffer, joined the show to work through several AI infrastructure and business model questions — from energy costs to fundraising discipline.

Energy costs near data centers

Wholesale electricity prices in areas near data centers have risen 267% over the past five years, according to Bloomberg, with those costs being passed directly to consumers. A JP Morgan analysis cited separately attributes 70% of broader electricity cost increases to data center construction. The public debate has focused on water usage, which Eskildsen treats as largely overblown, while the power consumption story is the one with real economic teeth. Eskildsen notes that fluctuating grid demand from AI workloads also degrades power quality in ways that accelerate wear on electronics — an underappreciated downstream cost. The silver lining, in his view, is that higher prices create market incentives to build more generation capacity, whether nuclear, hydro, or gas recovery. He points to neoclouds that originated as Bitcoin miners capturing stranded natural gas energy as an early example of that dynamic.

OpenAI's cash burn

OpenAI burned $2.5 billion in cash in the first half of 2025 on $4.3 billion in revenue, a figure 16% higher than its full-year 2024 revenue. The company is reportedly on track for $13 billion in full-year revenue and $8.5 billion in cash burn. Losses are concentrated in R&D and the cost of running ChatGPT, with a significant non-cash component from stock compensation. Eskildsen is not particularly alarmed by the burn rate given the scale of the agentic commerce opportunity and the company's demonstrated ability to raise capital. The dot-com comparison has limits here: OpenAI's revenue is real transactional spend, not eyeballs. The value-to-investment gap that took five to ten years to close after the dot-com crash may close faster this cycle, given the pace of model improvement.

On whether labs prioritize compute efficiency, Eskildsen is blunt: they don't. Researchers would absorb double the compute overnight if it were available. He does note that Google has the strongest structural incentive to optimize efficiency, since running frontier models across every search query demands exceptional price-performance — a constraint that has produced Gemini Flash and driven the Chinese labs to similar discipline.

AI bubble debate

Eskildsen declines to call the current environment a bubble or not, but his framing is useful. A bubble exists when the gap between value and investment is too wide. Whether that's true of AI today, he says, is genuinely hard to assess — there's no clean precedent. He runs Turbopuffer to be resilient regardless, which he treats as more honest than a confident macro call either way.

Nvidia's financing play

A Twitter analysis argues that an NVL72 rack generating 1.5 million tokens per second at $0.40 per million tokens, even with 70% annual price compression, produces roughly $26 million in revenue over three years against a $4 million rack cost — implying ~$22 million in net economics per rack. Eskildsen adds an unreported detail: Nvidia is allegedly extending interest-free loans to GPU buyers and in some cases guaranteeing token demand once capacity comes online. His read on the motive is customer concentration. Nvidia's revenue is heavily concentrated among a small number of hyperscalers, and financing smaller players — Bitcoin miners pivoting to AI, independent data center operators — is a way to diversify that base. Even if smaller buyers lack the operational sophistication to run inference at scale, the thesis is that someone else will step in to operate the hardware if they can't. The risk he identifies is operators who secure land, power, and GPUs but never build a usable end product.
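The thread's arithmetic checks out, and a minimal sketch makes it easy to vary the assumptions. All inputs below are the figures cited in the thread; the 70% annual price compression is modeled as the per-token price retaining 30% of its value each year:

```python
SECONDS_PER_YEAR = 365 * 24 * 3600  # ~31.5M seconds

tokens_per_sec = 1_500_000   # NVL72 rack throughput (per the thread)
price_per_million = 0.40     # USD per million tokens in year 1
rack_cost = 4_000_000        # USD
compression = 0.70           # annual price decline

# Sum revenue over three years, compressing the price each year
revenue = 0.0
for year in range(3):
    price = price_per_million * (1 - compression) ** year
    revenue += (tokens_per_sec / 1e6) * price * SECONDS_PER_YEAR

net = revenue - rack_cost
print(f"3-year revenue: ${revenue/1e6:.1f}M, net: ${net/1e6:.1f}M")
# → 3-year revenue: $26.3M, net: $22.3M
```

Year one alone contributes about $18.9M, so the thesis is front-loaded: most of the return arrives before price compression bites, which is also why the financing risk Eskildsen flags (operators who never ship a usable product) matters so much.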

AI as a feature, not a moat

The broader SaaS displacement narrative gets a skeptical treatment. Eskildsen sees AI as a feature set that gets layered into existing products rather than a clean reset of the software landscape. The competitive outcome depends heavily on company vintage: businesses founded 40-plus years ago face real structural risk; founders eight to ten years in with capital and customer relationships can likely bolt on AI and defend their position; new entrants can win only against genuinely sleepy incumbents.

The harder disruption case is billing model, not product. Bret Taylor's Sierra is the example: competing against a seat-based customer service platform by pricing on outcomes forces incumbents to renegotiate contracts, rebuild products, and risk losing customers who start shopping alternatives the moment the conversation opens. Stacking AI capability on top of a shifted business model, meaning outcome-based pricing and different unit economics, is where Eskildsen sees a real path to building a generational company. Either lever alone is insufficient.

Six reasons to raise

Eskildsen lays out a framework he uses when founders pitch him on fundraising:

  1. Fund R&D — the most defensible reason
  2. Fund growth — valid if there's a clear capital-to-revenue conversion
  3. Fund ego — momentum raising or "because I could"; common, irresponsible
  4. Fund employee liquidity
  5. Publicity and trust — the Sequoia halo effect; matters in some markets
  6. Strategic partnerships or investors

One solid reason is acceptable, two make a strong case, and three make a very strong one. His implicit point is that most founders raising right now are operating on reason three.