UsedBy.ai
Trend Analysis · 3 min read
Published: February 5, 2026

The Economics of Comma.ai’s On-Premise AI Training


Marcus Webb
Senior Backend Analyst

The Pitch

Comma.ai operates a $5M on-premise data center to train models for Openpilot, a mature robotics OS supporting over 300 vehicle models (GitHub). The company argues that owning hardware forces engineers to solve fundamental constraints of Watts and FLOPs rather than masking inefficiencies with cloud budgets (Comma.ai Blog).

Under the Hood

Comma.ai spent $540,112 on power alone in 2025, drawing roughly 450 kW at peak to sustain its training workloads (Comma.ai Blog). While hyperscalers like AWS and Azure currently charge 4x to 10x more for raw compute than bare-metal alternatives, on-premise hardware carries distinct geographic liabilities (UMH Learning Center).
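Those two figures can be sanity-checked with a back-of-envelope sketch. The flat tariff, continuous operation, and absence of demand charges below are simplifying assumptions on my part, not disclosed details:

```python
# Back-of-envelope reconciliation of Comma.ai's published power numbers.
# Assumptions (mine, not the article's): flat tariff, 24/7 operation,
# no time-of-use pricing or demand charges.
PEAK_KW = 450            # peak draw cited by Comma.ai
ANNUAL_SPEND = 540_112   # 2025 power bill, USD
SD_RATE = 0.40           # "San Diego power tax" rate, USD/kWh

hours = 24 * 365                      # hours in a (non-leap) year
max_kwh = PEAK_KW * hours             # energy if pinned at peak all year
cost_at_peak = max_kwh * SD_RATE      # worst-case bill at 40c/kWh

# Two ways to reconcile the actual bill with the cited peak and rate:
implied_avg_load = ANNUAL_SPEND / (SD_RATE * hours)  # avg kW, if rate is 40c
implied_rate = ANNUAL_SPEND / max_kwh                # USD/kWh, if always at peak

print(f"Energy at peak draw:  {max_kwh:,.0f} kWh")
print(f"Bill at 40c/kWh:      ${cost_at_peak:,.0f}")
print(f"Implied average load: {implied_avg_load:.0f} kW "
      f"({implied_avg_load / PEAK_KW:.0%} of peak)")
print(f"Implied blended rate: ${implied_rate:.3f}/kWh")
```

Either the cluster averages only about a third of its peak draw, or Comma.ai is paying a blended rate well below the headline 40c/kWh (likely some of both); the $540k bill is consistent with the 450 kW figure either way.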

The primary bottleneck is the "San Diego power tax": electricity there costs over 40c/kWh, roughly three times the global average (HN / inewsource). This is aggravated by local political dysfunction and impending California legislation. Senate Bills 886 and 887, introduced in early 2026, may force data centers to pay additional tariffs to support the regional grid (UsedBy Dossier).

There is also the risk of "stranded compute" due to infrastructure lag. In major markets, power hookups for new hardware can take five to seven years, leaving expensive GPUs idle despite being fully paid for (YouTube Analysis). We don’t know yet what their exact hardware inventory looks like or if they have successfully transitioned to post-Blackwell architectures.

Details regarding Comma.ai's plan to "produce our own power" remain missing from public documentation (UsedBy Dossier). Managing multi-location resilience and network stability is frequently cited by the community as a prohibitive overhead for most AI startups (HN).

Marcus's Take

Owning the stack is a viable strategy only if your training workloads are as predictable as a heartbeat. Comma.ai's success is tied to Openpilot’s maturity; they aren't experimenting with model architectures every week, they are refining a specific robotics pipeline. For anyone else, building a data center in a region with 40c/kWh power is just a very expensive way to keep your office warm. Unless you have 100% GPU saturation and a 10-year horizon, stick to bare-metal providers like Hetzner to avoid the "San Diego tax."
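Marcus's rule of thumb can be made concrete with a toy break-even model. Every price in it ($30k per GPU, $2/hr rental, 1 kW draw, 5-year straight-line amortization) is a hypothetical placeholder I chose for illustration, not a real quote:

```python
# Toy rent-vs-own model. All prices are hypothetical placeholders
# chosen for illustration; only the 40c/kWh rate comes from the article.
def annual_cost_cloud(gpu_hours: float, rate_per_hour: float) -> float:
    """Renting: you pay only for the hours you actually use."""
    return gpu_hours * rate_per_hour

def annual_cost_owned(capex: float, years: int, power_kw: float,
                      rate_per_kwh: float, utilization: float) -> float:
    """Owning: capex amortizes whether the GPU runs or idles; power scales with use."""
    used_hours = 24 * 365 * utilization
    return capex / years + power_kw * used_hours * rate_per_kwh

# Hypothetical single-GPU scenario at San Diego power prices.
for util in (0.25, 0.50, 1.00):
    used_hours = 24 * 365 * util
    cloud = annual_cost_cloud(used_hours, 2.00)
    owned = annual_cost_owned(30_000, 5, 1.0, 0.40, util)
    print(f"util={util:.0%}: cloud=${cloud:>7,.0f}  owned=${owned:>7,.0f}")
```

Under these made-up numbers the crossover sits a little above 40% utilization: below that, renting wins; at full saturation, owning wins even at 40c/kWh. The shape of the curve, not the exact figures, is the point, and it matches the "100% GPU saturation" caveat above.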


Ship clean code,
Marcus.

