The US Energy Information Administration says electricity demand will hit record highs in 2026 and 2027. AI and the data centers it requires are driving the surge. For years, US power consumption was flat. Now it's climbing because training and running large language models takes enormous amounts of electricity.

Tech giants are scrambling for power. Microsoft signed a 20-year deal with Constellation Energy to restart Three Mile Island Unit 1, bringing 837 megawatts online by 2028. Amazon partnered with Energy Northwest for four small modular reactors in Washington. Google cut a deal with Kairos Power. Oracle's Larry Ellison says the company is designing data centers powered by three small nuclear reactors.

There's a catch, though. Small modular reactors exist mostly on paper right now. None are operating commercially in the US, and the first Kairos reactor won't deliver power until at least 2030.

Nuclear appeals because AI needs power around the clock, and wind and solar can't guarantee that. But the timelines being thrown around are optimistic at best. Power infrastructure takes years to build; chips get faster every 18 months. Some data center projects are already on hold, waiting for grid capacity. That scaling mismatch could throttle AI development faster than any limitation in the models themselves.

For startups, this is a real problem. The big players can sign 20-year nuclear deals and wait. Everyone else competes for whatever grid capacity is left. If power becomes the binding constraint on AI, the winners won't be the teams with the best models. They'll be the ones who secured the electrons.