For decades, optical computing has been the punchline of photonics conferences: "five years away" since 1985. The physics worked beautifully in lab conditions, but Mach-Zehnder interferometers (MZIs), the devices that use light interference to perform calculations, were too thermally sensitive for real data centers. A degree or two of temperature drift would corrupt results, and keeping them stable used so much power that it wiped out the efficiency gains you were chasing.

That calculus is shifting, because AI inference no longer needs high precision. Modern neural networks run fine on 4-8 bit operations, down from 32-bit floating point, and MZI thermal drift introduces small errors that don't matter when the computation is already approximate. Recent work has shown photonic neural networks on MZI meshes matching the accuracy of electronic chips.

Engineers have also stopped treating thermal sensitivity as fundamental. Athermal designs using mixed waveguide materials cut wavelength drift by 24x. Integrated heaters with thin-film resistors keep emission stable within 0.5 picometres over 50 hours. A "thermal undercut" technique in standard 300 mm CMOS foundries improved tuning efficiency fivefold. Economic pressure compounds all of this: AI's energy consumption is forcing the industry to look for GPU alternatives.
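The precision argument is easy to check numerically. Below is a minimal sketch, assuming a textbook beamsplitter-phase-beamsplitter MZI model and an illustrative milliradian-scale thermal phase error (both are assumptions for illustration, not measurements from any specific chip): the drift is visible at full floating-point precision but falls well below one quantization step of a 4-bit readout.

```python
import numpy as np

def mzi(theta, phi):
    """2x2 transfer matrix of an MZI: 50:50 beamsplitter, internal
    phase theta, second beamsplitter, external phase phi (textbook model)."""
    bs = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)
    return np.diag([np.exp(1j * phi), 1]) @ bs @ np.diag([np.exp(1j * theta), 1]) @ bs

def quantize(x, bits=4):
    """Uniform quantization of values in [-1, 1] to `bits` bits."""
    levels = 2 ** (bits - 1)
    return np.round(np.clip(x, -1, 1) * levels) / levels

rng = np.random.default_rng(0)
x = rng.uniform(-0.5, 0.5, size=2)      # input signal encoded in optical amplitudes

ideal   = mzi(0.7, 0.3) @ x             # intended operation
drift   = 1e-3                          # assumed thermal phase error, in radians
drifted = mzi(0.7 + drift, 0.3) @ x     # same MZI after temperature drift

err_full  = np.max(np.abs(ideal - drifted))          # nonzero at full precision
err_quant = np.max(np.abs(quantize(ideal.real) - quantize(drifted.real)))

print(f"analog error: {err_full:.2e}, error after 4-bit readout: {err_quant}")
```

With 4-bit readout the least significant bit corresponds to a step of 0.125, orders of magnitude above the analog error the phase drift induces, so the drifted and ideal outputs land in the same (or at worst adjacent) quantization bins.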

The startup landscape is already heating up. Lightmatter has raised over $400 million to build 3D-stacked photonic chips using MZI-like meshes for matrix multiplication. Celestial AI and Ayar Labs are focusing on optical interconnects first, tackling the data movement bottleneck rather than replacing transistors entirely. LightOn, a French company that went public on Euronext Growth, uses diffractive optics to sidestep the thermal tuning headaches of dense MZI meshes. Optalysys in the UK is pursuing Fourier optics and free-space processing.
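The linear algebra behind "MZI meshes for matrix multiplication" is worth seeing concretely. A mesh of 2x2 MZIs can realize a unitary matrix, and an arbitrary weight matrix factors via the singular value decomposition into two unitaries and a diagonal scaling, i.e., three physical stages. The NumPy sketch below abstracts each mesh as its unitary matrix; it is an illustration of the decomposition, not any vendor's actual architecture.

```python
import numpy as np

# Any real weight matrix W factors as U @ diag(s) @ Vh (SVD).
# U and Vh are unitary, so each can be implemented as a mesh of 2x2 MZIs;
# diag(s) is a row of per-channel attenuators/amplifiers between the meshes.
rng = np.random.default_rng(1)
W = rng.standard_normal((4, 4))

U, s, Vh = np.linalg.svd(W)

x = rng.standard_normal(4)           # input vector, encoded in optical amplitudes
y_photonic = U @ (s * (Vh @ x))      # mesh -> attenuators -> mesh
y_direct = W @ x                     # the electronic reference result

print(np.max(np.abs(y_photonic - y_direct)))  # agreement up to float rounding
```

The appeal is that once the phases are set, the two matrix-vector products happen passively as light propagates through the meshes.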

The hardware physics might finally be solved, or close enough. Everything else is harder. GPUs have CUDA and a massive developer ecosystem. Photonic startups need developers to adopt unproven compilers and toolchains to map neural networks onto analog hardware. Building that ecosystem from scratch takes years.