Greggman had a problem. His VR video player broke in January when Meta deprecated some APIs, and his WebXR replacement didn't pan out. So he turned to Claude and built a new player in Rust, a language he'd never used, with OpenXR and wgpu, frameworks he had zero experience with. The result is Equirect, a working VR video player. Claude wrote almost all of it.

Over roughly 20 hours of supervision, Greggman guided Claude through architecture decisions and caught questionable code patterns, a workflow that mirrors the one Lalit Maganti used to ship his project in just three months. One example stands out: Claude tried to fix a UI bug by applying the same patch in seven different places instead of solving it once at the source.

When Claude initially built a sphere mesh for video projection, Greggman pushed back. Why use a mesh when you can handle projection entirely in the shader? Claude adjusted and got it working.
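The shader-only approach replaces sphere geometry with per-pixel math: render a fullscreen triangle and, for each fragment, convert the view-ray direction into equirectangular texture coordinates. Here is a minimal sketch of that mapping in Rust; the function name and axis conventions are illustrative assumptions, not Equirect's actual code, and in the real player this math would live in the WGSL fragment shader rather than on the CPU.

```rust
use std::f32::consts::PI;

/// Map a normalized view-ray direction to equirectangular UV coordinates.
/// Convention assumed here: -Z is forward, +Y is up; u wraps horizontally
/// (longitude) and v runs top-to-bottom (latitude).
fn dir_to_equirect_uv(d: [f32; 3]) -> (f32, f32) {
    let u = 0.5 + d[0].atan2(-d[2]) / (2.0 * PI); // longitude -> [0, 1)
    let v = 0.5 - d[1].asin() / PI;               // latitude  -> [0, 1]
    (u, v)
}

fn main() {
    // Looking straight ahead samples the center of the video frame.
    let (u, v) = dir_to_equirect_uv([0.0, 0.0, -1.0]);
    println!("{u} {v}"); // prints "0.5 0.5"
}
```

Doing this per fragment gives every pixel an exact mapping instead of UVs interpolated across sphere triangles, which sidesteps tessellation artifacts near the poles and removes the mesh entirely.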

What stands out here: someone with zero Rust, OpenXR, or wgpu knowledge shipped a functional VR application by acting as architect and code reviewer. Claude figured out the genuinely hard parts, low-level wgpu-to-OpenXR interop and hardware-accelerated video decoding, on its own. The player handles local files and network URLs, includes playback controls, and runs cross-platform via WebGPU.

This is where AI-assisted development sits today. Not full autonomy, but a workflow where humans provide direction and taste while AI handles the implementation grunt work. Greggman may extract the video-to-GPU components into a separate Rust crate. The code is MIT-licensed on GitHub.