Alibaba's Qwen team just dropped Qwen3.6-35B-A3B, an open-weight model built for agentic coding. The 35B-parameter model handles complex coding loops in languages like Rust and Elixir, responding to compiler errors on its own and iterating toward working solutions without hand-holding. Early feedback from developers on Hacker News is positive.
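That compile-fix loop is simple to picture in code. Here's a minimal sketch, with the compiler and model calls passed in as plain functions; `ask_model` is a hypothetical stand-in for whatever client you point at the model (nothing below is Qwen-specific).

```python
import pathlib
import subprocess
import tempfile

def compile_rust(src: str) -> tuple[bool, str]:
    """Compile a Rust source string with rustc; return (ok, compiler stderr)."""
    with tempfile.TemporaryDirectory() as d:
        main = pathlib.Path(d) / "main.rs"
        main.write_text(src)
        result = subprocess.run(
            ["rustc", str(main), "-o", str(pathlib.Path(d) / "main")],
            capture_output=True, text=True,
        )
        return result.returncode == 0, result.stderr

def fix_loop(src, compile_fn, ask_model, max_iters=5):
    """Compile; on failure, feed the errors back to the model for a new attempt."""
    for _ in range(max_iters):
        ok, errors = compile_fn(src)
        if ok:
            return src
        # ask_model(src, errors) -> revised source; in practice this would be
        # a request to a locally served model (e.g. an OpenAI-compatible endpoint)
        src = ask_model(src, errors)
    return None  # gave up after max_iters attempts
```

Pairing `fix_loop` with `compile_rust` gives the whole loop; swapping `compile_fn` for `mix compile` or `cargo check` changes the language without touching the loop itself.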
What stands out is that this release happened at all. The Qwen team has been gutted by internal restructuring, reportedly driven by tension between GPU costs and the revenue the team brings in. Lead researcher Junyang Lin departed during the upheaval and is now CTO at 01.AI, Kai-Fu Lee's startup. Hacker News commenters describe the unit as kneecapped, yet it's still shipping open models. That's surprising.
The model doesn't beat the top commercial options: it trails Anthropic's Claude Sonnet 4.5 in overall capability. But for smaller tasks and local agentic workflows, it holds its own, and that matters for developers who want to run a model on their own hardware. No API costs for every compile-error loop. No rate limits.
The bigger question is what happens next. Speculation about the Qwen line's future is swirling: will it stay open-weight, or will Alibaba push the team toward enterprise monetization instead? Qwen3.6-35B-A3B is available now. Whether there's ever a Qwen4 is anyone's guess.