Stefan Kruger presented "Dyalog and AI" at DYNA Fall 2025, Dyalog's annual conference for APL users and developers, arguing for a place for the language in modern AI development. Dyalog has not published a transcript or written summary, so specific claims from the presentation aren't available for this report; what follows is the verifiable context.

The technical argument for APL in AI is real. Matrix multiplication, broadcasting, and reduction are native idioms in APL, and the same operations sit at the core of neural network training and inference. Dyalog APL, a commercial implementation of the language, handles them with terse symbolic notation that collapses into single expressions what Python spreads across multiple NumPy calls. The math aligns. The ecosystem doesn't.
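To make the comparison concrete, here is a minimal NumPy sketch of the three primitives named above in a single dense-layer forward pass. All variable names and shapes are invented for illustration; the point is that each step maps to one APL primitive (the inner product `+.×` for the matrix multiply, scalar extension for the broadcast bias add, and `+/` or `⌈/`-style reductions for the aggregate), whereas Python expresses them as separate library calls.

```python
import numpy as np

# Hypothetical dense-layer forward pass: matmul, broadcast, reduction.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 3))   # batch of 4 inputs, 3 features each
W = rng.standard_normal((3, 2))   # weights: 3 features -> 2 units
b = np.zeros(2)                   # bias, broadcast across the batch

Z = X @ W + b                     # matrix multiply + broadcast add
A = np.maximum(Z, 0.0)            # elementwise ReLU
loss = float(np.mean(A))          # reduction over all elements

print(Z.shape, A.shape, loss)
```

In Dyalog APL the same pipeline would read roughly `(+/÷≢),0⌈b+X+.×W` under equivalent array definitions; the sketch is illustrative, not a claim about the talk's own examples.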

Python owns the AI stack. PyTorch, JAX, and the infrastructure built around them are where research happens, where engineers are trained, and where models ship. APL's array-language relatives — J, BQN, Q — share the same computational strengths and face the same wall. Technical alignment with the underlying math hasn't produced adoption.

Dyalog's decision to put AI at the centre of DYNA Fall 2025 reflects that pressure. The company has roughly 40 years of APL implementation behind it, with a user base concentrated in finance and government — sectors that have been slower to reach for the Python ML stack than tech companies. Putting "Dyalog and AI" on the conference schedule is an acknowledgment that the question can't be deferred.

The full talk is on Dyalog's YouTube channel.