A freely distributed MIT Press textbook whose entire practical infrastructure runs on Google tools, written by a Google DeepMind researcher, and endorsed in part by Google-affiliated scientists: Kevin Patrick Murphy's "Probabilistic Machine Learning: An Introduction" is as close to canonical graduate ML reading as anything published in the last decade — and it's worth being clear about what that canonization looks like in practice.

Murphy published the book through MIT Press in March 2022. It's freely available as a PDF that he continues to revise under a CC BY-NC-ND license; the most recent revision is dated April 2025. The book covers probability theory, Bayesian inference, statistical decision theory, information theory, and optimization, and explicitly bridges those classical foundations to modern deep learning architectures. Murphy frames it as a substantial expansion of his influential 2012 "Machine Learning: A Probabilistic Perspective."
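To give a flavor of the Bayesian machinery the book builds on, here is a minimal sketch of conjugate Beta-Bernoulli posterior updating, one of the simplest exact-inference examples in that tradition. This is an illustrative sketch written for this article, not code from the book's notebooks (those are implemented in JAX and TensorFlow and run in Colab):

```python
def beta_bernoulli_update(alpha, beta, observations):
    """Conjugate update: given a Beta(alpha, beta) prior on the success
    probability of a Bernoulli variable, observing binary outcomes yields
    a Beta posterior with the counts simply added to the prior parameters."""
    successes = sum(observations)
    failures = len(observations) - successes
    return alpha + successes, beta + failures

def posterior_mean(alpha, beta):
    """Mean of a Beta(alpha, beta) distribution."""
    return alpha / (alpha + beta)

# Start from a uniform Beta(1, 1) prior, observe 7 successes and 3 failures.
a, b = beta_bernoulli_update(1, 1, [1] * 7 + [0] * 3)
print((a, b))               # (8, 4)
print(posterior_mean(a, b))  # 8/12 ≈ 0.667
```

The point of the example is the one the book makes at length: with a conjugate prior, Bayesian updating reduces to bookkeeping on sufficient statistics, and the same probabilistic framing then scales up, with approximate inference, to modern deep models.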

The code integration is a genuine differentiator. Nearly every figure links directly to a Google Colab notebook, with implementations in Python using JAX and TensorFlow. Readers can reproduce experiments and visualizations from the PDF without any local setup. Supplementary code for concepts not tied to specific figures is maintained at code.probml.ai, and Murphy has kept the codebase open to contributions since publication.

Endorsements come from prominent names across the field. Geoff Hinton of the University of Toronto and Google praised its grounding of deep learning in classical statistical principles. Chris Bishop of Microsoft Research called it a "must-have" for anyone seeking deep understanding of ML. Daphne Koller of insitro described it as her recommended text for newcomers wanting a comprehensive view of core principles. Tom Dietterich of Oregon State University put it plainly: "what every ML PhD student should know."

The Google thread running through the book — DeepMind author, JAX and TensorFlow implementations, Colab hosting, Google-affiliated endorsers — isn't incidental. For researchers trained on Murphy's textbook, Google's tooling stack is the default substrate for hands-on ML from day one. That's durable ecosystem influence, whatever the intent behind it. A companion volume, "Probabilistic Machine Learning: Advanced Topics," extends the framework into cutting-edge methods using the same infrastructure.