Textbooks on Active Inference and Bayesian Mechanics
My textbook, Fundamentals of Active Inference, will be available in March 2026, published by The MIT Press. The supplemental material associated with this textbook, including Jupyter notebooks and proofs for all equations in the book, is currently in preparation and will be released after the book’s publication. I am currently working on a second textbook, Fundamentals of Bayesian Mechanics. The textbooks are written for an engineering audience – machine learning engineers and applied engineers – interested in applying active inference in the real world.
Collectively, these books aim to provide a technical introduction to the research areas of Active Inference, the Free Energy Principle, and Bayesian Mechanics. The mathematical ideas within these fields have the potential to revolutionize many industries and to advance theoretical AI research beyond what deep learning and large language models have achieved in the last decade. My intention for this work is to condense over 20 years of progress and literature in the field and to bring wider recognition and interest from machine learning researchers and engineers in academia and industry.
The textbooks are aimed at an upper undergraduate/early graduate level and aim to be fairly self-contained, requiring little knowledge of neuroscience and the related literature while focusing heavily on building intuition, understanding the mathematics, and implementation in Python code. The prerequisites are basic Python programming, probability theory, multivariate calculus, basic linear algebra, and high-school physics – topics usually covered in undergraduate natural science and engineering degrees.
Background
Active Inference is a fast-growing field within computational neuroscience that integrates ideas from machine learning, Bayesian statistics, physics, and other fields to mathematically describe the brain and human/animal behavior. The original ideas were conceived by the neuroscientist Dr. Karl Friston, who has worked with a number of collaborators over the last two decades to expand the scope and applicability of the theory to the brain and to living systems more generally. Active Inference has been successfully applied to many different areas of neuroscience and represents the culmination of decades of prior research. It provides an interdisciplinary lens on a broad range of fields, including robotics, economics, cybernetics/control theory, machine learning, and more.
The Free Energy Principle is a broader, more theoretical physics-based framework, also created by Friston. It provides the theoretical and philosophical scaffold behind Active Inference, together with a mathematical modeling framework known as Bayesian Mechanics, for modeling living systems that undergo “self-organization”. That is, such systems adapt to changes in their surroundings to maintain their structure over time. Under the Free Energy Principle, biological or artificial organisms are said to follow the imperative of minimizing a statistical quantity known as variational free energy (the negative evidence lower bound), which serves as a universal objective function in approximate Bayesian inference. Organisms that successfully minimize this quantity are the ones able to (actively) infer the current (and future) states of their environments and thus restrict themselves to preferable states conducive to their survival.
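To make the idea of variational free energy concrete, here is a minimal sketch for a discrete generative model with two hidden states and two observations. The numbers and variable names are illustrative, not taken from the books: the sketch only demonstrates the standard identity that free energy upper-bounds surprise (negative log evidence), with equality when the belief equals the exact Bayesian posterior.

```python
import numpy as np

# Illustrative two-state generative model (values are made up for this sketch).
prior = np.array([0.5, 0.5])          # p(s): prior over hidden states
likelihood = np.array([[0.9, 0.1],    # p(o|s): rows = observations, cols = states
                       [0.1, 0.9]])

def variational_free_energy(q, o, prior, likelihood):
    """F = E_q[ln q(s) - ln p(o, s)], i.e. the negative evidence lower bound."""
    ln_joint = np.log(likelihood[o]) + np.log(prior)  # ln p(o, s) for each state
    return np.sum(q * (np.log(q) - ln_joint))

o = 0  # an observed outcome

# The exact posterior p(s|o), computed by Bayes' rule, minimizes F.
joint = likelihood[o] * prior
posterior = joint / joint.sum()
F_min = variational_free_energy(posterior, o, prior, likelihood)

# At the minimum, F equals the surprise -ln p(o) exactly.
surprise = -np.log(joint.sum())

# Any other belief q incurs a higher free energy (F bounds surprise from above).
q_other = np.array([0.5, 0.5])
F_other = variational_free_energy(q_other, o, prior, likelihood)
```

Running this, `F_min` coincides with `surprise`, while `F_other` exceeds it; minimizing F over beliefs q is therefore a tractable stand-in for minimizing surprise directly.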
Supplemental Materials
Although the supplemental materials are not yet available, you can click the links below to learn about the content I have planned. Supplemental materials will be added after the book’s publication.