
Thermodynamic Computing for Energy-Efficient AI

Jan 30, 2026

New physics-based hardware uses natural noise to cut the cost of machine learning.
A coupling pattern between representative hidden units and a visible layer, taken from an independent dynamical trajectory of Whitelam’s trained denoising thermodynamic computer (source: Nicole Millman; Stephen Whitelam).


Thermodynamic computing is an emerging computing paradigm aimed at making artificial intelligence far more energy efficient by tapping physical processes that conventional chips normally suppress. Researchers and startups such as Normal Computing are exploring systems that use thermal fluctuations and noise as computational resources rather than fighting them, a departure from digital logic where noise is an enemy of accuracy. This approach could dramatically reduce the energy cost of AI tasks such as image generation and probabilistic computation.

At the heart of the idea is a shift from deterministic digital computing to physics-based circuits that settle into equilibrium states representing solutions. In one prototype described in a recent IEEE Spectrum article, linked resonators and couplers are initialized with random noise; as the system naturally evolves, the final configuration encodes the answer to a programmed problem. Studies have shown that this method can implement training steps of neural-network-style models and generate simple images with far less energy than current digital accelerators require.
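The relaxation-to-equilibrium idea can be sketched in simulation: start the system in a random, noisy state, then let noisy gradient dynamics carry it toward a low-energy configuration that encodes the answer. The quadratic energy function, annealing schedule, and parameters below are illustrative stand-ins, not a model of any actual hardware.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "programmed problem": minimize the quadratic energy
# E(x) = 0.5 * x^T A x - b^T x, whose minimum is x* = A^{-1} b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])   # coupling matrix (illustrative)
b = np.array([1.0, -1.0])

# Overdamped Langevin dynamics: dx = -grad E(x) dt + sqrt(2*T) dW.
# Thermal noise jiggles the state; as the temperature T is annealed
# toward zero, the system settles into the energy minimum.
x = rng.normal(size=2)                   # initialize with random noise
dt, steps = 0.01, 20000
for k in range(steps):
    T = 1.0 * (1 - k / steps)            # simple linear annealing schedule
    grad = A @ x - b
    x += -grad * dt + np.sqrt(2 * T * dt) * rng.normal(size=2)

print(x)                    # settles close to the exact solution
print(np.linalg.solve(A, b))
```

The point of the toy is that no explicit matrix solve happens anywhere in the loop: the answer is simply where the noisy dynamics come to rest.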

Foundational research published in Nature Communications and related work detail hardware called a stochastic processing unit (SPU) that uses RLC circuits to perform the sampling and matrix tasks central to machine learning. Simulations indicate that training and sampling, the core operations in generative AI, could be done with much lower energy if the hardware scales.
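The kind of Gaussian sampling such hardware targets can be mimicked in software: run noisy linear dynamics whose stationary distribution is the desired multivariate Gaussian, and read out samples by observing the fluctuating state rather than by computing a matrix inverse. The precision matrix below is invented for illustration; the actual SPU realizes analogous dynamics in analog RLC circuits.

```python
import numpy as np

rng = np.random.default_rng(1)

# Target: samples from N(0, Sigma) with Sigma = P^{-1} for a precision
# matrix P. The noisy dynamics dx = -P x dt + sqrt(2) dW have exactly
# this Gaussian as their stationary distribution.
P = np.array([[2.0, 0.5], [0.5, 1.0]])   # precision matrix (illustrative)

dt, burn_in, n_samples, thin = 0.01, 5000, 20000, 10
x = rng.normal(size=2)
samples = []
for k in range(burn_in + n_samples * thin):
    x += -(P @ x) * dt + np.sqrt(2 * dt) * rng.normal(size=2)
    if k >= burn_in and (k - burn_in) % thin == 0:
        samples.append(x.copy())

emp_cov = np.cov(np.array(samples).T)
print(emp_cov)              # approaches Sigma = inv(P)
print(np.linalg.inv(P))
```

The empirical covariance of the trajectory converges to the inverse of the precision matrix, which is why the same physical process can serve both sampling and matrix-inversion workloads.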

Despite the promise, thermodynamic computers today remain early-stage. Prototypes are rudimentary and not yet competitive with mainstream systems in performance or general-purpose use. Experts emphasize that significant engineering challenges remain before such hardware can handle the depth and complexity of models such as modern diffusion networks.

Yet the potential is large: by matching physical dynamics to probabilistic computation, thermodynamic computing could help data centers and AI infrastructure dramatically cut power consumption at a time when energy costs and climate impacts are a central concern.