Thermodynamic Bound on Energy and Negentropy Costs of Inference in Deep Neural Networks

Abstract

A fundamental thermodynamic bound is derived for the energy cost of inference in Deep Neural Networks (DNNs). By applying Landauer's principle, we demonstrate that the linear operations in DNNs can, in principle, be performed reversibly, whereas the non-linear activation functions impose an unavoidable energy cost. The resulting theoretical lower bound on the inference energy is determined by the average number of neurons undergoing a state transition during each inference. We also restate the thermodynamic bound in terms of negentropy, a metric more universal than energy for assessing the thermodynamic cost of information processing. The concept of negentropy is further elaborated in the context of information processing in biological and engineered systems, as well as in human intelligence. Our analysis provides insight into the physical limits of DNN efficiency and suggests potential directions for developing energy-efficient AI architectures that leverage reversible analog computing.

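To make the scale of such a bound concrete, here is a minimal numerical sketch (not taken from the paper) that assumes the lower bound has the Landauer form E_min = ⟨N_transitions⟩ · k_B · T · ln 2, i.e. one bit's worth of erasure for each neuron that changes state during an inference pass; the transition count used below is purely illustrative.

```python
import math

# Physical constants / assumptions for the estimate
K_B = 1.380649e-23   # Boltzmann constant, J/K
T_ROOM = 300.0       # assumed operating temperature, K

def landauer_bound_joules(avg_transitions: float, temperature: float = T_ROOM) -> float:
    """Minimum dissipated energy per inference, assuming each neuron state
    transition costs at least k_B * T * ln(2) (Landauer's principle)."""
    return avg_transitions * K_B * temperature * math.log(2)

# Hypothetical example: a network in which ~1e9 neurons flip state per inference.
print(f"{landauer_bound_joules(1e9):.2e} J per inference")  # ~2.9e-12 J
```

Even for a billion state transitions, the resulting bound (a few picojoules) is many orders of magnitude below the energy consumed by current hardware, which is the gap the reversible-computing directions discussed above aim to exploit.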
@article{tkachenko2025_2503.09980,
  title={Thermodynamic Bound on Energy and Negentropy Costs of Inference in Deep Neural Networks},
  author={Alexei V. Tkachenko},
  journal={arXiv preprint arXiv:2503.09980},
  year={2025}
}