Abstract

We augment the Kolmogorov Structure Function with energy cost and derive the concept of “Additive AI,” in which machine learning models are created by traversing the Kolmogorov Structure Function from low model complexity to high while seeking models that achieve the Kolmogorov Minimum Sufficient Statistic at the least energy cost. In this way, the intersection of Algorithmic Information Theory (AIT) with Machine Learning (ML) can enable optimization of the “Entropy Economy,” in which the precious resource of entropy flow is managed to jointly optimize computation, energy, and learning. In this paper we lay out the Kolmogorov Learning Cycle as a framework for this joint optimization and demonstrate the energy-efficient machine learning algorithm Least Energy Usage Network (LEAN) as an example of how restraining complexity can reduce learning energy cost while maintaining performance. We motivate further directions for how AI models can be optimally learned, and discuss additional opportunities to optimize where and when AI and machine learning models are created so as to maximize learning while minimizing energy (and consequently carbon) costs through the intersection of AIT and ML.
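As background for the complexity traversal described above, recall the standard Kolmogorov Structure Function; the energy-augmented variant introduced in the paper is not spelled out in this abstract, so the energy term below is only an illustrative assumption:

    h_x(α) = min { log |S| : x ∈ S, K(S) ≤ α },

where K(S) is the Kolmogorov complexity of a finite set (model) S containing the data x. The Kolmogorov Minimum Sufficient Statistic is attained at the smallest α* for which h_x(α*) + α* ≈ K(x), i.e., the simplest model that captures all the regularity in x. A hedged sketch of an energy-augmented selection rule, with E(S) denoting an assumed energy cost of learning or evaluating the model S, is

    S* = argmin { E(S) : x ∈ S, K(S) ≤ α*, log |S| = h_x(α*) }.

Under this reading, “Additive AI” corresponds to sweeping α upward from low complexity toward α* and stopping at the first, least-energy model that attains sufficiency.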
