Abstract

We augment the Kolmogorov Structure Function with energy cost and derive the concept of "Additive AI," in which machine learning models are created by traversing the Kolmogorov Structure Function from low model complexity to high while seeking models that achieve the Kolmogorov Minimum Sufficient Statistic at the least energy cost. In this way, the intersection of Algorithmic Information Theory (AIT) with Machine Learning (ML) can enable optimization of the "Entropy Economy," where the precious resource of entropy flow is managed to jointly optimize computation, energy, and learning. In this paper we lay out the Kolmogorov Learning Cycle as a framework for this joint optimization and demonstrate the energy-efficient machine learning algorithm Least Energy Usage Network (LEAN) as an example of how constraining complexity can reduce learning energy cost while maintaining performance. We motivate further directions for how AI models can be optimally learned, and discuss additional opportunities to choose where and when AI and machine learning models are created so as to maximize learning while minimizing energy (and consequently carbon) costs through the intersection of AIT and ML.
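As background, the traversal described above can be made concrete with the classical Kolmogorov Structure Function and its Minimum Sufficient Statistic; the energy term E(S) in the final line is an illustrative assumption added here to sketch an energy-aware selection among sufficient models, not the paper's exact augmented formulation.

h_x(\alpha) \;=\; \min_{S \ni x} \bigl\{\, \log_2 |S| \;:\; K(S) \le \alpha \,\bigr\}

\alpha^{*} \;=\; \min \bigl\{\, K(S) \;:\; x \in S,\; K(S) + \log_2 |S| \le K(x) + O(1) \,\bigr\}

S^{*} \;=\; \arg\min_{S} \bigl\{\, E(S) \;:\; x \in S,\; K(S) \le \alpha^{*},\; K(S) + \log_2 |S| \le K(x) + O(1) \,\bigr\}

Here h_x(\alpha) is the structure function of the data x, \alpha^{*} marks the complexity at which a model first becomes sufficient (the Minimum Sufficient Statistic), and S^{*} illustrates, under the stated assumption, choosing the least-energy model among those that are sufficient.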

