Abstract
The growing use of AI in cloud computing has raised significant concerns about its environmental impact, particularly in terms of energy consumption and carbon emissions. This review paper provides a comprehensive analysis of energy consumption trends in AI, with a particular focus on inference costs in both cloud and edge computing scenarios. By consolidating data from recent research, this paper presents a nuanced view of energy consumption trends, distinguishing between cutting-edge models and those in general use. The findings reveal that while state-of-the-art AI models show exponential growth in energy consumption, average models demonstrate more stable or even decreasing energy use, largely due to improvements in hardware efficiency and algorithmic innovations. The review also explores potential solutions to mitigate AI's environmental impact, including advanced hardware designs, energy-efficient algorithms, and novel data acquisition techniques.