The advent of artificial intelligence systems, and of generative models such as ChatGPT in particular, has created yet another domain of heavy computational demand, which in turn consumes large amounts of energy and water. By current estimates, a single ChatGPT interaction requires about 2.9 watt-hours of electricity, roughly ten times the 0.3 watt-hours consumed by an ordinary Google search. This disparity is a call to action to improve the energy and water efficiency of AI systems and thereby reduce their associated carbon emissions. This paper examines the energy-efficiency patterns of AI language tools such as chatbots in comparison with conventional means of searching the internet, such as Google, and considers how the environmental impact of AI systems can be reduced. To this end, green cloud computing methods are proposed as solutions that can be combined with clean-energy principles, including advanced cooling systems and the optimization of AI models. Finally, more efficient use of resources could also play a crucial part in ultimately reducing the adverse effects of the AI industry on the environment.