Sustainable Artificial Intelligence Systems: An Energy Efficiency Approach
  • Ignacio Hidalgo,
  • Francisco Fernández de Vega,
  • Josu Ceberio,
  • Oscar Garnica,
  • J. Manuel Velasco,
  • Juan Carlos Cortés,
  • Rafael Villanueva,
  • Josefa Díaz
Ignacio Hidalgo
Universidad Complutense de Madrid

Corresponding Author: [email protected]



The energy consumption of Artificial Intelligence (AI) systems has increased 300,000-fold since 2012, and the data centers running large-scale AI software account for 5-9% of global electricity demand and 2% of all CO2 emissions. This increase has been driven in part by the rapid development of new AI-specific architectures designed to improve the performance of AI models. Nevertheless, the AI community has recently become aware of the importance of considering energy efficiency as a metric when developing AI techniques. To date, great effort has been devoted to finding optimal AI model configurations that provide the best solution in the shortest possible time; however, only a few works have sought a compromise between energy cost and system performance. This paper analyses recent efforts in these directions and proposes a path toward energy-efficient AI. We describe a set of energy-efficiency strategies for applying and deploying AI models on different computing infrastructures, with the aim of democratizing an environmentally sustainable AI. To that end, we propose a full-stack approach to energy-efficient AI and analyze the role that different types of users should play, tackling energy-focused optimization at every step of the AI model design flow, from the high levels of model and algorithm design to the lower levels concerned with hardware and architecture.