
A Hard Energy Use Limit of Artificial Superintelligence
  • Klaus Stiefel, Neurolinx Institute
  • Jay S. Coggan

Corresponding Author:[email protected]


Abstract

We argue that the high energy use by present-day semiconductor computing technology will prevent the emergence of an artificial intelligence system which could reasonably be described as a “superintelligence”.
This hard limit on artificial superintelligence (ASI) follows from the energy requirements of a system that is more intelligent than the human brain, yet orders of magnitude less energy-efficient. Furthermore, an ASI would have to supersede not just a single human brain but a large community of humans, and hence expend many times the energy needed to replicate the capability of a single brain.
A hypothetical ASI would therefore likely consume enormous amounts of energy, possibly orders of magnitude more than is available to industrialized society, making it impossible on energetic grounds alone. We estimate the energy use of an ASI in excess of that of a human brain with an equation we term the "Erasi equation", for the Energy Requirement for Artificial SuperIntelligence.
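To make the scale of this claim concrete, the minimal back-of-the-envelope sketch below multiplies the power draw of one human brain by an assumed silicon-versus-brain efficiency gap and an assumed community-size factor. The specific factors (10^5 and 10^6) and the world-electricity figure are illustrative assumptions for demonstration only, not the Erasi equation or its actual parameter values.

```python
# Illustrative back-of-the-envelope estimate only; the factors below are
# placeholder assumptions, not the paper's Erasi equation or fitted values.

BRAIN_POWER_W = 20.0           # approximate power draw of one human brain (~20 W)
EFFICIENCY_GAP = 1e5           # assumed: semiconductor hardware uses ~10^5x more
                               # energy per brain-equivalent of computation
COMMUNITY_SIZE = 1e6           # assumed: ASI must exceed a community of ~10^6 brains
WORLD_ELECTRIC_POWER_W = 3e12  # average world electricity generation, roughly ~3 TW

# Naive multiplicative estimate of ASI power demand
asi_power_w = BRAIN_POWER_W * EFFICIENCY_GAP * COMMUNITY_SIZE

print(f"Estimated ASI power draw: {asi_power_w:.2e} W")
print(f"Fraction of world electricity supply: {asi_power_w / WORLD_ELECTRIC_POWER_W:.1f}x")
```

Even with these conservative placeholder factors, the estimate lands at roughly the scale of the entire world electricity supply; larger efficiency gaps or community sizes push it orders of magnitude beyond it.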
An additional challenge is the current developmental trajectory of AI research, the majority of which is not focused on the creation of superintelligent systems. A technology as sophisticated as the hypothesized ASI is unlikely to emerge by chance from scattered efforts.
Taken together, these arguments suggest that the emergence of an ASI is highly unlikely, if not impossible, in the foreseeable future based on current computer architectures, primarily due to energy constraints.