TechRxiv

The Carbon Footprint of Machine Learning Training Will Plateau, Then Shrink

preprint
posted on 13.04.2022, 05:11 by David Patterson, Joseph Gonzalez, Urs Hölzle, Quoc Hung Le, Chen Liang, Lluis-Miquel Munguia, Daniel Rothchild, David So, Maud Texier, Jeffrey Dean

Machine Learning (ML) workloads have rapidly grown in importance, but have also raised concerns about their carbon footprint. Four best practices can reduce ML training energy by up to 100x and CO2 emissions by up to 1000x. By following these best practices, overall ML energy use (across research, development, and production) held steady at <15% of Google’s total energy use for the past three years. If the whole ML field were to adopt best practices, total carbon emissions from training would decline. Hence, we recommend that ML papers report emissions explicitly to foster competition on more than just model quality. As estimates of emissions in papers that omitted them have been off by 100x–100,000x, publishing emissions has the added benefit of ensuring accurate accounting. Given the importance of climate change, we must get the numbers right to make certain that we work on its biggest challenges.

Email Address of Submitting Author

davidpatterson@google.com

ORCID of Submitting Author

0000-0002-0429-1015

Submitting Author's Institution

Google

Submitting Author's Country

United States of America
