Climate Impact Of AI

AI and machine learning have a climate impact from the energy used at every stage from development to deployment, as well as from the embodied energy of hardware, from GPUs to data centres. Efforts to quantify the CO2 impact of ML have been undertaken[1][2], and tools and recommendations for best practice have been created[3][4].
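
As a rough illustration of how such quantification works, the sketch below multiplies hardware power draw, training time, data-centre overhead (PUE) and grid carbon intensity. All figures (GPU power, PUE, grid intensity) are illustrative assumptions, not values taken from the cited tools.

```python
# Minimal sketch of a training-emissions estimate:
# emissions ~ hardware power x training time x data-centre overhead (PUE)
#             x carbon intensity of the local grid.
# All numbers are assumptions for illustration, not measurements.

def estimate_training_emissions_kg(
    gpu_power_watts: float,            # assumed average draw per GPU
    num_gpus: int,                     # number of GPUs used
    hours: float,                      # wall-clock training time
    pue: float = 1.5,                  # assumed data-centre power usage effectiveness
    grid_kg_co2_per_kwh: float = 0.4,  # assumed grid carbon intensity
) -> float:
    energy_kwh = gpu_power_watts * num_gpus * hours / 1000.0 * pue
    return energy_kwh * grid_kg_co2_per_kwh

# Example: 8 GPUs at 300 W each for one week of training.
print(estimate_training_emissions_kg(300, 8, 24 * 7))  # ~240 kg CO2e
```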


A thoughtful blog post from W&B on deep learning and carbon emissions: https://towardsdatascience.com/deep-learning-and-carbon-emissions-79723d5bc86e

Models

Many recent deep learning projects have used increasingly large amounts of compute to train their models[x, y, z].

The compute used in the largest model training runs was shown to have doubled every 3.4 months between 2012 and 2018[5].
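
To make the pace of this trend concrete, the arithmetic below converts the reported doubling time into an annual growth factor; it is a back-of-the-envelope calculation, not a figure taken from [5].

```python
# Back-of-the-envelope arithmetic for the trend reported in [5]:
# a doubling time of 3.4 months corresponds to roughly an order of
# magnitude more compute per year.

doubling_months = 3.4

doublings_per_year = 12 / doubling_months   # ~3.5 doublings per year
growth_per_year = 2 ** doublings_per_year   # ~11.5x per year

print(f"{doublings_per_year:.1f} doublings/year -> {growth_per_year:.1f}x per year")
```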

Training

Redundant training

Inference

Some estimates suggest that the large majority (80-90%) of the energy used by deep learning models is spent on inference rather than training[6][7].
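
The sketch below illustrates why inference can dominate lifetime energy use: a model is trained once but may serve queries continuously for months. The hardware sizes and durations are illustrative assumptions, not figures from [6] or [7].

```python
# Rough comparison of one-off training energy vs. continuous serving energy.
# All figures are illustrative assumptions.

def kwh(avg_power_watts: float, hours: float) -> float:
    return avg_power_watts * hours / 1000.0

# Assumed one-off training run: 8 GPUs at 300 W for two weeks.
training_kwh = kwh(300 * 8, 24 * 14)

# Assumed serving fleet: 4 GPUs at 200 W, running around the clock for a year.
inference_kwh = kwh(200 * 4, 24 * 365)

total = training_kwh + inference_kwh
print(f"inference share of lifetime energy: {inference_kwh / total:.0%}")  # ~90%
```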

References

  1. Lacoste, Alexandre; Luccioni, Alexandra; Schmidt, Victor; Dandres, Thomas (2019-10-21). "Quantifying the Carbon Emissions of Machine Learning". arXiv:1910.09700 [cs, stat].
  2. Henderson, Peter; Hu, Jieru; Romoff, Joshua; Brunskill, Emma; Jurafsky, Dan; Pineau, Joelle (2020-01-31). "Towards the Systematic Reporting of the Energy and Carbon Footprints of Machine Learning". arXiv:2002.05651 [cs].
  3. "ML CO2 Impact". Retrieved 2021-03-27.
  4. "Experiment Impact Tracker". 2021-03-27.
  5. "AI and Compute". OpenAI. 2018-05-16. Retrieved 2021-03-27.
  6. Moor Insights and Strategy. "Google Cloud Doubles Down On NVIDIA GPUs For Inference". Forbes. Retrieved 2021-03-27.
  7. Jassy, Andy. "AWS re:Invent 2018 - Keynote with Andy Jassy". YouTube. Retrieved 2021-03-27.