Climate Impact Of AI


AI and machine learning have a climate impact through the energy used at every stage from development to deployment, as well as the embodied energy in hardware, from GPUs to data centres. Most emissions come from the CO2 released in generating the electricity that powers AI workloads, so the carbon intensity of any particular project depends heavily on where it runs and the electricity mix it consumes.

Several efforts have been made to quantify the CO2 impact of ML[1][2], and tools and best-practice recommendations have been published[3][4].
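
A common form of this estimate multiplies hardware power draw by runtime, a data-centre overhead factor (PUE) and the carbon intensity of the local grid. The minimal Python sketch below illustrates that arithmetic; the GPU count, power draw, PUE and grid intensity are illustrative assumptions, not figures taken from the cited tools.

 def training_emissions_kg(gpu_count: int,
                           gpu_power_watts: float,
                           hours: float,
                           pue: float,
                           grid_intensity_g_per_kwh: float) -> float:
     """Rough CO2e estimate for a single training run, in kilograms."""
     energy_kwh = gpu_count * gpu_power_watts * hours / 1000   # W*h -> kWh
     energy_kwh *= pue                                         # data-centre overhead (cooling, power delivery)
     return energy_kwh * grid_intensity_g_per_kwh / 1000       # gCO2 -> kgCO2
 
 # Example: 8 GPUs drawing ~300 W each for one week, PUE 1.1, on a ~450 gCO2/kWh grid.
 print(round(training_emissions_kg(8, 300, 24 * 7, 1.1, 450), 1), "kg CO2e")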

Deep Learning

Models and training

Many recent deep learning projects have used increasingly large amounts of compute to train their models.

The compute used in the largest model training runs was estimated to have doubled every 3.4 months between 2012 and 2018[5].
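
To put that doubling time in perspective, the compound growth it implies can be computed directly, as in the short sketch below. The per-period factors are simply the mathematical implication of a 3.4-month doubling time, not figures reported in the cited analysis.

 doubling_months = 3.4   # doubling time reported for the largest training runs
 for years in (1, 3, 6):
     factor = 2 ** (years * 12 / doubling_months)
     print(f"{years} year(s): roughly {factor:,.0f}x more compute")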

Inference

Some estimates suggest that the vast majority (80-90%) of the energy used by deep learning models is spent on inference rather than training[6][7].

Efficiency gains

Redundant training

Experiment tracking[8] (see the sketch below)

Smart hyperparameter selection
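
As a concrete illustration of the experiment tracking item above, the sketch below uses the codecarbon package, chosen here as one example of this class of tool (an assumption on our part; the cited Experiment Impact Tracker exposes a different API). It records an estimated energy use and emissions figure for a single run.

 from codecarbon import EmissionsTracker
 
 def train_model():
     # Placeholder for the real training loop.
     return sum(i * i for i in range(10_000_000))
 
 tracker = EmissionsTracker(project_name="example-training-run")  # writes emissions.csv by default
 tracker.start()
 try:
     train_model()
 finally:
     emissions_kg = tracker.stop()   # returns the estimated kg CO2e for the run
 print(f"Estimated emissions: {emissions_kg:.6f} kg CO2e")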

Data Centres

Data centres accounted for almost 1% of global electricity demand in 2019[9], at around 200 TWh, and while demand for computing continues to grow, efficiency gains mean this figure may remain roughly flat in the near term[10]. AI's total electricity use can therefore be estimated as a fraction of this figure. AI may itself offer efficiency gains for data centres by optimising control systems[11].

Estimates of electricity use of AI

Cloud comparison

The major cloud computing providers, Amazon, Google and Microsoft, have varying targets and carbon intensities for their services.

Google now publishes hourly estimates of the proportion of carbon-free energy (CFE) and the carbon intensity for all its cloud regions[12].
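
In principle this data allows workloads to be placed in lower-carbon regions. The sketch below shows one way to rank regions from such a dataset; the file name and column headers are assumptions for illustration and may not match the published format.

 import csv
 
 def greenest_regions(path: str, top_n: int = 3):
     """Return the top_n regions with the lowest grid carbon intensity."""
     with open(path, newline="") as f:
         rows = list(csv.DictReader(f))
     rows.sort(key=lambda r: float(r["grid_co2_g_per_kwh"]))
     return [(r["region"], r["cfe_percent"], r["grid_co2_g_per_kwh"]) for r in rows[:top_n]]
 
 print(greenest_regions("gcp_region_carbon.csv"))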

Energy buying

Targets

AWS: 100% renewable energy for its data centres by 2030, on track to reach this by 2025[13].

Google: 24/7 carbon-free energy by 2030, matching its electricity use with carbon-free generation in real time rather than buying renewable generation certificates[14].

Microsoft: carbon negative by 2030[15].

Processing Units

GPUs

Development of other chip types: TPUs, IPUs, etc.

References

  1. Lacoste, Alexandre; Luccioni, Alexandra; Schmidt, Victor; Dandres, Thomas (2019-10-21). "Quantifying the Carbon Emissions of Machine Learning". arXiv:1910.09700 [cs, stat].
  2. Henderson, Peter; Hu, Jieru; Romoff, Joshua; Brunskill, Emma; Jurafsky, Dan; Pineau, Joelle (2020-01-31). "Towards the Systematic Reporting of the Energy and Carbon Footprints of Machine Learning". arXiv:2002.05651 [cs].
  3. "ML CO2 Impact". Retrieved 2021-03-27.
  4. "Experiment Impact Tracker". 2021-03-27.
  5. "AI and Compute". OpenAI. 2018-05-16. Retrieved 2021-03-27.
  6. Moor Insights and Strategy. "Google Cloud Doubles Down On NVIDIA GPUs For Inference". Forbes. Retrieved 2021-03-27.
  7. Jassy, Andy. "AWS re:Invent 2018 - Keynote with Andy Jassy". YouTube. Retrieved 2021-03-27.
  8. Biewald, Lukas (2019-06-24). "Deep Learning and Carbon Emissions". Medium. Retrieved 2021-03-27.
  9. "Data Centres and Data Transmission Networks – Analysis". IEA. Retrieved 2021-03-27.
  10. Masanet, Eric; Shehabi, Arman; Lei, Nuoa; Smith, Sarah; Koomey, Jonathan (2020-02-28). "Recalibrating global data center energy-use estimates". Science. 367 (6481): 984–986. doi:10.1126/science.aba3758. ISSN 0036-8075. PMID 32108103.
  11. "DeepMind AI reduces energy used for cooling Google data centers by 40%". Google. 2016-07-20. Retrieved 2021-03-27.
  12. "Carbon free energy for Google Cloud regions". Google Cloud. Retrieved 2021-03-27.
  13. "Amazon becomes the world's largest corporate purchaser of renewable energy". UK Day One Blog. 2020-12-10. Retrieved 2021-03-27.
  14. "Google Cloud aims for carbon-free energy for its data centers". Google Cloud Blog. Retrieved 2021-03-27.
  15. "Microsoft will be carbon negative by 2030". The Official Microsoft Blog. 2020-01-16. Retrieved 2021-03-27.