Climate Impact of AI

AI and machine learning have a climate impact from the energy used at all stages from development to deployment, as well as from the embodied energy in hardware, from GPUs to data centres. The majority of emissions come from CO2 released by the electricity generation that powers AI workloads. The carbon intensity of any particular project is highly dependent on its location and the electricity mix that it consumes.
 
 
Efforts to quantify the CO2 impact of ML have been undertaken<ref name=":0">{{Cite journal|last=Lacoste|first=Alexandre|last2=Luccioni|first2=Alexandra|last3=Schmidt|first3=Victor |last4=Dandres|first4=Thomas|date=2019-10-21|title=Quantifying the Carbon Emissions of Machine Learning|url=https://arxiv.org/abs/1910.09700|journal=arXiv:1910.09700 [cs, stat]}}</ref><ref>{{Cite journal|last=Henderson|first=Peter|last2=Hu|first2=Jieru|last3=Romoff|first3=Joshua|last4=Brunskill|first4=Emma|last5=Jurafsky|first5=Dan|last6=Pineau|first6=Joelle|date=2020-01-31|title=Towards the Systematic Reporting of the Energy and Carbon Footprints of Machine Learning|url=http://arxiv.org/abs/2002.05651|journal=arXiv:2002.05651 [cs]}}</ref>, and tools and recommendations for best practice have been created<ref name=":1">{{Cite web|last=|first=|date=|title=ML CO2 Impact|url=https://mlco2.github.io/impact/|url-status=live|archive-url=|archive-date=|access-date=2021-03-27|website=}}</ref><ref>{{Cite web|url=https://github.com/Breakend/experiment-impact-tracker|title=Experiment Impact Tracker|date=2021-03-27|website=|url-status=live|archive-url=|archive-date=|access-date=}}</ref>.
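The core estimate behind calculators such as ML CO2 Impact can be sketched as energy used multiplied by the carbon intensity of the local grid. The function and figures below are illustrative assumptions, not values taken from any cited tool.

```python
# Hedged sketch of the basic estimate these calculators perform:
# emissions = energy (kWh) x data-centre overhead (PUE) x grid carbon intensity.
# All numbers below are illustrative, not measurements.

def training_emissions_kg(gpu_power_watts: float,
                          hours: float,
                          num_gpus: int,
                          pue: float,
                          grid_intensity_kg_per_kwh: float) -> float:
    """Estimate CO2 emissions (kg) for a training run.

    pue: Power Usage Effectiveness, the data-centre overhead multiplier.
    grid_intensity_kg_per_kwh: kg of CO2 emitted per kWh of electricity.
    """
    energy_kwh = gpu_power_watts * num_gpus * hours / 1000.0
    return energy_kwh * pue * grid_intensity_kg_per_kwh

# Example: 8 GPUs at 300 W for 100 hours, PUE 1.1, on a 0.4 kg CO2/kWh grid.
print(round(training_emissions_kg(300, 100, 8, 1.1, 0.4), 1))  # 105.6
```

The location term dominates in practice: the same run on a low-carbon grid (around 0.05 kg CO2/kWh) emits roughly an eighth as much as on a coal-heavy one.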
   
 
== Machine Learning Application Areas ==
 
=== Models and training ===

Many recent projects in deep learning have used increasingly large computational resources to create models. The compute used for the largest model training runs was shown to have doubled every 3.4 months from 2012 to 2018<ref>{{Cite web|url=https://openai.com/blog/ai-and-compute/|title=AI and Compute|date=2018-05-16|website=OpenAI|language=en|access-date=2021-03-27}}</ref>.
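The growth implied by a 3.4-month doubling time compounds quickly; a minimal sketch of the arithmetic:

```python
# Compounding a fixed doubling time, as in OpenAI's "AI and Compute"
# analysis of 2012-2018 training runs. Purely arithmetic, no measured data.

def compute_growth_factor(months: float, doubling_months: float = 3.4) -> float:
    """Growth in training compute after `months`, given a fixed doubling time."""
    return 2.0 ** (months / doubling_months)

print(compute_growth_factor(3.4))           # 2.0 (one doubling)
print(round(compute_growth_factor(12), 1))  # ~11.5x in a single year
```

For comparison, Moore's law (doubling roughly every 18-24 months) gives only about a 1.5x annual growth, which is why this trend made energy use a first-order concern.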
=== Inference ===
 
Some estimates show the vast majority of energy (80-90%) used by deep learning models is spent on inference<ref>{{Cite web|url=https://www.forbes.com/sites/moorinsights/2019/05/09/google-cloud-doubles-down-on-nvidia-gpus-for-inference/|title=Google Cloud Doubles Down On NVIDIA GPUs For Inference|last=Strategy|first=Moor Insights and|website=Forbes|language=en|access-date=2021-03-27}}</ref><ref>{{Cite web|url=https://youtu.be/ZOIkOnW640A?t=5327|title=AWS re:Invent 2018 - Keynote with Andy Jassy|last=Jassy|first=Andy|date=|website=YouTube|url-status=live|archive-url=|archive-date=|access-date=2021-03-27}}</ref>.
=== Efficiency gains ===
Several practices can reduce wasted energy:

* '''Avoiding redundant training''': re-using existing trained models and checkpoints rather than repeating runs from scratch.
* '''Experiment tracking''': logging runs and their configurations so that experiments are not unknowingly repeated<ref>{{Cite web|url=https://towardsdatascience.com/deep-learning-and-carbon-emissions-79723d5bc86e|title=Deep Learning and Carbon Emissions|last=Biewald|first=Lukas|date=2019-06-24|website=Medium|language=en|access-date=2021-03-27}}</ref>.
* '''Smart hyperparameter selection''': using efficient search strategies rather than exhaustive grid search.
 
== Data Centres ==
 
Data centres accounted for almost 1% of global electricity demand in 2019<ref>{{Cite web|url=https://www.iea.org/reports/data-centres-and-data-transmission-networks|title=Data Centres and Data Transmission Networks – Analysis|website=IEA|language=en-GB|access-date=2021-03-27}}</ref>, at around 200 TWh, and while demand is increasing, efficiency gains mean total consumption may stay flat for now<ref>{{Cite journal|last=Masanet|first=Eric|last2=Shehabi|first2=Arman|last3=Lei|first3=Nuoa|last4=Smith|first4=Sarah|last5=Koomey|first5=Jonathan|date=2020-02-28|title=Recalibrating global data center energy-use estimates|url=https://science.sciencemag.org/content/367/6481/984|journal=Science|language=en|volume=367|issue=6481|pages=984–986|doi=10.1126/science.aba3758|issn=0036-8075|pmid=32108103}}</ref>. AI's current impact can be estimated as a fraction of this total, though it is growing quickly. AI may itself offer efficiency gains for data centres by optimising control systems<ref>{{Cite web|url=https://blog.google/outreach-initiatives/environment/deepmind-ai-reduces-energy-used-for/|title=DeepMind AI reduces energy used for cooling Google data centers by 40%|date=2016-07-20|website=Google|language=en|access-date=2021-03-27}}</ref>.
 
 
Estimates of the total electricity use of AI vary widely depending on methodology and scope.
 
=== Cloud comparison ===

The major cloud computing providers, Amazon, Google and Microsoft, have varying targets and carbon intensities for their services. Google now publishes hourly estimates of the proportion of carbon-free energy (CFE) and the carbon intensity for all its cloud regions<ref>{{Cite web|url=https://cloud.google.com/sustainability/region-carbon|title=Carbon free energy for Google Cloud regions|website=Google Cloud|language=en|access-date=2021-03-27}}</ref>.
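Published per-region carbon data makes region choice a simple optimisation. The snippet below is illustrative only: the region names and gCO2/kWh figures are made-up placeholders, not values from Google's published CFE data.

```python
# Illustrative only: pick the cloud region with the lowest carbon intensity.
# Region names and gCO2eq/kWh figures are assumed placeholders, not real data.

region_intensity = {
    "region-a": 120,   # gCO2eq per kWh (assumed)
    "region-b": 450,
    "region-c": 80,
}

# Choose the region whose electricity mix emits the least CO2 per kWh.
greenest = min(region_intensity, key=region_intensity.get)
print(greenest)  # region-c
```

For batch workloads that are not latency-sensitive, the same lookup can be repeated per scheduling window, since carbon intensity varies by hour as well as by region.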
* '''Energy buying'''
 
* '''Targets''': Several cloud providers have published CO2 emission targets:
 
** AWS, 100% renewable energy for its data centers by 2030, on track for 2025<ref>{{Cite web|url=https://blog.aboutamazon.co.uk/sustainability/amazon-becomes-the-worlds-largest-corporate-purchaser-of-renewable-energy|title=Amazon becomes the world’s largest corporate purchaser of renewable energy|date=2020-12-10|website=UK Day One Blog|language=en|access-date=2021-03-27}}</ref>
 
** Google, 24/7 carbon-free energy (real-time matching of supply and demand, without buying renewable generation certificates) by 2030<ref>{{Cite web|url=https://cloud.google.com/blog/topics/inside-google-cloud/announcing-round-the-clock-clean-energy-for-cloud/|title=Google Cloud aims for carbon-free energy for its data centers|website=Google Cloud Blog|language=en|access-date=2021-03-27}}</ref>.
 
** Microsoft, carbon negative by 2030<ref>{{Cite web|url=https://blogs.microsoft.com/blog/2020/01/16/microsoft-will-be-carbon-negative-by-2030/|title=Microsoft will be carbon negative by 2030|date=2020-01-16|website=The Official Microsoft Blog|language=en-US|access-date=2021-03-27}}</ref>.
 
   
 
 
== Processing Units ==
 
GPUs are the dominant processing units for training and running deep learning models, alongside specialised accelerators such as Google's TPUs.
   
 
== Background Readings ==
 
*'''Efforts to quantify the CO2 impact of ML'''<ref name=":0" /><ref>{{Cite journal|last=Henderson|first=Peter|last2=Hu|first2=Jieru|last3=Romoff|first3=Joshua|last4=Brunskill|first4=Emma|last5=Jurafsky|first5=Dan|last6=Pineau|first6=Joelle|date=2020-01-31|title=Towards the Systematic Reporting of the Energy and Carbon Footprints of Machine Learning|url=http://arxiv.org/abs/2002.05651|journal=arXiv:2002.05651 [cs]}}</ref>.
 
   
 
== Online Courses and Course Materials ==
 
   
 
== Libraries and Tools ==
 
*'''ML CO2 Impact''': A tool to calculate Machine Learning CO2 emissions, available [https://mlco2.github.io/impact/ here].
 
*'''Experiment Impact Tracker''': Another CO2 emissions calculator, providing information about power draw from CPU and GPU, hardware details, Python package versions, estimated carbon emissions, and, in California, real-time carbon emission information, available [https://github.com/Breakend/experiment-impact-tracker here].
 
*'''Microsoft Emissions Impact Dashboard''': A tool by Microsoft to track carbon emissions related to Microsoft cloud services usage, available [https://www.microsoft.com/en-us/sustainability/emissions-impact-dashboard here].
 
* '''CodeCarbon''': A software package that integrates into a Python codebase to estimate the amount of carbon dioxide produced by the cloud or personal computing resources used to execute the code, available [https://codecarbon.io/ here].
 
   
 
== Data ==
 
 
* '''Hugging Face Model Cards''': Several model cards of trained Hugging Face models report the amount of CO2 emitted during training, available [https://huggingface.co/models?other=co2_eq_emissions here].
 
   
 
== References ==
 
<references />