Climate Impact Of AI

AI and machine learning have a climate impact from the energy used at all stages from development to deployment, as well as from the embodied energy in hardware, from GPUs to data centres. The majority of emissions come from CO2 released by the electricity generation powering AI processes, so the carbon intensity of any particular project is highly dependent on its location and the electricity mix it consumes. Efforts to quantify the CO2 impact of ML have been undertaken<ref name=":0">{{Cite journal|last=Lacoste|first=Alexandre|last2=Luccioni|first2=Alexandra|last3=Schmidt|first3=Victor|last4=Dandres|first4=Thomas|date=2019-10-21|title=Quantifying the Carbon Emissions of Machine Learning|url=https://arxiv.org/abs/1910.09700|journal=arXiv:1910.09700 [cs, stat]}}</ref><ref>{{Cite journal|last=Henderson|first=Peter|last2=Hu|first2=Jieru|last3=Romoff|first3=Joshua|last4=Brunskill|first4=Emma|last5=Jurafsky|first5=Dan|last6=Pineau|first6=Joelle|date=2020-01-31|title=Towards the Systematic Reporting of the Energy and Carbon Footprints of Machine Learning|url=http://arxiv.org/abs/2002.05651|journal=arXiv:2002.05651 [cs]}}</ref>, and tools and recommendations for best practice have been created<ref name=":1">{{Cite web|title=ML CO2 Impact|url=https://mlco2.github.io/impact/|url-status=live|access-date=2021-03-27}}</ref><ref>{{Cite web|title=Experiment Impact Tracker|url=https://github.com/Breakend/experiment-impact-tracker|url-status=live|access-date=2021-03-27}}</ref>.
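As a rough illustration of how such estimates are made, the sketch below follows the same general approach as calculators like ML CO2 Impact<ref name=":1" />: hardware power draw multiplied by runtime, data-centre overhead (PUE) and the carbon intensity of the local grid. Every input value here is an assumption chosen for the example, not a measurement.

<syntaxhighlight lang="python">
# Illustrative training-emissions estimate: energy use scaled by
# data-centre overhead (PUE), then converted to CO2e via the carbon
# intensity of the grid. All inputs are assumed values.

gpu_power_kw = 0.3      # assumed draw of one GPU (300 W)
num_gpus = 8            # assumed number of GPUs in the run
training_hours = 120    # assumed length of the training run
pue = 1.6               # assumed Power Usage Effectiveness of the data centre
grid_intensity = 0.475  # assumed grid mix, kg CO2e per kWh

energy_kwh = gpu_power_kw * num_gpus * training_hours * pue
emissions_kg = energy_kwh * grid_intensity

print(f"Energy: {energy_kwh:.0f} kWh, emissions: {emissions_kg:.0f} kg CO2e")
</syntaxhighlight>

Re-running the same numbers with a low-carbon grid mix (for example ~0.02 kg CO2e per kWh) shows why location dominates the result.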
 
== Deep Learning ==
A thoughtful blog post from W&B on deep learning and carbon emissions: https://towardsdatascience.com/deep-learning-and-carbon-emissions-79723d5bc86e
 
===Models and training===
Many recent projects in deep learning have used increasingly large computational resources to create models[x, y, z].
 
The compute used for the largest model training runs was shown to have doubled every 3.4 months from 2012 to 2018<ref>{{Cite web|url=https://openai.com/blog/ai-and-compute/|title=AI and Compute|date=2018-05-16|website=OpenAI|language=en|access-date=2021-03-27}}</ref>.
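A quick check of what that rate implies (the window below approximates the span of the cited analysis, from AlexNet in 2012 to AlphaGo Zero in 2017):

<syntaxhighlight lang="python">
# Growth implied by compounding a 3.4-month doubling time over the
# roughly five-year window covered by OpenAI's "AI and Compute" post.
months = 61          # approximate months from AlexNet to AlphaGo Zero
doubling_time = 3.4  # months per doubling
growth = 2 ** (months / doubling_time)
print(f"~{growth:,.0f}x increase in training compute")  # on the order of the ~300,000x reported
</syntaxhighlight>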
 
===Training===
Redundant training, where similar experiments are repeated unnecessarily, adds to the energy cost of developing models.
 
===Inference===
Some estimates suggest that the vast majority of the energy (80-90%) used by deep learning models is spent on inference rather than training<ref>{{Cite web|url=https://www.forbes.com/sites/moorinsights/2019/05/09/google-cloud-doubles-down-on-nvidia-gpus-for-inference/|title=Google Cloud Doubles Down On NVIDIA GPUs For Inference|last=Strategy|first=Moor Insights and|website=Forbes|language=en|access-date=2021-03-27}}</ref><ref>{{Cite web|url=https://youtu.be/ZOIkOnW640A?t=5327|title=AWS re:Invent 2018 - Keynote with Andy Jassy|last=Jassy|first=Andy|website=YouTube|url-status=live|access-date=2021-03-27}}</ref>.
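A toy model shows how such a split can arise: training is a one-off cost, while inference energy accrues for as long as the model serves queries. All of the numbers below are hypothetical, chosen only to illustrate the effect:

<syntaxhighlight lang="python">
# Toy lifetime-energy split between one training run and ongoing
# inference. Every value is hypothetical.

training_energy_kwh = 1_000    # one-off training cost (assumed)
energy_per_query_kwh = 0.0002  # energy per inference request (assumed)
queries_per_day = 100_000      # serving volume (assumed)
days_in_service = 365          # deployment lifetime (assumed)

inference_energy_kwh = energy_per_query_kwh * queries_per_day * days_in_service
total_kwh = training_energy_kwh + inference_energy_kwh
print(f"Inference share of lifetime energy: {inference_energy_kwh / total_kwh:.0%}")  # ~88%
</syntaxhighlight>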
 
=== Efficiency gains ===
* Avoiding redundant training
* Experiment tracking<ref>{{Cite web|url=https://towardsdatascience.com/deep-learning-and-carbon-emissions-79723d5bc86e|title=Deep Learning and Carbon Emissions|last=Biewald|first=Lukas|date=2019-06-24|website=Medium|language=en|access-date=2021-03-27}}</ref>
* Smart hyperparameter selection (see the sketch below)
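As one concrete example of smart selection, replacing an exhaustive grid search with a small random sample of configurations cuts the number of training runs, and therefore the energy used, substantially. The search space and budget below are invented for illustration:

<syntaxhighlight lang="python">
import itertools
import random

# Hypothetical hyperparameter space: grid search would need one
# training run per combination.
space = {
    "learning_rate": [1e-4, 3e-4, 1e-3, 3e-3],
    "batch_size": [32, 64, 128],
    "dropout": [0.0, 0.1, 0.3, 0.5],
}

grid = list(itertools.product(*space.values()))
budget = 10  # random-search budget: train only this many configurations
sampled = random.sample(grid, budget)

print(f"Grid search: {len(grid)} training runs")       # 48
print(f"Random search: {len(sampled)} training runs")  # 10
</syntaxhighlight>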
 
== Data Centres ==
* Estimates of the electricity use of AI
* Comparison of cloud providers
 
== Processing Units ==
* GPUs
* Development of other chip types: TPUs, IPUs, etc.
 
==References==
<references />