Climate Impact Of AI

Many recent deep learning projects have used increasingly large computational resources to create their models[x, y, z].


The compute resources used for the largest model training runs were shown to have doubled every 3.4 months from 2012 to 2018<ref>{{Cite web|url=https://openai.com/blog/ai-and-compute/|title=AI and Compute|date=2018-05-16|website=OpenAI|language=en|access-date=2021-03-27}}</ref>.
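
The growth implied by a 3.4-month doubling time can be illustrated with a short back-of-the-envelope calculation (an illustrative sketch, not a figure taken from the cited post):

<syntaxhighlight lang="python">
# Back-of-the-envelope sketch: what a 3.4-month doubling time implies
# for annual growth in training compute (illustrative only).
doubling_time_months = 3.4

doublings_per_year = 12 / doubling_time_months
annual_growth = 2 ** doublings_per_year

print(f"{doublings_per_year:.2f} doublings per year "
      f"-> roughly {annual_growth:.0f}x more compute each year")
# about 3.5 doublings per year, i.e. training compute grows by
# roughly an order of magnitude every year
</syntaxhighlight>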


==Training==


==Inference==
Some estimates suggest that the vast majority of the energy (80–90%) used by deep learning models is spent on inference<ref>{{Cite web|url=https://www.forbes.com/sites/moorinsights/2019/05/09/google-cloud-doubles-down-on-nvidia-gpus-for-inference/|title=Google Cloud Doubles Down On NVIDIA GPUs For Inference|author=Moor Insights and Strategy|website=Forbes|language=en|access-date=2021-03-27}}</ref><ref>{{Cite web|url=https://youtu.be/ZOIkOnW640A?t=5327|title=AWS re:Invent 2018 - Keynote with Andy Jassy|last=Jassy|first=Andy|website=YouTube|access-date=2021-03-27}}</ref>.
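
As an illustrative sketch (using a hypothetical training-energy figure, not a number taken from the sources above), an 80–90% inference share implies that a model's lifetime inference energy is roughly 4–9 times the energy spent on training:

<syntaxhighlight lang="python">
# Illustrative sketch with a hypothetical training-energy figure (not from
# the sources cited above): if inference accounts for 80-90% of a model's
# total energy use, lifetime inference energy dwarfs the training energy.
training_energy_kwh = 1_000.0  # hypothetical training cost

for inference_share in (0.80, 0.90):
    total_kwh = training_energy_kwh / (1 - inference_share)
    inference_kwh = total_kwh * inference_share
    ratio = inference_kwh / training_energy_kwh
    print(f"inference share {inference_share:.0%}: "
          f"total ~ {total_kwh:,.0f} kWh, "
          f"inference ~ {inference_kwh:,.0f} kWh ({ratio:.0f}x training)")
# 80% share -> inference uses 4x the training energy; 90% -> 9x
</syntaxhighlight>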


==References==
<references />