Climate Impact of AI

AI and machine learning have a climate impact from the energy used at every stage, from development to deployment, as well as from the embodied energy in hardware, from GPUs to data centres. The majority of emissions come from CO2 released in generating the electricity that powers AI workloads. The carbon intensity of any particular project depends heavily on its location and the electricity mix it consumes.
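The dependence on location can be illustrated with a back-of-the-envelope calculation: emissions are roughly energy used multiplied by the grid's carbon intensity. The intensity figures below are illustrative assumptions for the sketch, not measured values for any real grid.

```python
# Illustrative grid carbon intensities in gCO2 per kWh (assumed, not measured).
GRID_INTENSITY_G_PER_KWH = {
    "coal_heavy_grid": 800,
    "average_grid": 475,
    "low_carbon_grid": 50,
}

def training_emissions_kg(energy_kwh: float, grid: str) -> float:
    """Estimated CO2 emissions (kg) = energy (kWh) x intensity (g/kWh) / 1000."""
    return energy_kwh * GRID_INTENSITY_G_PER_KWH[grid] / 1000.0

# The same hypothetical 10 MWh training run varies ~16x with location:
for grid in GRID_INTENSITY_G_PER_KWH:
    print(grid, round(training_emissions_kg(10_000, grid)), "kg CO2")
```

Under these assumed intensities, the identical workload emits 8,000 kg of CO2 on a coal-heavy grid but only 500 kg on a low-carbon one.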

Several efforts have been made to quantify the CO2 impact of ML, and tools and best-practice recommendations have been developed.

Models and training
Many recent deep learning projects have used increasingly large computational resources to train their models.

The compute used in the largest model training runs was shown to have doubled every 3.4 months between 2012 and 2018.
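A fixed doubling period implies exponential growth, which a one-line formula makes concrete (the 3.4-month figure is from the source; the 12-month example is ours):

```python
def compute_growth(months: float, doubling_period_months: float = 3.4) -> float:
    """Growth factor implied by a fixed doubling period: 2^(months / period)."""
    return 2 ** (months / doubling_period_months)

# A single year of 3.4-month doublings is already more than an 11x increase.
print(round(compute_growth(12), 1))
```

Over the full 2012-2018 window the same formula compounds into a many-orders-of-magnitude increase, which is what makes the trend's energy implications notable.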

Inference
Some estimates suggest that the vast majority of the energy (80-90%) used by deep learning models is spent on inference rather than training.
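This share has a simple arithmetic consequence: if inference is 90% of lifetime energy, total energy is ten times the training energy alone. A minimal sketch, with made-up energy figures:

```python
def lifetime_energy_kwh(training_kwh: float, inference_share: float) -> float:
    """Total lifecycle energy if inference accounts for a given share (0-1).

    training_kwh is the remaining (1 - inference_share) of the total.
    """
    return training_kwh / (1.0 - inference_share)

# Hypothetical 1 MWh training run with inference at 90% of lifetime energy:
print(round(lifetime_energy_kwh(1_000, 0.9)), "kWh total")
```

The point of the sketch is that reported training costs can substantially understate a model's full energy footprint once it is deployed.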

Efficiency gains
Redundant training

Experiment tracking

Smart hyperparameter selection

Data Centres
Data centres accounted for almost 1% of global electricity demand in 2019, at around 200 TWh. While demand is increasing, efficiency gains mean this figure may remain flat for now. AI's total impact can currently be estimated as a fraction of this, though it is growing rapidly. AI may itself offer efficiency gains for data centres by optimising control systems.

Estimates of electricity use of AI

Cloud comparison
The major cloud computing providers, Amazon, Google and Microsoft, have varying targets and carbon intensities for their services.

Google now publishes hourly estimates of the proportion of carbon-free energy (CFE) and the carbon intensity for all its cloud regions.
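Published per-region CFE data makes it possible to factor carbon into region selection. The sketch below assumes hypothetical region names and CFE fractions purely for illustration; real values come from Google's published data and vary over time.

```python
# Hypothetical carbon-free energy (CFE) fractions per cloud region.
# Real figures are published by the provider and change hourly.
REGION_CFE = {
    "region-hydro-north": 0.91,
    "region-central": 0.45,
    "region-coal-south": 0.04,
}

def greenest_region(cfe_by_region: dict) -> str:
    """Pick the region with the highest carbon-free energy fraction."""
    return max(cfe_by_region, key=cfe_by_region.get)

print(greenest_region(REGION_CFE))  # → region-hydro-north
```

For workloads that are not latency-sensitive, scheduling them into higher-CFE regions (or hours) is one of the simplest levers for reducing a project's carbon intensity.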

Targets
AWS: 100% renewable energy for its data centres by 2030, on track to reach this by 2025.

Google: 24/7 carbon-free energy (matching supply and demand in real time, without buying renewable generation certificates) by 2030.

Microsoft: carbon negative by 2030.

Processing Units
GPUs

Development of other chip types: TPUs, IPUs, etc.