Interactive Visualization

AI Compute Growth Simulator

Explore the implications of exponential AI infrastructure growth. Global AI compute capacity is doubling every ~7 months.

Inspired by research from Epoch AI
• 3.3× growth per year
• ~7-month doubling time
• 15M+ H100e current capacity
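The headline figures are two views of the same exponential: a doubling time of d months implies an annual multiple of 2^(12/d). A minimal sketch of the conversion (function names are my own):

```python
import math

def annual_growth(doubling_months: float) -> float:
    """Annual growth multiple implied by a doubling time in months."""
    return 2 ** (12 / doubling_months)

def doubling_time_months(annual_multiple: float) -> float:
    """Doubling time in months implied by an annual growth multiple."""
    return 12 * math.log(2) / math.log(annual_multiple)
```

`annual_growth(7)` gives ≈3.28, matching the 3.3×-per-year headline.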

Simulation Parameters

• Doubling time: 3 months (aggressive) to 24 months (conservative)
• Time horizon: 1 to 10 years
• H100e units: 1M to 50M
• Cost per H100e: $15k to $40k
• Power per H100e: 300 W to 1,000 W
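The sliders above fully determine the Key Metrics panel. A sketch of the core projection, assuming pure exponential growth and that investment covers only the newly added units; the $25k cost and 1,000 W draw used in the example below are illustrative picks from within the slider ranges, not values stated by the page:

```python
def simulate(start_units: float, doubling_months: float, years: float,
             cost_per_unit: float, watts_per_unit: float) -> dict:
    """Project capacity, spend, and power from the slider settings."""
    doublings = years * 12 / doubling_months
    end_units = start_units * 2 ** doublings
    peak_power_gw = end_units * watts_per_unit / 1e9
    return {
        "total_growth": end_units / start_units,
        "end_units": end_units,
        # spend only on the units added over the horizon (assumption)
        "investment_usd": (end_units - start_units) * cost_per_unit,
        "peak_power_gw": peak_power_gw,
        # 100 MW hyperscale facilities needed to host the fleet
        "datacenters_100mw": peak_power_gw * 1e3 / 100,
    }
```

With `simulate(15e6, 12, 5, 25_000, 1_000)` the panel's round numbers fall out: 32× growth, 480M units, ≈$11.6T of spend, 480 GW, and 4,800 datacenters.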

Key Metrics

• Total growth: 32×
• Annual growth rate: 3.3×
• Cumulative investment: $12T
• Peak power draw: 480 GW
• Datacenters (100 MW): 4,800

Historical AI Compute Capacity (Epoch AI Data)

Cumulative global AI compute in H100-equivalents by manufacturer

• Latest total: --
• Observed doubling time: --
• Data range: Q1 2022 – Q3 2025
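The chart's observed doubling time can be estimated by a least-squares fit of log₂(capacity) against quarter index: the slope is doublings per quarter. A sketch with a synthetic series (the real series comes from Epoch AI's data):

```python
import math

def observed_doubling_months(totals: list[float]) -> float:
    """Fit log2(capacity) vs quarter index; slope = doublings per quarter."""
    xs = range(len(totals))
    ys = [math.log2(t) for t in totals]
    n = len(totals)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
             / sum((x - x_bar) ** 2 for x in xs))
    return 3 / slope  # 3 months per quarter

# synthetic quarterly series that doubles every 7 months (15 quarters,
# matching the Q1 2022 - Q3 2025 span)
series = [2 ** (3 * q / 7) for q in range(15)]
```

On this synthetic series the fit recovers ≈7.0 months, as expected.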

Charts (interactive dashboard): Compute Capacity Growth · Scenario Comparison · Infrastructure Investment · Power Consumption

Real-World Implications

• Nuclear plants equivalent: 480 (1 GW nuclear plants needed to power the AI infrastructure)

• Concurrent GPT-4-scale training: 48,000 simultaneous frontier model training runs possible

• Global electricity share: 1.6% of world electricity consumption

• CO₂ emissions: 1.2B tons annual carbon footprint (at the global grid average)
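Each implication card is a unit conversion from the projected fleet. A sketch, where the ~10,000 H100e per frontier training run and the 0.4 kg CO₂/kWh grid average are round assumptions of mine rather than values stated by the page:

```python
def implications(total_units: float, watts_per_unit: float = 1_000.0) -> dict:
    """Convert a fleet size (H100e units) into real-world equivalents."""
    power_gw = total_units * watts_per_unit / 1e9
    annual_twh = power_gw * 8.76  # GW -> TWh/yr (8,760 hours per year)
    return {
        "nuclear_plants_1gw": power_gw / 1.0,               # 1 GW per plant
        "concurrent_frontier_runs": total_units / 10_000,   # ~10k H100e per run (assumption)
        "co2_gigatons": annual_twh * 0.4 / 1_000,           # 0.4 kg CO2/kWh (assumption)
    }
```

At 1,000 W per unit, 480M H100e gives 480 plants and 48,000 runs; the 1.2 B-ton CO₂ card corresponds to a lower sustained draw of about 700 W per unit (336 GW × 8,760 h × 0.4 kg/kWh ≈ 1.18 Gt).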

Key Insights

• At current growth rates, compute capacity will reach 480M H100e units in 5 years

• This represents a 32× increase from today's capacity

• Implied annual growth in this scenario: roughly 2× per year (a 12-month doubling, since 32× over 5 years is 2⁵); the historical rate is 3.3× per year

• Total infrastructure investment needed: $12T

• Power requirements: 336 GW at 700 W per H100e, equivalent to 336 one-gigawatt nuclear plants (the Key Metrics figure of 480 GW assumes 1,000 W per unit)

• Enables approximately 48,000 concurrent large model training runs

• Requires building 3,360 new 100 MW hyperscale AI datacenters

AI Chip Landscape (Current)

Based on Epoch AI research - compute capacity in H100-equivalents

Nvidia

B300
B200
H100/H200
Other Nvidia

Google

TPU v7
TPU v6e
TPU v5e

Others

AMD MI300X
Amazon Trainium
Huawei Ascend