AI Compute Growth Simulator
Explore the implications of exponential AI infrastructure growth: global AI compute capacity is currently doubling roughly every seven months.
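A doubling time converts to an annual growth multiple via 2^(12 / months). A minimal sketch checking the ~7-month doubling time quoted above (the function name is ours, not part of the simulator):

```python
def annual_growth(doubling_months: float) -> float:
    """Annual multiplier implied by a compute doubling time given in months."""
    return 2.0 ** (12.0 / doubling_months)

print(round(annual_growth(7), 1))  # ~7-month doubling -> roughly 3.3x per year
```

This matches the 3.3× annual growth rate listed under Key Insights.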
Simulation Parameters
Key Metrics
Historical AI Compute Capacity (Epoch AI Data)
Cumulative global AI compute in H100-equivalents by manufacturer
Compute Capacity Growth
Scenario Comparison
Infrastructure Investment
Power Consumption
Real-World Implications
Nuclear Plants Equivalent
480
1 GW nuclear plants needed to power AI infrastructure
Concurrent GPT-4 Training
48,000
Simultaneous frontier model training runs possible
Global Electricity Share
Of world electricity consumption
CO₂ Emissions
1.2B tons
Annual carbon footprint (at global grid average)
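The 1.2B-ton figure is consistent with running the projected 336 GW load year-round at a global-average grid intensity of roughly 0.42 kg CO₂ per kWh; the intensity value is an illustrative assumption, not simulator data:

```python
POWER_GW = 336               # projected AI power draw (from Key Insights below)
GRID_KG_CO2_PER_KWH = 0.42   # assumed global grid-average carbon intensity

annual_kwh = POWER_GW * 1e6 * 8766              # GW -> kW, times hours per year
annual_tons = annual_kwh * GRID_KG_CO2_PER_KWH / 1000  # kg -> metric tons
print(f"{annual_tons / 1e9:.1f}B tons CO2/yr")  # ~1.2B tons
```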
Key Insights
• At current growth rates, compute capacity will reach 480M H100e units in 5 years
• This represents a 32× increase from today's capacity
• Growth rate: 3.3× per year
• Total infrastructure investment needed: $12T
• Power requirements: 336 GW (equivalent to 336 one-gigawatt nuclear plants)
• Enables approximately 48,000 concurrent large model training runs
• Requires building 3,360 new hyperscale AI datacenters
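The power, datacenter, and training-run figures above follow from simple per-unit assumptions. A sketch, assuming ~700 W per H100-equivalent (the H100's TDP), ~100 MW per hyperscale datacenter, and ~10,000 H100e tied up per frontier training run; the last two are illustrative assumptions, not Epoch AI data:

```python
H100E_WATTS = 700                # assumed draw per H100-equivalent (H100 TDP)
DATACENTER_MW = 100              # assumed capacity of one hyperscale AI datacenter
H100E_PER_TRAINING_RUN = 10_000  # assumed H100e occupied by one frontier run

units = 480_000_000              # projected H100-equivalents in 5 years

power_gw = units * H100E_WATTS / 1e9
datacenters = power_gw * 1000 / DATACENTER_MW
training_runs = units / H100E_PER_TRAINING_RUN

print(f"{power_gw:.0f} GW, {datacenters:.0f} datacenters, {training_runs:.0f} runs")
# 336 GW, 3360 datacenters, 48000 runs
```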
AI Chip Landscape (Current)
Based on Epoch AI research; compute capacity in H100-equivalents