
""I've been a computer scientist for 40 years, and for most of that time computing was the tiniest piece of our economy's power use," Chien told Fortune. "Now it's becoming a large share of what the whole economy consumes.""
""It's scary because computing was always the tiniest piece of our economy's power use," he said. "Now it could be 10% or 12% of the world's power by 2030. We're coming to some seminal moments for how we think about AI and its impact on society.""
""It's pretty amazing," Chien said. "A year-and-a-half ago they were talking about five gigawatts. Now they've upped the ante to 10, 15, even 17. There's an ongoing escalation.""
OpenAI and NVIDIA announced plans to build AI data centers consuming up to 10 gigawatts, with additional projects totaling 17 gigawatts already in motion. A single corporate project could consume more electricity in a day than two major American cities at peak demand: New York City draws about 10 gigawatts in summer, and San Diego exceeded 5 gigawatts during a 2024 heat wave. Experts warn computing's share of global power may rise to 10–12% by 2030, marking an escalation from earlier five-gigawatt estimates and raising concerns about grid strain and societal impact.
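A quick back-of-envelope check of those figures, sketched in Python: the sustained-load assumption for the data centers is mine, not from the article, but it shows why 17 gigawatts of projects outweighs the two cities' combined peaks in daily energy terms.

```python
# Back-of-envelope comparison using the gigawatt figures quoted above.
# Assumption (mine): the data center projects draw their rated load
# around the clock, which is typical for AI training facilities.
HOURS_PER_DAY = 24

def daily_gwh(gigawatts: float, hours: float = HOURS_PER_DAY) -> float:
    """Energy in GWh for a load of `gigawatts` sustained for `hours`."""
    return gigawatts * hours

projects_gw = 17.0        # announced AI data center projects, total
nyc_peak_gw = 10.0        # New York City summer peak
san_diego_peak_gw = 5.0   # San Diego during a 2024 heat wave

cities_gw = nyc_peak_gw + san_diego_peak_gw

print(f"AI projects:     {daily_gwh(projects_gw):6.0f} GWh/day")
print(f"NYC + San Diego: {daily_gwh(cities_gw):6.0f} GWh/day (if held at peak all day)")
# 408 GWh/day vs 360 GWh/day. City demand only touches its peak for a
# few hours, while data centers run near-flat, so the real gap in
# daily consumption is wider than this conservative comparison shows.
```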
Read at Fortune