Data centers built to train and run AI may soon contain millions of chips, cost hundreds of billions of dollars, and consume as much power as a large city, if current trends hold, according to a new study from researchers at Georgetown, Epoch AI, and Rand.

The co-authors compiled and analyzed a dataset of over 500 AI data center projects and found that, while the computational performance of data centers is more than doubling annually, so are their power requirements and capital expenditures. The findings illustrate the challenge of building the infrastructure needed to support AI development over the coming decade.

According to the study, the hardware costs for AI data centers like xAI's Colossus, which has a price tag of around $7 billion, increased 1.9x each year between 2019 and 2025, while power needs climbed 2x annually over the same period.

The study also found that data centers have become far more energy efficient over the last five years, with one key metric, computational performance per watt, improving 1.34x each year from 2019 to 2025. Yet these gains won't be enough to offset growing power demand. By June 2030, the leading AI data center may contain 2 million AI chips, cost $200 billion, and require 9 GW of power, roughly the output of nine nuclear reactors.
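The 2030 projection follows from simple compounding of those annual growth rates. The Python sketch below reproduces the arithmetic under assumed 2025 baselines, roughly Colossus-scale: about $7 billion in hardware and about 300 MW of power, both of which are rough public estimates introduced here for illustration rather than figures from the study itself.

```python
# A minimal sketch of the compounding implied by the study's growth rates.
# Baseline figures for the 2025 leading cluster (~xAI's Colossus) are
# assumptions: ~$7B in hardware and ~300 MW of power are rough public
# estimates, not numbers taken from the study.

COST_2025_USD = 7e9    # assumed hardware cost of today's leading cluster
POWER_2025_MW = 300    # assumed power draw of today's leading cluster

COST_GROWTH = 1.9      # annual cost multiplier reported by the study
POWER_GROWTH = 2.0     # annual power multiplier reported by the study
YEARS = 5              # mid-2025 through June 2030

cost_2030 = COST_2025_USD * COST_GROWTH ** YEARS
power_2030_mw = POWER_2025_MW * POWER_GROWTH ** YEARS

print(f"Projected cost:  ${cost_2030 / 1e9:.0f}B")      # ~ $173B
print(f"Projected power: {power_2030_mw / 1000:.1f} GW")  # ~ 9.6 GW
```

Five doublings of power (2^5 = 32x) take a ~300 MW facility to roughly 9.6 GW, which is how a figure in the 9 GW range falls out of a 2x annual growth rate; the cost extrapolation lands at about $170 billion, the same order as the study's $200 billion projection.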