In the glitzy world of tech, Nvidia is making headlines not just for its cutting-edge H100 AI GPUs but also for the staggering amount of power they consume. Each of these GPUs has a peak power draw of 700W, and at high utilization a single unit can burn through roughly as much electricity in a year as an average American household; taken together, the fleet's appetite is forecast to rival that of a major American city, or even to surpass the combined consumption of some small European countries. Paul Churnock, an engineer at Microsoft, predicts that by late 2024, when millions of these GPUs are in operation, their combined consumption will exceed that of all households in Phoenix, Arizona. By then, these power-hungry chips could rank as the fifth-largest electricity consumer among U.S. cities, slotting in between the energy appetites of Houston and Phoenix.
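To see how such comparisons come together, here is a back-of-the-envelope sketch. Only the 700W peak figure comes from the article; the fleet size and utilization factor below are illustrative assumptions, not reported numbers.

```python
# Back-of-the-envelope estimate of H100 fleet electricity use.
# PEAK_WATTS is from the article; FLEET_SIZE and UTILIZATION are
# hypothetical values chosen purely for illustration.

PEAK_WATTS = 700            # H100 peak power draw (from the article)
HOURS_PER_YEAR = 24 * 365   # 8,760 hours

FLEET_SIZE = 3_500_000      # assumed number of deployed H100s
UTILIZATION = 0.61          # assumed average utilization factor

# Annual energy per GPU, first at constant peak draw, then scaled
# by the assumed utilization.
kwh_per_gpu_peak = PEAK_WATTS / 1000 * HOURS_PER_YEAR   # ~6,132 kWh
kwh_per_gpu = kwh_per_gpu_peak * UTILIZATION            # ~3,741 kWh

# Fleet-wide total, converted from kWh to GWh.
fleet_gwh = kwh_per_gpu * FLEET_SIZE / 1e6              # ~13,092 GWh

print(f"Per GPU at peak:  {kwh_per_gpu_peak:,.0f} kWh/year")
print(f"Per GPU adjusted: {kwh_per_gpu:,.0f} kWh/year")
print(f"Fleet total:      {fleet_gwh:,.0f} GWh/year")
```

Under these assumed inputs, a single GPU's annual draw lands in the same ballpark as a typical American household's roughly 10,000 kWh per year, and a multi-million-unit fleet reaches thousands of GWh, which is the scale at which city-level comparisons start to make sense.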
However, amid the hand-wringing over power consumption, there is an important kernel of optimism. Steady progress in AI and HPC GPU efficiency is poised to temper these staggering energy demands. Even though Nvidia's upcoming Blackwell-based B100 is expected to draw even more power than the H100, it also promises substantially higher performance, delivering more work for each unit of power consumed.