Nvidia's solution to the AI energy problem is mini data centers next to local power substations and, of course, selling even more GPUs
That something as superficially straightforward as plain old energy supply is turning out to be a major limiting factor in the AI revolution is one of those delightful unexpected quirks that keeps us all on our toes. Who'd have thought that in just a few short years the world would be spooling up more GPUs than the power grid can cope with?
One possible partial solution, apparently, is moving those GPUs nearer to the actual power supply. This is something Nvidia and a gang of collaborators plan to pilot later this year in the form of roughly 25 small data centers located next to power substations in the USA.
Now, at first glance, you might think this is a bit of a zero-sum game. How would putting GPUs nearer the actual power supply reduce the amount of power required?
Nope, it's not reduced losses from shorter cables or anything like that. Indeed, it's not about reducing power consumption at all. Instead, the idea is load balancing.
In other words, across a group of substations power load will vary, some experiencing heightened demand, others less so. What Nvidia and its partners plan to do is ramp up compute at the small data centers near substations under lesser load while trimming it back at stations under heavier load.
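To make the load-balancing idea concrete, here's a hypothetical sketch in Python. The site names, capacity figures, and the fill-the-biggest-headroom-first policy are all made up for illustration; Nvidia hasn't published how its scheduling actually works.

```python
def allocate_compute(headroom_mw: dict[str, float], total_mw: float) -> dict[str, float]:
    """Split a compute budget (in MW) across sites, sending work to the
    substations with the most spare capacity first, capped at each
    substation's headroom."""
    alloc = {}
    remaining = total_mw
    # Visit substations in descending order of spare capacity.
    for site, spare in sorted(headroom_mw.items(), key=lambda kv: -kv[1]):
        take = min(spare, remaining)
        alloc[site] = take
        remaining -= take
    return alloc

# Three imaginary substations with 20, 5 and 10 MW of headroom,
# and 25 MW of GPU load to place:
print(allocate_compute({"sub_a": 20, "sub_b": 5, "sub_c": 10}, 25))
# → {'sub_a': 20, 'sub_c': 5, 'sub_b': 0}
```

As demand shifts, re-running the allocation with updated headroom figures is what "ramping up" one mini data center while "trimming back" another amounts to.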

It also allows spare capacity of, say, five megawatts at a local substation to be used—capacity that wouldn't normally be even close to sufficient for a larger data center. “There are 55,000 substations in the U.S., and if they each have five, 10, or 20 megawatts of spare capacity, that number adds up pretty fast,” says Marc Spieler, senior director of energy at Nvidia.
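Spieler's "adds up pretty fast" claim is easy to sanity-check with back-of-the-envelope arithmetic. The only inputs here are the figures from his quote:

```python
# Aggregate spare capacity across US substations at the per-site
# levels quoted by Nvidia's Marc Spieler. Purely illustrative.
SUBSTATIONS = 55_000

for spare_mw in (5, 10, 20):
    total_gw = SUBSTATIONS * spare_mw / 1_000  # MW -> GW
    print(f"{spare_mw} MW each -> {total_gw:,.0f} GW in aggregate")
```

Even the low end works out to 275 GW in aggregate—a very large number, though of course scattered across 55,000 sites rather than available in one place.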
Call me a cynic, but it's easy to see why Nvidia thinks this is a super idea. After all, it inherently involves building even more GPUs than you'd normally need for a given level of compute performance from a large monolithic data center, all just to build in the redundancy required to spool up some of the smaller data centers while winding down others.
In short, it's funny how just about every solution when it comes to AI involves Nvidia selling even more GPUs. Kerching!
Anywho, according to EPRI (the Electric Power Research Institute), data centers could be guzzling up to 17% of US electricity generation by 2030, more than double current usage. So something will have to be done to accommodate that huge spike in power consumption. Likely the only certainty is that whatever the solution, it'll involve more Nvidia GPUs.