Gen AI: Is a Power Crisis Looming Large?

Aditya Krishnan
4 min read · Aug 22, 2024


Grids and networks are turning into a battlefield as they struggle to manage the massive load

Image source: www.gigabyte.com

There are around eight thousand data centers across the globe, yet even that is not enough to keep up with the energy requirements of generative AI. Training a single large language model can produce as much CO2 as five cars emit over their entire lifetimes. Because power is generated to match demand, an aging grid is increasingly unable to transmit it to where it is needed, which is why data centers are being built closer to where the power is generated.

Demand has never been higher for these racks of powerful servers, feeding the internet’s insatiable appetite for computing in the cloud.

Thanks to the generative AI race, the demand for power to run data centers and to cool them has gone through the roof.

One ChatGPT query takes nearly ten times as much energy as a typical Google search, roughly as much as keeping a five-watt LED bulb on for an hour. Generating an AI image can use as much power as charging your smartphone. With the grid aging, the strain is showing: during peak summer demand, if data centers don't reduce their draw, there could be blackouts.
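A quick back-of-envelope sketch makes the comparison concrete. The per-search and per-query figures below are assumed ballpark estimates, commonly cited rather than official measurements from Google or OpenAI:

```python
# Back-of-envelope sanity check on the per-query figures above.
# All numbers are illustrative assumptions, not measured values.

google_search_wh = 0.3                      # assumed energy per Google search, in watt-hours
chatgpt_query_wh = 10 * google_search_wh    # "nearly ten times" a search -> ~3 Wh

led_power_w = 5.0                           # a 5 W LED bulb
led_one_hour_wh = led_power_w * 1           # 5 W running for one hour = 5 Wh

print(f"ChatGPT query  : ~{chatgpt_query_wh:.0f} Wh")
print(f"5 W LED, 1 hour: {led_one_hour_wh:.0f} Wh")
# Both land in the same single-digit watt-hour range, which is the point
# of the comparison: one query is tiny, but billions of them add up.
```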

Gen AI models consume tons of power. These enormous LLMs, powered by gargantuan server fleets, require massive amounts of energy to train, process and infer.

A focus on sustainability and environmental impact cannot be mere lip service. The energy needed to operate a system like ChatGPT for a single day could power some 33,000 households. The AI frenzy has data-center demand rising 15 to 20% every year, a pace that may hold through 2030. According to one report, data centers could account for 16% of total US power consumption by 2030, up from just 2.5% before ChatGPT arrived in 2022.
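To see why 15 to 20% annual growth matters, a small sketch can compound it out to 2030. The growth rates come from the paragraph above; the baseline year and the idea of indexing demand to 2022 are assumptions made purely for illustration:

```python
# Rough illustration of what "15 to 20% growth every year until 2030"
# implies for data-center demand (baseline year assumed for illustration).

baseline_year, target_year = 2022, 2030
years = target_year - baseline_year          # 8 years of compounding

for annual_growth in (0.15, 0.20):
    multiple = (1 + annual_growth) ** years
    print(f"{annual_growth:.0%} per year for {years} years "
          f"-> demand roughly {multiple:.1f}x the {baseline_year} level")

# ~3.1x at 15% and ~4.3x at 20%: compounding is what turns a modest
# 2.5% share of US power consumption into a double-digit share by 2030.
```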

Firms that ran on-premise servers are grappling with how to handle the load and now see remote data centers as their only real option, against a post-Covid backdrop in which the shift to the cloud had to be accelerated almost overnight.

For example, Microsoft's emissions surged by roughly 30% between 2020 and 2024, driven by data centers designed and optimized to support AI workloads. Power projects for conventional industries are being put on hold in order to prioritize keeping AI applications up and running.

A shift towards renewable sources of energy is on the anvil, but a comprehensive roadmap is the need of the hour. Tech firms are exploring ways to generate power on their own to run the data centers under their purview. OpenAI and Google are already striking partnerships to explore alternative forms of power generation and have started work in this direction, and data centers themselves are beginning to generate their own power to reduce dependence on conventional sources.

In an area of northern Virginia known as Data Center Alley, servers process an estimated 70% of the world's internet traffic each day. At one point in 2022, the power company there had to pause new data center connections as it struggled to keep up with demand. During peak hours it must either ask residents to turn off their air conditioning or ask AI companies to pause their training runs. Load shedding is becoming the norm rather than the exception.

The bottleneck often occurs in getting power from the site where it’s generated to where it’s consumed. One solution is to add hundreds or even thousands of miles of new transmission lines. Another solution is to use predictive software to reduce failures at one of the grid’s weakest points, the transformers.
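As a rough illustration of what such predictive software might do, the sketch below flags a transformer whose readings drift far outside their recent range. The sensor data, window size and threshold are all hypothetical assumptions for illustration, not how any real grid operator's system works:

```python
# Minimal sketch: flag a transformer whose temperature deviates strongly
# from its own recent history. Real predictive-maintenance systems are
# far more sophisticated; this only illustrates the idea.

from statistics import mean, stdev

def flag_anomalies(readings, window=24, z_threshold=3.0):
    """Return indices where a reading deviates strongly from the trailing window."""
    alerts = []
    for i in range(window, len(readings)):
        history = readings[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(readings[i] - mu) / sigma > z_threshold:
            alerts.append(i)
    return alerts

# Hypothetical hourly oil-temperature readings for one transformer (deg C),
# steady for two days and then spiking at the end.
temps = [62 + (i % 5) * 0.4 for i in range(48)] + [71.0, 73.5]
print("Anomalous hours:", flag_anomalies(temps))
```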

A large reason we need more power, and a more reliable grid, for generative AI is to keep its servers cool. The servers generate an immense amount of heat, and cooling them with air or water keeps them from overheating so they can keep running around the clock.

By 2027, AI is projected to withdraw more water annually than four times Denmark's total yearly withdrawal. Parts of the industry rely on technologies that consume water to cool data centers through evaporative cooling.

The key alternative to those power-hungry AMD x86 cores is specialized ARM-based processors. ARM made its name building low-power chips that maximized the battery life of early mobile phones.

With the advent of Gen AI, the focus on sustainability and the environment should not take a backseat. The proliferation of AI-enabled applications and the investment pouring into AI are powering growth and driving surging demand on every front, while the challenge of securing the basic essentials of power, water and grid capacity remains paramount.

Topics like sustainability and environmental protection cannot merely remain on paper; it is time to wake up and smell the coffee as we ride the Gen AI wave.

About The Author:

Aditya Krishnan is a Senior Digital Strategist & Consultant with rich experience across different sectors and practices at a global level. He has authored several publications on Digital Marketing, GenAI, Analytics, Transformation and Technology across portals. He is also invited as a guest speaker by leading B-schools, technology institutes and industry events. As a social initiative, he has developed a knowledge platform, 'Insights From Aditya', on LinkedIn to share the latest insights on various digital, technology and marketing topics. He is a reputed author, poet and avid researcher. Aditya has authored "A String of Pearls", a collection of his best poems.

Connect on LinkedIn: https://www.linkedin.com/in/adityasrkrishnan

Follow on Twitter: https://twitter.com/adityaskrishnan
