Artificial intelligence technology is booming, but it comes at a huge cost: soaring electricity usage.
According to The New Yorker, OpenAI's famous chatbot, ChatGPT, gulps down over half a million kilowatt-hours of electricity each day. That's a whopping 17,241 times the average American home's daily consumption of just 29 kilowatt-hours (based on 2022 data).
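That multiplier follows directly from the two figures quoted above; a quick back-of-the-envelope check in Python (the 500,000 kWh and 29 kWh values come from the article, nothing else is assumed):

```python
# Sanity-check the "17,241 times" comparison using the article's figures.
chatgpt_daily_kwh = 500_000   # reported daily electricity use for ChatGPT
us_home_daily_kwh = 29        # average US home per day (2022 data)

ratio = chatgpt_daily_kwh / us_home_daily_kwh
print(f"ChatGPT uses roughly {ratio:,.0f}x a typical home's daily electricity")
```

Dividing the two numbers yields about 17,241, matching the article's claim.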
Why does AI need so much juice? The computer systems and GPUs running these advanced AI models are incredibly energy-hungry. A single AI server can easily gobble up as much electricity as over a dozen UK households combined. No wonder the numbers add up alarmingly fast.
If AI capabilities like ChatGPT get integrated into massively popular services like Google Search, the energy drain could reach staggering levels. Data scientist Alex de Vries estimates Google would need around 29 billion kilowatt-hours per year - more than entire countries such as Kenya consume annually.
Calculating AI's total power usage isn't easy, though. The tech giants driving the AI boom tend to keep energy data under wraps. Still, de Vries made a rough estimate using public figures from chipmaker Nvidia, which supplies around 95% of the processors used for AI work.
De Vries' analysis, published in the journal Joule, projects the whole AI industry could require between 85 and 134 terawatt-hours annually by 2027. For perspective, the upper end would account for roughly 0.5% of global electricity consumption - from AI alone!
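The 0.5% share can be checked the same way. The 85-134 TWh range is from the article; the global consumption figure of roughly 26,000 TWh per year is an outside approximation, not a number the article states:

```python
# Rough check of the "up to 0.5% of global electricity" claim.
# Global annual consumption of ~26,000 TWh is an assumed approximation.
ai_low_twh, ai_high_twh = 85, 134   # projected AI demand by 2027 (from the article)
global_twh = 26_000                 # assumed global electricity consumption

low_share = ai_low_twh / global_twh * 100
high_share = ai_high_twh / global_twh * 100
print(f"AI's projected share of global electricity: {low_share:.2f}%-{high_share:.2f}%")
```

The high end works out to about 0.5%, consistent with the article's framing.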
As AI capabilities explode, so does the environmental cost. Tackling AI's ravenous energy needs must become a top priority. Sustainable practices and increased efficiency will be crucial to keep AI's emissions under control.
Image: DIW-Aigen
by Asim BN via Digital Information World