One day of ChatGPT's energy use could run a US home for 46.5 years
ChatGPT, currently the most popular AI chatbot in the world, consumes an incredible half a million kilowatt-hours daily to handle its 200 million user requests
Artificial intelligence (AI) has become the defining technology of the early 21st century, and its use has grown almost ubiquitous in the few years since it became publicly available.
But despite many of these services remaining free, their use carries a dangerous hidden cost: an insatiable hunger for electricity, says the Times of India.
ChatGPT, currently the most popular AI chatbot in the world, consumes an incredible half a million kilowatt-hours every day to handle its 200 million user requests, according to the New Yorker.
That is enough electricity to power an average US household for 46.5 years, or roughly 17,000 households for a single day. Put another way, ChatGPT uses more than 17,000 times as much electricity in a day as the average US household.
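As a rough sanity check, those figures can be reproduced with back-of-envelope arithmetic. This is only a sketch: the reported half-million kilowatt-hours comes from the article, but the roughly 10,500 kWh-per-year average US household consumption is an assumed figure (in line with US Energy Information Administration estimates), not something the article states.

```python
# Back-of-envelope check of the article's household comparisons.
# Assumption (not from the article): an average US household uses
# about 10,500 kWh per year, i.e. roughly 29 kWh per day.
CHATGPT_KWH_PER_DAY = 500_000               # reported daily consumption
HOUSEHOLD_KWH_PER_DAY = 10_500 / 365        # assumed household usage

# How many households one day of ChatGPT could power, and how many
# years that same energy would run a single household.
households_per_day = CHATGPT_KWH_PER_DAY / HOUSEHOLD_KWH_PER_DAY
years_for_one_household = households_per_day / 365

print(f"~{households_per_day:,.0f} households for one day")
print(f"~{years_for_one_household:.1f} years for one household")
```

With these assumptions the result lands near the article's numbers: on the order of 17,000 households for a day, or a single household for roughly 46 to 48 years, with the exact value depending on the household-consumption estimate used.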
And this is just a single AI chatbot, the first among a quickly growing number that see use every day.
The dangerous numbers
What makes these figures more alarming is that the wider adoption of AI technology could lead to an even bigger energy drain.
A study by Alex de Vries, a data scientist for the Dutch National Bank, published in the journal Joule, suggests that if Google integrated generative AI into every search, it could consume a mind-boggling 29 billion kilowatt-hours annually.
This surpasses the yearly energy consumption of entire countries like Kenya, Guatemala, and Croatia.
"AI is just very energy intensive," de Vries told Business Insider. "Every single AI server can already consume as much power as more than a dozen UK households combined."
Estimating the total energy consumption of the AI industry is challenging due to the varying operational needs of large models and the secrecy surrounding tech giants' energy usage, according to a report in The Verge.
However, de Vries, using data from chipmaker Nvidia, a leader in the AI boom, has come up with a projection.
By 2027, the entire AI sector could be using a staggering 85 to 134 terawatt-hours annually, potentially reaching half a percent of global electricity consumption. This is significant considering that major companies like Samsung use only a fraction of that amount to run their entire operations.
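The "half a percent" claim can likewise be checked with simple arithmetic. The sketch below assumes global electricity consumption of roughly 27,000 TWh per year, a ballpark figure consistent with recent International Energy Agency estimates; it does not come from the article.

```python
# Rough share of global electricity implied by de Vries' 2027 projection.
# Assumption (not from the article): global electricity consumption of
# about 27,000 TWh per year.
AI_SECTOR_TWH_LOW = 85      # projected lower bound for the AI sector
AI_SECTOR_TWH_HIGH = 134    # projected upper bound for the AI sector
GLOBAL_TWH = 27_000         # assumed global annual consumption

low_share = AI_SECTOR_TWH_LOW / GLOBAL_TWH * 100
high_share = AI_SECTOR_TWH_HIGH / GLOBAL_TWH * 100
print(f"{low_share:.2f}% to {high_share:.2f}% of global electricity")
```

Under that assumption the upper bound works out to roughly half a percent of global consumption, matching the projection quoted in the article.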
OpenAI has not yet commented on these reports.
The environmental impact of AI's energy needs is a growing concern. As AI development continues, addressing its energy consumption will be crucial to ensure a sustainable future.