ChatGPT's power consumption: ten times more than Google's
AI is resource-hungry. A study shows just how much: every ChatGPT request consumes ten times as much energy as a Google search.
(Image: photoschmidt/Shutterstock.com)
"Artificial intelligence's hunger for electricity continues to grow, putting a strain on power grids, increasing greenhouse gas emissions and exacerbating environmental problems," according to a study by BestBrokers, which took a closer look at ChatGPT's energy consumption. A single request to the AI chatbot is said to consume 2.9 watt hours – ten times the power consumption of a regular Google search query, which uses around 0.3 watt hours.
Over a whole year, ChatGPT currently requires 226.82 million kilowatt hours – just to answer user queries. According to the researchers, this results in electricity costs of around 30 million US dollars. The calculation assumes 100 million active users per week, each making around 15 requests per week – hardly a high number of requests. The per-request figure comes from the Electric Power Research Institute (EPRI), which has calculated that one request requires 0.0029 kilowatt hours, i.e. 2.9 watt hours.
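The annual figure can be reproduced from the study's own assumptions. A minimal sketch, assuming an average US electricity rate of roughly 0.132 US dollars per kWh (inferred from the stated 30 million dollar cost, not a figure given in the article):

```python
# Back-of-the-envelope check of the study's annual figures.
WH_PER_REQUEST = 2.9          # watt hours per ChatGPT request (EPRI figure)
WEEKLY_USERS = 100_000_000    # assumed active users per week
REQUESTS_PER_USER = 15        # assumed requests per user per week
USD_PER_KWH = 0.132           # assumed electricity rate (not stated in the article)

requests_per_year = WEEKLY_USERS * REQUESTS_PER_USER * (365 / 7)
annual_kwh = requests_per_year * WH_PER_REQUEST / 1000

print(f"{annual_kwh / 1e6:.2f} million kWh per year")                    # 226.82
print(f"~${annual_kwh * USD_PER_KWH / 1e6:.0f} million in electricity")  # ~$30 million
```

With these inputs the result matches the study's 226.82 million kWh and roughly 30 million US dollars.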
ChatGPT compared to streaming, cell phones and countries
This also means that the number of daily requests is estimated at around 214 million, which in turn require more than 620,000 kilowatt hours of energy per day – around 22,000 times as much as an average US household consumes (around 29 kWh per day). According to the study, this also means: "With the energy consumed by ChatGPT in one year, around 3.13 million electric vehicles with an average battery capacity of 72.4 kWh could be fully charged. That's almost 95% of all electric cars on the road in the US by the end of 2023." The authors make further comparisons: according to them, ChatGPT's annual consumption is equivalent to around 140,000 hours of video streaming and to charging almost 50 million smartphones every day for an entire year. Last but not least: "ChatGPT's energy consumption for processing requests exceeds the total electricity consumption of twelve small countries and territories, including Gibraltar, Grenada, Dominica, Samoa and the British Virgin Islands. It could even power the whole of Finland or Belgium for a whole day."
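The daily and per-comparison numbers follow from the same per-request figure. A sketch using the values quoted above (note that the household multiple works out to roughly 21,400 with these inputs, which the study rounds to about 22,000):

```python
# Daily load and comparison figures derived from the study's inputs.
WH_PER_REQUEST = 2.9
ANNUAL_KWH = 226_820_000        # study's annual total, in kWh
HOUSEHOLD_KWH_PER_DAY = 29      # average US household (from the study)
EV_BATTERY_KWH = 72.4           # average EV battery capacity (from the study)

daily_requests = 100_000_000 * 15 / 7             # ~214 million requests per day
daily_kwh = daily_requests * WH_PER_REQUEST / 1000
households = daily_kwh / HOUSEHOLD_KWH_PER_DAY    # ~21,400 households' daily use
ev_charges = ANNUAL_KWH / EV_BATTERY_KWH          # ~3.13 million full EV charges

print(f"{daily_requests / 1e6:.0f} million requests, {daily_kwh:,.0f} kWh per day")
print(f"{ev_charges / 1e6:.2f} million EV charges per year")
```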
The authors of the study are well aware that requests to ChatGPT can vary greatly in length, and therefore in energy cost. They argue that rough estimates of consumption are nevertheless possible. Features such as ChatGPT's new voice mode were not taken into account and could increase consumption significantly.
(Image: Best Brokers)
In addition to the usage costs, there are the costs of developing the AI models and services in the first place. BestBrokers assumes that training GPT-4 took 100 days and consumed 62,318,800 kWh. That corresponds to costs of 8.2 million US dollars – for energy consumption alone. Handelsblatt and other sources estimate that developing GPT-4 cost a total of 78 million US dollars, while OpenAI CEO Sam Altman, according to Wired, even spoke of more than 100 million US dollars.
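The training-cost arithmetic can be checked the same way; as above, the electricity rate of roughly 0.132 US dollars per kWh is an assumption inferred from the stated 8.2 million dollar figure, not a number from the article:

```python
# Checking the study's GPT-4 training-energy cost estimate.
TRAINING_KWH = 62_318_800   # estimated GPT-4 training consumption (from the study)
TRAINING_DAYS = 100         # assumed training duration (from the study)
USD_PER_KWH = 0.132         # assumed electricity rate (not stated in the article)

energy_cost = TRAINING_KWH * USD_PER_KWH
mwh_per_day = TRAINING_KWH / TRAINING_DAYS / 1000

print(f"~${energy_cost / 1e6:.1f} million for electricity alone")  # ~$8.2 million
print(f"~{mwh_per_day:.0f} MWh per training day")                  # ~623 MWh
```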
According to a calculation by Stanford University, the most expensive model to date is Google's Gemini Ultra, whose development is said to have cost 191.4 million US dollars. Meta's Llama 2, by contrast, came in at a comparatively low 3.9 million US dollars.
(emw)