
The success of chatbots based on large language models has driven their energy requirements up to levels comparable to the consumption of entire cities and countries: ChatGPT's energy consumption for one year is equivalent to supplying Madrid for more than seven months.
Chatbots based on large language models (LLMs) have revolutionized the world of AI, making it accessible to anyone with a smartphone or an internet connection, whether for answering questions or performing everyday tasks, from writing text and generating images to solving complex mathematical problems.
What started as a simple question-answering tool has evolved into advanced AI systems capable of reasoning, coding, and even decision-making. OpenAI's assistant ChatGPT is one of the industry leaders, with more than 700 million weekly active users according to the company's figures from September, a number that has since risen to nearly 900 million, according to The Information.
This makes the chatbot one of the most popular applications worldwide, and it therefore receives millions of requests every day, which is reflected in an energy consumption that can rival that of entire cities, according to the analysis and trading portal BestBrokers.
This consumption is driven by the progress of the models: as their performance increases, their energy demand grows at a similar rate. This puts heavy pressure on power grids and raises concerns about the environmental cost of the technology.
Although it is difficult to estimate how much energy a model's inference process requires, since it varies with the length of the query, the execution efficiency, and the number of users, research from the University of Rhode Island's AI lab found that ChatGPT-5 consumes between 2 and 45 watt-hours per average-length message.
This corresponds to an average consumption of around 18.9 watt-hours, already more than fifty times the energy of a typical Google search, according to a study by the Electric Power Research Institute in the US cited by the firm.
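The "more than fifty times" claim can be sanity-checked with back-of-the-envelope arithmetic. The 18.9 Wh average comes from the article; the ~0.3 Wh per Google search is a commonly cited estimate and is an assumption here, not a figure from the report:

```python
# Quick check of the per-message comparison cited above.
avg_message_wh = 18.9    # average energy per ChatGPT message (from the article), in watt-hours
google_search_wh = 0.3   # assumed energy per Google search, in watt-hours (not from the report)

ratio = avg_message_wh / google_search_wh
print(f"One message is roughly {ratio:.0f}x a typical Google search")
```

Under that assumption the ratio works out to about 63, consistent with "more than fifty times."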
However, its energy consumption goes even further, overshadowing the electricity needs of some European countries and cities. The firm has put a specific figure on this: ChatGPT alone consumes around 17.3 terawatt-hours of electricity per year.
Compared to Spain, where the average household consumes about 29 kWh per day, the energy used by ChatGPT could supply the country's citizens for 23 days and 6 hours. Likewise, compared to the consumption of Madrid, it could supply the city for around 7 months and 19 days.
The same applies to other major cities: Paris (France) could be supplied for more than a month, London (United Kingdom) for more than five months, and Berlin (Germany) for about one year and five months. In cities such as Copenhagen (Denmark) and Amsterdam (Netherlands), ChatGPT's consumption corresponds to more than four years of supply.
At the national level, this consumption could supply the United Kingdom for almost 20 days, Germany for 12 days, and France for 13 days, according to BestBrokers' calculations.
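All of these comparisons follow the same arithmetic: divide ChatGPT's annual consumption by the daily electricity demand of the city or country in question. A minimal sketch, in which the UK daily-demand figure is an illustrative assumption rather than a number from the report:

```python
def days_of_supply(annual_chatgpt_twh: float, daily_demand_gwh: float) -> float:
    """How many days ChatGPT's annual energy budget could cover a given daily demand."""
    return annual_chatgpt_twh * 1000 / daily_demand_gwh  # 1 TWh = 1000 GWh

# 17.3 TWh is the annual figure cited in the article; ~870 GWh/day for the UK
# is an assumed round number used only to illustrate the calculation.
print(f"{days_of_supply(17.3, 870):.1f} days")
```

With that assumed demand, the result lands near 20 days, the same order of magnitude as the figure reported for the United Kingdom.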
To all this must be added that, in order to keep the model running continuously, OpenAI faces an estimated annual energy cost of $2.42 billion at current commercial electricity tariffs. All of this highlights the significant resource requirements of deploying AI at scale.
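The $2.42 billion estimate is consistent with the 17.3 TWh annual consumption cited earlier: dividing one by the other implies a tariff of roughly $0.14 per kWh, a plausible commercial rate. A quick sanity check:

```python
annual_twh = 17.3            # annual consumption from the article
annual_cost_usd = 2.42e9     # estimated annual energy cost from the article

annual_kwh = annual_twh * 1e9                 # 1 TWh = 1e9 kWh
implied_tariff = annual_cost_usd / annual_kwh # implied price per kWh
print(f"Implied tariff: ${implied_tariff:.3f}/kWh")
```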
Alan Goldberg, a data analyst at BestBrokers, estimates: “Daily operations consume electricity on a scale that rivals that of entire nations, while the computational effort required for training and reasoning reaches tens of gigawatt hours.”
He also emphasized that improvements in model efficiency are offset by the constant increase in global use and the “continuous expansion of the model parameters”.
Against this background, he called for “strict transparency” and “enforceable optimization standards”. Without them, the rapid acceleration of AI “will impose significant burdens on infrastructure and the environment and raise urgent questions about the sustainability of this technological development,” said Goldberg.