ChatGPT, OpenAI's chatbot platform, may not be as power-hungry as once assumed. But its appetite depends largely on how ChatGPT is being used, and on the AI models answering the queries, according to a new study.
A recent analysis by Epoch AI, a nonprofit AI research institute, attempted to calculate how much energy a typical ChatGPT query consumes. A commonly cited stat is that ChatGPT requires around 3 watt-hours of power to answer a single question, or 10 times as much as a Google search.
Epoch believes that's an overestimate.
Using OpenAI's latest default model for ChatGPT, GPT-4o, as a reference, Epoch found that the average ChatGPT query consumes around 0.3 watt-hours, less than many household appliances.
"The energy use is really not a big deal compared to using normal appliances, heating or cooling your home, or driving a car," said Joshua You, the data analyst at Epoch who conducted the analysis.
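For a rough sense of scale, the two per-query estimates lend themselves to some napkin math. The figures below come from the article itself; the daily query count is an assumed, illustrative number, not something Epoch reported.

```python
# Napkin math comparing the two per-query energy estimates discussed in
# the article. The daily query count is an assumption for illustration.

COMMONLY_CITED_WH_PER_QUERY = 3.0  # older, widely repeated estimate
EPOCH_WH_PER_QUERY = 0.3           # Epoch AI's GPT-4o-based estimate

QUERIES_PER_DAY = 15               # assumed: a fairly heavy individual user


def annual_kwh(wh_per_query: float, queries_per_day: int) -> float:
    """Annual energy use in kilowatt-hours for a given per-query cost."""
    return wh_per_query * queries_per_day * 365 / 1000


old_estimate = annual_kwh(COMMONLY_CITED_WH_PER_QUERY, QUERIES_PER_DAY)
epoch_estimate = annual_kwh(EPOCH_WH_PER_QUERY, QUERIES_PER_DAY)

print(f"Old estimate:   {old_estimate:.1f} kWh/year")
print(f"Epoch estimate: {epoch_estimate:.1f} kWh/year")
```

Even under the older 3 Wh figure, a heavy user's annual total lands in the tens of kilowatt-hours, which is why You compares it favorably against home heating, cooling, and driving.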
AI's energy usage, and its environmental impact more broadly, is the subject of contentious debate as AI companies look to rapidly expand their infrastructure footprints. Just last week, a group of over 100 organizations published an open letter calling on the AI industry and regulators to ensure that new AI data centers don't deplete natural resources and force utilities to rely on nonrenewable sources of energy.
You told TechCrunch that his analysis was spurred by what he characterized as outdated previous research. For example, You pointed out that the author of the report that arrived at the 3-watt-hours estimate assumed OpenAI used older, less efficient chips to run its models.
"I've seen a lot of public discourse that correctly recognized that AI was going to consume a lot of energy in the coming years, but didn't really accurately describe the energy that was going to AI today," You said. "Also, some of my colleagues noticed that the most widely reported estimate of 3 watt-hours per query was based on fairly old research, and based on some napkin math, seemed to be too high."
Granted, Epoch's 0.3 watt-hours figure is an approximation as well; OpenAI hasn't published the details needed to make a precise calculation.
The analysis also doesn't consider the additional energy costs incurred by ChatGPT features like image generation or input processing. You acknowledged that "long input" ChatGPT queries (queries with long files attached, for instance) likely consume more electricity than a typical question.
You said he does expect baseline ChatGPT power consumption to rise, however.
"[The] AI will get more advanced, training this AI will probably require much more energy, and this future AI may be used much more intensively, handling many more tasks, and more complex tasks, than how people use ChatGPT today," You said.
While there have been remarkable breakthroughs in AI efficiency in recent months, the scale at which AI is being deployed is expected to drive an enormous, power-hungry infrastructure expansion. In the next two years, AI data centers may need close to all of California's 2022 power capacity (68 GW), according to a RAND report. By 2030, the report predicted, training a frontier model could demand power output equivalent to that of eight nuclear reactors (8 GW).
ChatGPT alone reaches an enormous, and expanding, number of people, making its server demands similarly massive. OpenAI, along with several investment partners, plans to spend billions of dollars on new AI data center projects over the next few years.
OpenAI's attention, along with the rest of the AI industry's, is also shifting to reasoning models, which are generally more capable in terms of the tasks they can accomplish but require more computing to run. As opposed to models like GPT-4o, which respond to queries nearly instantaneously, reasoning models "think" for seconds to minutes before answering, a process that consumes more computing, and thus more power.
"Reasoning models will increasingly take on tasks that older models can't, and generate more [data] to do so, and both require more data centers," You said.
OpenAI has begun to release more power-efficient reasoning models like o3-mini. But it seems unlikely, at least at this juncture, that the efficiency gains will offset the increased power demands from reasoning models' "thinking" process and growing AI usage around the world.
You suggested that people worried about their AI energy footprint use apps like ChatGPT infrequently, or select models that minimize the computing necessary, to the extent that's realistic.
"You could try using smaller AI models like [OpenAI's] GPT-4o-mini," You said, "and sparingly use them in a way that requires processing or generating a ton of data."