OpenAI will not disclose GPT-5's energy use. It could be higher than past models
dr tech on 10 Aug 25: "'A more complex model like GPT-5 consumes more power both during training and during inference. It's also targeted at long thinking … I can safely say that it's going to consume a lot more power than GPT-4,' said Rakesh Kumar, a professor at the University of Illinois, currently working on the energy consumption of computation and AI models. The day GPT-5 was released, researchers at the University of Rhode Island's AI lab found that the model can use up to 40 watt-hours of electricity to generate a medium-length response of about 1,000 tokens, which are the building blocks of text for an AI model and are approximately equivalent to words."
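For scale, here is a minimal back-of-envelope sketch of what the quoted 40 Wh figure implies. The only sourced number is the 40 Wh per roughly 1,000-token response from the University of Rhode Island estimate; the daily query volume in the sketch is a purely illustrative assumption, not a reported figure.

```python
# Back-of-envelope scaling of the quoted 40 Wh-per-response estimate.
# Sourced: up to 40 Wh for a ~1,000-token response (University of Rhode Island AI lab).
# Assumed for illustration only: the daily response volume below.

WH_PER_RESPONSE = 40.0        # upper-bound estimate for a ~1,000-token reply
TOKENS_PER_RESPONSE = 1_000

wh_per_token = WH_PER_RESPONSE / TOKENS_PER_RESPONSE
print(f"Energy per token: {wh_per_token:.3f} Wh (~{wh_per_token * 3600:.0f} J)")

# Hypothetical daily volume, purely to show how the number scales.
DAILY_RESPONSES = 100_000_000
daily_mwh = WH_PER_RESPONSE * DAILY_RESPONSES / 1e6   # Wh -> MWh
print(f"At {DAILY_RESPONSES:,} such responses/day: {daily_mwh:,.0f} MWh/day")
```

Under those assumptions the 40 Wh upper bound works out to about 0.04 Wh (roughly 144 J) per token, and around 4,000 MWh per day at the illustrative volume, which is why the per-response figure attracts so much attention.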