ChatGPT AI Emits Metric Tons of Carbon, Stanford Report Says
A new report released today by the Stanford Institute for Human-Centered Artificial Intelligence estimates that the amount of energy needed to train an AI model like OpenAI’s GPT-3, which powers the world-famous ChatGPT, could power an average American’s home for hundreds of years. Of the three AI models reviewed in the research, OpenAI’s system was by far the most energy-hungry.
“If we’re just scaling without any regard to the environmental impacts, we can get ourselves into a situation where we are doing more harm than good with machine learning models,” Stanford researcher Peter Henderson said last year. “We really want to mitigate that as much as possible and bring net social good.”
OpenAI’s model reportedly released 502 metric tons of carbon during its training. To put that in perspective, that’s 1.4 times as much carbon as DeepMind’s Gopher and a whopping 20.1 times as much as BLOOM. GPT-3 also consumed the most power of the lot, at 1,287 megawatt-hours (MWh).
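As a rough sanity check, here is a minimal Python sketch that reproduces those multiples from the reported 502-ton figure. The Gopher and BLOOM totals are assumptions backed out of the article’s own ratios rather than numbers taken from the report, and the average-household electricity baseline is likewise an outside assumption:

```python
# Back-of-the-envelope check of the training footprints cited above.
EMISSIONS_TONNES = {
    "GPT-3": 502,   # metric tons of CO2 reported for training
    "Gopher": 352,  # assumption: backed out of the 1.4x ratio above
    "BLOOM": 25,    # assumption: backed out of the 20.1x ratio above
}

GPT3_TRAINING_MWH = 1_287        # GPT-3 power consumption, as reported
AVG_US_HOME_MWH_PER_YEAR = 10.6  # assumption: rough US household average

for model, tonnes in EMISSIONS_TONNES.items():
    if model != "GPT-3":
        ratio = EMISSIONS_TONNES["GPT-3"] / tonnes
        print(f"GPT-3 emitted {ratio:.1f}x as much CO2 as {model}")

# How long the training energy could run one home; the answer is
# sensitive to the household baseline assumed above.
home_years = GPT3_TRAINING_MWH / AVG_US_HOME_MWH_PER_YEAR
print(f"GPT-3's training energy could power one home for ~{home_years:.0f} years")
```

Run as written, this prints the 1.4x and 20.1x multiples quoted above; the home-years figure moves considerably depending on which household consumption estimate you plug in.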
If all of this sounds familiar, it’s because we basically saw this same environmental dynamic play out several years ago with tech’s last big obsession: crypto and web3. In that case, Bitcoin emerged as the industry’s obvious environmental sore spot due to the vast amounts of energy needed to mine coins under its proof-of-work model. Some estimates suggest Bitcoin mining alone consumes more electricity every year than the entire country of Norway.
Years of criticism from environmental activists, however, led the crypto industry to make some changes. Ethereum, the second-largest cryptocurrency, officially switched last year to a proof-of-stake model, which supporters claim could reduce its power usage by over 99%. Other, smaller coins were similarly designed with energy efficiency in mind. In the grand scheme of things, large language models are still in their infancy, and it’s far from certain how their environmental report card will play out.