ChatGPT AI Emits Metric Tons of Carbon, Stanford Report Says
-
A new report released today by the Stanford Institute for Human-Centered Artificial Intelligence estimates that the energy needed to train AI models like OpenAI’s GPT-3, which powers the world-famous ChatGPT, could power an average American home for hundreds of years. Of the three AI models reviewed in the research, OpenAI’s system was by far the most energy-hungry.
-
OpenAI’s model reportedly released 502 metric tons of carbon during its training. To put that in perspective, that’s 1.4 times more carbon than Gopher and a whopping 20.1 times more than BLOOM. GPT-3 also consumed the most electricity of the three, at 1,287 MWh.
-
“If we’re just scaling without any regard to the environmental impacts, we can get ourselves into a situation where we are doing more harm than good with machine learning models,” Stanford researcher Peter Henderson said last year. “We really want to mitigate that as much as possible and bring net social good.”