Energy and AI


54 bookmarks
A cheat sheet for why using ChatGPT is not bad for the environment
I think a lot of people don’t realize how much water we each use every day. Almost all electricity generation involves heating water to create steam to spin a turbine.
When I hear people say “50 ChatGPT searches use a whole bottle of water!” I think they’re internally comparing this to the few times a year they buy a bottle of water. That makes ChatGPT’s water use seem like a lot. They’re not comparing it to the 1200 bottles of water they use every single day in their ordinary lives.
Each ChatGPT prompt uses between 10 and 25 mL of water if you include the water cost of training, the water cost of generating the electricity used, and the water used by the data center to cool the equipment. This means that every single day, the average American uses enough water for 24,000-61,000 ChatGPT prompts.
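The arithmetic behind this claim is easy to check. A minimal sketch, assuming half-liter bottles (so the "1200 bottles a day" works out to roughly 600 L of daily water use; the article's exact figures may use a slightly different daily total):

```python
# Sanity-check: how many ChatGPT prompts equal one day's ordinary water use?
# Assumptions from the excerpt: 10-25 mL of water per prompt (training,
# electricity, and cooling included), and ~1200 x 0.5 L bottles per day.
DAILY_WATER_L = 1200 * 0.5          # ~600 L per American per day
ML_PER_PROMPT = (25, 10)            # high and low per-prompt estimates, in mL

for ml in ML_PER_PROMPT:
    prompts = DAILY_WATER_L * 1000 / ml   # L -> mL, divided by mL per prompt
    print(f"At {ml} mL/prompt: {prompts:,.0f} prompts per day of water")
# Yields 24,000 and 60,000 -- close to the article's 24,000-61,000 range.
```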
ChatGPT and other AI chatbots are extremely, extremely small parts of AI's energy demand. Even if everyone stopped using all AI chatbots, AI's energy demand wouldn't change in a noticeable way at all. The data implies that all chatbots combined are using at most 1-3% of the energy used on AI.
I have a similar reaction to the 10x a Google search point. When someone says “ChatGPT uses 10x as much energy as a Google search” I’m sometimes tempted to just say “Yes… 10 Google searches.” and just let that hang. Imagine going back to 2020 and saying “Oh man, I thought my buddy cared about the climate, but I just found out he… oh man I can’t bring myself to say it… he searched Google TEN times today.”
·andymasley.substack.com·
Explained: Generative AI’s environmental impact
Researchers have estimated that a ChatGPT query consumes about five times more electricity than a simple web search.
“What is different about generative AI is the power density it requires. Fundamentally, it is just computing, but a generative AI training cluster might consume seven or eight times more energy than a typical computing workload,” says Noman Bashir, lead author of the impact paper, who is a Computing and Climate Impact Fellow at the MIT Climate and Sustainability Consortium (MCSC) and a postdoc in the Computer Science and Artificial Intelligence Laboratory (CSAIL).

Scientists have estimated that the power requirements of data centers in North America increased from 2,688 megawatts at the end of 2022 to 5,341 megawatts at the end of 2023, partly driven by the demands of generative AI. Globally, the electricity consumption of data centers rose to 460 terawatt-hours in 2022. That would have made data centers the 11th-largest electricity consumer in the world, between the nations of Saudi Arabia (371 terawatt-hours) and France (463 terawatt-hours), according to the Organization for Economic Co-operation and Development.
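The North American figures quoted above imply that data-center power requirements roughly doubled in a single year. A quick check:

```python
# Growth in North American data-center power requirements, per the excerpt:
# 2,688 MW at the end of 2022 vs 5,341 MW at the end of 2023.
start_mw, end_mw = 2_688, 5_341
growth = end_mw / start_mw
print(f"Year-over-year growth: {growth:.2f}x")  # ~1.99x, i.e. nearly doubled
```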
Chilled water is used to cool a data center by absorbing heat from computing equipment. It has been estimated that, for each kilowatt-hour of energy a data center consumes, it needs two liters of water for cooling.
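The two-liters-per-kilowatt-hour estimate scales up quickly. A minimal sketch, assuming (this load figure is an illustration, not from the article) a 100 MW facility running around the clock:

```python
# Rough cooling-water estimate using the ~2 L per kWh figure from the excerpt.
LITERS_PER_KWH = 2.0

def daily_cooling_water_liters(load_mw: float) -> float:
    """Liters of cooling water per day for a data center at constant load."""
    kwh_per_day = load_mw * 1000 * 24   # MW -> kW, times 24 hours
    return kwh_per_day * LITERS_PER_KWH

# Hypothetical 100 MW data center: 4.8 million liters of water per day.
print(f"{daily_cooling_water_liters(100):,.0f} L/day")
```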
·news.mit.edu·
The multifaceted challenge of powering AI
And data centers do consume huge amounts of electricity. U.S. data centers consumed more than 4 percent of the country’s total electricity in 2023, and by 2030 that fraction could rise to 9 percent, according to the Electric Power Research Institute. A single large data center can consume as much electricity as 50,000 homes.
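The "50,000 homes" comparison can be translated into a continuous load. A back-of-envelope sketch, assuming (this per-home figure is my assumption, not from the article) roughly 10,800 kWh per year for an average U.S. home:

```python
# What continuous load equals the electricity use of 50,000 homes?
KWH_PER_HOME_YEAR = 10_800   # assumed average U.S. household consumption
HOMES = 50_000
HOURS_PER_YEAR = 8_760

total_kwh_per_year = KWH_PER_HOME_YEAR * HOMES          # 540 million kWh/year
avg_load_mw = total_kwh_per_year / HOURS_PER_YEAR / 1000  # kW -> MW
print(f"~{avg_load_mw:.0f} MW average load")            # on the order of 60 MW
```

So "a data center the size of 50,000 homes" corresponds to a facility drawing a few tens of megawatts continuously, which is consistent with the scale of the megawatt figures quoted in the MIT excerpts above.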
Google recently ordered a fleet of small modular reactors (SMRs) to generate the power needed by its data centers. The first one will be completed by 2030 and the remainder by 2035.
·news.mit.edu·