Ooh! This is interesting. I still need to read his whole piece. I looked at the graph, and there in the middle is "asking ChatGPT a question." Does he say how he came up with the data for this statistic? Because I was reading somewhere recently (and I wish I knew where) that being polite to AI is a huge waste of energy; adding three words like "Would you please..." to the beginning of a question takes 50 ml more water (was it really that much?) than leaving them off. So, though he quantifies some other statistics with measurements, the question asked of ChatGPT is unquantified.
In addition, his chart is in units of one. If you're comparing "asking ChatGPT a question" to "10 min 4k video," there are simply a lot of differences in metrics. I mean, how many questions does this person ask ChatGPT in 10 minutes? I can type fast! Shouldn't they be compared that way? And then there's the water usage of generative AI (asking for a composition) versus simply asking for data. These require vastly different amounts of water, yet he doesn't quantify them in the chart.
But I'll search for that article I read that I thought had good quantitative measurements on water for input & output. Or maybe there's better information on it somewhere. I like reliable references to primary sources for statistical data, so I can check up on how others are using their statistics; if that can be found, it would be awesome.
I'm not sure if this is the same one, but I stopped saying "please" to ChatGPT after I read this a few months back: https://www.nytimes.com/2025/04/24/technology/chatgpt-alexa-please-thank-you.html
It's a hard habit to break, LOL!
I don't think it was that one, as I do believe that the one I read had some metrics. On the other hand, the likelihood that it was in the NYTimes is high, so it could have been that one. However, reading your NYT article (and having thought for a while) sent me down a rabbit hole that made me realize what I dislike about the original article posted. Namely, it's an opinion piece posing as fact: the author is trying to make me believe his opinion is a fact. As humans, we generally enjoy reading articles that reinforce our own opinions. While writing opinion pieces can be pleasurable and give us new ideas, I do not think that it is appropriate to use an opinion article to discuss the *facts* of the matter. It IS appropriate to use opinion articles to discuss opinions.
If you're saying that the original piece (by Masley) is an "opinion" piece, I'm happy to go along with it — I try not to debate definitions — but I'm not sure what I should take from that. Are you saying that (say) his numbers are incorrect, or that (say) he's making apples-to-oranges comparisons? I'm also happy to grant that his essay is imperfect, because (of course) every essay is. But then could you perhaps point to another essay (or source) that you think has better data (especially if they push in the opposite direction)?
Yes and yes. It is an opinion piece: "Why using ChatGPT is not bad for the environment." I mean, that's his opinion. There are too many factors that need to be defined. A purely factual piece would be titled something like, "Energy use by ChatGPT and its effects on the environment." As for apples and oranges, yes. I'm not saying his numbers are incorrect, but I am saying it's important to know what it is that I'm asking ChatGPT to do, as different types of tasks require vastly different amounts of energy: https://www.bernstein.com/our-insights/insights/2023/articles/ais-next-big-challenge.html For this reason, I don't think this is quite the best article to use for a fact-based discussion.

If you want MY opinion, which is TOTALLY my opinion, I really can't understand why people have gotten so obsessed over the energy use of this particular search engine. There are SO many ways that we waste energy, and I find energy waste achingly painful. When we look at the big picture, energy use due to AI is expected to rise 20% by 2030. https://www.cnbc.com/2024/05/05/ai-could-drive-natural-gas-boom-as-utilities-face-surging-electric-demand.html
This is a huge amount! (investor's note: invest in natural gas!) (environmentalist's note: protest hydro-fracking!) (human note: aaargh!) I think that the main reason why we, as a society, are so worried about the energy use by AI search engines is that so many of us are unaware of the AI integrated into virtually every aspect of our modernized society. It's a chore just to figure out how to turn off the AI search on Google. AI is absolutely everywhere, and we cannot avoid it. AI may or may not be utilized in other tasks on the chart. How do we even know when we are using AI?
Does this 20% increase in energy use disturb me? Undoubtedly. Do I think I can change it by refusing to use AI search engines? Not personally, but I do my best anyhow. I think there are better things worth arguing about. I mean, why is cryptocurrency even legal? (argumentative tactic: changing the focus of the argument) Yikes! Talk about massive energy use that provides absolutely no social benefit. AI search engines can both stand in as psychologists and perform mundane tasks of aligning criteria between diverse topics. Like prescription drugs, social media, guns and hammers, AI is a controversial tool that can be overused, misused, or used in an *appropriate* manner to achieve a goal.

IMO, what we need to think about as a society (if we can think as a society) is: How can we most efficiently use this new tool to achieve our goals? When should it be applied? When should it be banned? I have the impression that's probably part of what you're trying to figure out, in terms of education. It's a necessary question.
A back-of-the-envelope calculation to give another estimate of LLM consumption, including training:
Let's assume the H100 GPU represents most of the AI companies' compute.
Each has a max TDP of 700 watts, so (× 8,760 hours in a year) about 6 million watt-hours per year. Times ~2 million H100s sold worldwide in 2024 → about 12,000 GWh.
(This assumes the GPUs are used at 100% all the time, either for inference or for training new models.)
OpenAI has 500 million "users" (I couldn't find good sources for this, but that number keeps coming up). It's pretty likely that most people using LLMs have made an OpenAI account at some point.
So 12,000 GWh divided by 500 million LLM users gives 24,000 watt-hours per user.
That's about equal to running a kitchen oven for 15 hours. Which is much more than I expected, but is still negligible compared to ordering on Amazon, eating meat, or driving a car.
LLMs have plenty of problems; being bad for the environment isn't one of them (yet).
I'll consider limiting my LLM use (which I do enjoy) once we've considered stopping those much worse things first.
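For anyone who wants to poke at the arithmetic, here's the same estimate as a small script. The inputs are just the rough assumptions above (700 W TDP, 8,760 hours, ~2 million H100s, 500 million users, plus a ~1,600 W oven I picked for the comparison), so treat it as a sketch, not a measurement:

```python
# Back-of-the-envelope estimate of LLM energy use per user,
# using the rough assumptions stated in the comment above.

H100_TDP_WATTS = 700          # max TDP per H100 GPU
HOURS_PER_YEAR = 8760         # 24 * 365
H100_UNITS_SOLD = 2_000_000   # rough 2024 worldwide figure (assumption)
LLM_USERS = 500_000_000       # rough OpenAI user count (assumption)
OVEN_WATTS = 1600             # typical kitchen oven draw (assumption)

# Energy per GPU per year, assuming 100% utilization.
wh_per_gpu_year = H100_TDP_WATTS * HOURS_PER_YEAR            # ~6.1 MWh

# Total fleet energy per year, in GWh.
total_gwh = wh_per_gpu_year * H100_UNITS_SOLD / 1e9          # ~12,000 GWh

# Per-user share, in Wh.
wh_per_user = wh_per_gpu_year * H100_UNITS_SOLD / LLM_USERS  # ~24,500 Wh

print(f"Per GPU per year: {wh_per_gpu_year / 1e6:.1f} MWh")
print(f"Fleet total:      {total_gwh:,.0f} GWh")
print(f"Per user:         {wh_per_user / 1000:.1f} kWh "
      f"(~{wh_per_user / OVEN_WATTS:.0f} hours of oven use)")
```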
Another good source, mostly aligned with Masley's perspective: https://www.sustainabilitybynumbers.com/p/ai-energy-demand
Late to this one. Here's a thought.
If I want to cut a tree down, chop it into pieces, and build a bonfire as a form of evening leisure, I could reasonably say "there are 3 trillion trees on earth! Each time I do this I'm cutting down about 0.00000000003% of the trees on earth, so my actions are totally inconsequential."
If every human wants to do exactly that every night, all of the trees on earth will be gone in a little over a year.
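A quick sanity check on that claim, assuming roughly 8 billion people each burning one tree per night:

```python
# ~3 trillion trees, ~8 billion people, one tree each per night (assumptions).
TREES = 3e12
PEOPLE = 8e9  # rough world population

nights = TREES / PEOPLE
print(f"{nights:.0f} nights = {nights / 365:.2f} years")  # 375 nights = 1.03 years
```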
To me, quantifying an individual's energy use per query or whatever is a distraction from the fact that this technology has become very popular very quickly, and the cumulative impact of that popularity has had, and will continue to have, a very large effect on our energy use as a species.
And to me, this is different from, e.g., using water for indoor plumbing (a clear public health and quality-of-life benefit) or using energy to do any number of other things. We could get rid of LLMs tomorrow and it wouldn't have a large effect on the well-being of the vast majority of humans on earth. I can't think of another technology with a similar total energy / total benefit to humans ratio.
I am honestly not very concerned about per-query effects. I suspect (though admittedly I need to dig up better data) that the biggest "expense" is in building the physical infrastructure and training the actual models (including all the ones created but not actually deployed, many versions of each, etc.). Ignoring this, sure, queries may not be that big a deal. But that's a huge piece to ignore (and, as with many of the world's biggest problems, not something a single individual's behavior will affect much at this point).
The proof of the pudding is in the eating, and the corporations platforming the AI pudding we are eating, not just OpenAI but Google and Microsoft, are reporting a huge impact on their ability to reach their sustainability goals because of AI. Google and Microsoft were on pace to reach their 2030 net-zero emissions goals and now they're not, and they both cite AI as the main culprit. Could they be lying or exaggerating? It's possible.
This seems like a weird take to me, because asking the question is supposed to be the least energy-intensive part, as far as I knew. I thought the most energy-intensive part was training the AI before it ever gets to questions. So it seems like it becomes more energy-efficient the more questions we ask after training.
No citation, sorry. I'd be curious if someone has the numbers.