This isn’t about the quality of AI or its use, but the environmental impact. In case it needs to move to another section.
Publishing guru Jane Friedman asks the question, What is the environmental cost of AI?
She links to a Substack that looks only at the cost of ChatGPT. That blogger, Andy Masely, links to an MIT article and another blogger (who seems to have more bona fides than he does).
It’s…a lot to slog through and I find myself drawn down this rabbit hole even though I should be editing my WIP.
Andy Masely’s post: “Why using ChatGPT is not bad for the environment - a cheat sheet”
Jane Friedman describes him as a “former physics teacher.” I couldn’t find any other description that would give him bona fides on this topic. This is a second post; he links to a longer post, but I didn’t go there.
This post has a footnote that pretty much lost it for me (trusting his argument):
To be 100% clear, the broader climate, energy, and water impacts of AI are very real and worth worrying about. Some readers have jumped from my title to say “He thinks AI isn’t an environmental problem? This is propaganda. AI is a massive growing part of our energy grid.” This post is not meant to debunk climate concerns about AI. It’s only meant to debunk climate concerns about chatbot use (and, as I note in the intro, image generation).

I don’t think that should be a footnote.
He’s looking at our individual footprint of a ChatGPT prompt. How bad can it be when it’s a fraction of the energy an individual uses overall? So don’t sweat it, he concludes. There are bigger things to worry about.
I always get my hackles up when people say things like that. I worked for a trail conservation organization and we tried to convince hikers that cutting corners, going off trail, is a bad thing. So to me Masely’s argument is like saying, well, the square inches of my boot soles are X and displace Y amount of dirt with each step, which really is nothing compared to a one-hour downpour that would erode Z amount of dirt. Fuck that. My argument is, sure, your boots may not cause damage (though in some cases, one step onto an endangered plant could render it extinct), but 7 million people visit the White Mountains every year. What if 7 million people did what you did? Honestly, to me, this is the whole problem of humanity, or at least Americans.
Masely mentions MIT Review’s article, “We did the math on AI’s energy footprint. Here’s the story you haven’t heard.”
Bottom line (literally):
When you ask an AI model to write you a joke or generate a video of a puppy, that query comes with a small but measurable energy toll and an associated amount of emissions spewed into the atmosphere. Given that each individual request often uses less energy than running a kitchen appliance for a few moments, it may seem insignificant.
But as more of us turn to AI tools, these impacts start to add up. And increasingly, you don’t need to go looking to use AI: It’s being integrated into every corner of our digital lives.
Crucially, there’s a lot we don’t know; tech giants are largely keeping quiet about the details. But to judge from our estimates, it’s clear that AI is a force reshaping not just technology but the power grid and the world around us.
Masely follows up with “Reactions to MIT Technology Review's report on AI and the environment” and disagrees with the above conclusion.
Basically, he wants to separate one kind of AI from another—text prompts and images use less energy than a video, so skip the video and you can feel good about AI. He says, “All my posts have been about why your individual chatbot use is not harming the climate.” Is he aware of how many people there are on the planet and how many are using AI? (Whether they know it or not.)
I mean, sure, there are bigger problems facing the climate, but to dismiss AI use by an individual because it’s a drop in the bucket is not helping. Every asshole on the highway thinks it’s just fine to go 20 miles over the speed limit. They don’t care that they’re wasting gas (except those Tesla drivers, of course; they aren’t assholes wasting gas, they’re assholes endangering every other driver out there).
Without getting through all of it, I’ll lead with my bias that I trust MIT Review and its sources more than “former physics teacher.”
All of these articles admit that the energy use of AI is a guessing game because AI companies don’t feel the need to tell us how much energy they are using. So we have to find other ways, such as looking at what infrastructure they are building and how they are powering it.
For example, “Elon Musk’s xAI powering its facility in Memphis with ‘illegal’ generators.”
It’s been known that xAI, Elon Musk’s artificial intelligence company, has been using around 15 portable generators to help power its massive supercomputer in Memphis without yet securing permits. But new aerial images obtained by the Southern Environmental Law Center show that number is now far higher. The group says these gas turbines combined can generate around 420MW of electricity, enough to power an entire city.
…
Within one to two miles of xAI are several residential neighborhoods, where the people who live there have long dealt with industrial pollution. This area is historically Black and has higher rates of cancer and asthma and a lower life expectancy than other parts of the city.
“Google, Elementl back 1800 MW nuclear power project for data centers in US”
“Nuclear energy can provide around-the-clock abundant and reliable electricity, making it an attractive solution to meet rising energy demand from AI and data centers,” said US DOE in a press release.
“Preservation Virginia lists historic battlefields among endangered sites thanks to data centers”
Manassas National Battlefield Park in Prince William County and Wilderness Battlefield in Orange County are among 11 sites identified in this year’s report. Both are located near large-scale data center projects that have already been approved, including the 2,100-acre Prince William Digital Gateway in western Prince William.
…
Other historic sites in Prince William County have also been affected by data center-related construction. Grading work for a new Iron Mountain data center and a Northern Virginia Electric Cooperative substation has reportedly disturbed two historic Black cemeteries in Gainesville. Preservation groups and descendants raised concerns after grave markers were displaced and questioned whether proper surveys had been conducted.
And water usage?
“Voices: Data centers must be transparent about water usage — for the sake of the Great Salt Lake”
Cheap water? In Utah? Where the Great Salt Lake is dying? I mean…

The data center is just one of roughly two dozen in the Great Salt Lake Basin, which has become a popular area to build data centers due to the region’s cheap water and business-friendly policies.
Regulators currently do not understand how much water is being used in this sector, making it almost impossible to set conservation targets and policies to encourage data centers run by private companies and government agencies to reduce water usage.
You get the idea. These kinds of arguments always lead me back to Hope Jahren, “Use less, share more.”