Let's imagine that instead of typing on a keyboard in the classroom and pulling up answers effortlessly from some unseen source, students had to do something else.
To get to the source of their "assistance," teachers had to load them, just a few at a time, onto a large, three-miles-to-the-gallon coach bus that would take them fifty miles to the "assistant." Along the way, the bus would pass over a major body of water and dump the contents of its rolling indoor outhouse right into it. All so that a few students could get some help with a writing assignment or math instruction, or just plain have someone do the assignments for them.
Much has been written about the intellectual, pedagogical, artistic, and philosophical issues of generative AI, and all of that matters when we consider its impact on our minds.
But maybe we should spend some more time talking about the actual physical impact on the world.
The amount of electricity used to power generative AI is almost incomprehensible. Researchers estimated that creating little old GPT-3 consumed 1,287 megawatt-hours of electricity and generated 552 tons of carbon dioxide equivalent--roughly what 123 gasoline-powered passenger vehicles emit in a year. And that's just to set it up, before users actually started getting it to do its thing. Or before its keepers gave it its latest update. Or consider this from an article published just last year in Scientific American:
But a peer-reviewed analysis published this week in Joule is one of the first to quantify the demand that is quickly materializing. A continuation of the current trends in AI capacity and adoption are set to lead to NVIDIA shipping 1.5 million AI server units per year by 2027. These 1.5 million servers, running at full capacity, would consume at least 85.4 terawatt-hours of electricity annually—more than what many small countries use in a year, according to the new assessment.
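For the numerically curious, both the GPT-3 figures above and that server projection can be roughly sanity-checked with back-of-envelope arithmetic. The sketch below is mine, not the researchers': it assumes the EPA's commonly cited figure of about 4.6 metric tons of CO2 per passenger vehicle per year, and a full-load draw of about 6.5 kilowatts per AI server (roughly a DGX-class machine).

```python
# Back-of-envelope checks on the figures quoted above. Assumptions
# (mine, not from the cited sources): ~4.6 metric tons of CO2 per
# passenger vehicle per year, and ~6.5 kW full-load draw per AI server.

# GPT-3 training: 552 tons of CO2-equivalent, expressed as car-years
CO2_TONS = 552
TONS_PER_CAR_YEAR = 4.6                    # assumed EPA figure
print(f"{CO2_TONS / TONS_PER_CAR_YEAR:.0f} car-years")   # ~120, near the cited 123

# Joule projection: 1.5 million AI servers running flat out, all year
SERVERS = 1_500_000
KW_PER_SERVER = 6.5                        # assumed per-server draw
HOURS_PER_YEAR = 24 * 365
twh = SERVERS * KW_PER_SERVER * HOURS_PER_YEAR / 1e9     # kWh -> TWh
print(f"{twh:.1f} TWh per year")           # ~85.4, matching the quote
```

Tweak either assumption and the totals move, but the order of magnitude doesn't.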
It's remarkable how few specifics are out there. The training phase and the answering-your-prompt phase don't use the same amount of electricity, but how they compare seems fuzzy (most writers seem to think the prompt-answering side ends up using more). And besides sucking up electricity, which is not an infinite resource, all that sucking has implications for the greenhouse gases generated to produce the extra power. According to NPR, Google says its greenhouse gas emissions climbed nearly 50% over five years, primarily because of AI data centers.
Here's a chart from Earth.org that provides a little perspective:
That bar for AI (the one way taller than any other) represents only the training phase. If cars and people flying in private jets bother you, generative AI should positively freak you right out.
Jesse Dodge, a research analyst at the Allen Institute for AI (founded by Paul Allen, so not tech haters), told NPR that a single query uses about as much electricity as lighting one bulb for twenty minutes, which doesn't seem like a lot until you multiply it by a million queries a day. That is way more than, say, a typical search--though of course tech companies have baked their AI into search functions, so you're generating an AI prompt all the time whether you want to or not. Some researchers advocate for solar power, but that doesn't solve all the problems.
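To put Dodge's analogy in raw numbers, here's a rough scale-up. Both inputs are assumptions of mine, just for illustration: a 10-watt LED bulb and a nice round one million queries a day.

```python
# Scaling up the "one bulb for twenty minutes" analogy.
# Assumptions (mine, for illustration): a 10 W LED bulb and
# one million queries per day.

BULB_WATTS = 10
MINUTES_PER_QUERY = 20
QUERIES_PER_DAY = 1_000_000

wh_per_query = BULB_WATTS * MINUTES_PER_QUERY / 60        # ~3.3 Wh per query
mwh_per_day = wh_per_query * QUERIES_PER_DAY / 1_000_000  # ~3.3 MWh per day
print(f"{wh_per_query:.1f} Wh per query -> {mwh_per_day:.1f} MWh per day")
```

Swap in an old incandescent bulb, or the query volumes the big platforms actually handle, and the daily total climbs fast.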
It's not just the electricity and the carbon footprint. Data centers require huge amounts of water to keep cool. Cindy Gordon, writing for Forbes, says that the centers consume "significant" amounts of water, evaporating about 9 liters per kWh of energy used. AI's projected water usage, says Gordon, could hit 6.6 billion cubic meters by 2027. And that's on top of the water "withdrawn" for hydroelectric generation of the power that AI needs.
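To get a feel for what 9 liters per kWh adds up to, here's a unit-conversion sketch. Pairing that rate with the 85.4 terawatt-hour server estimate quoted earlier is my assumption, not Gordon's; it covers direct cooling only, and the gap between the result and the 6.6 billion cubic meter projection presumably reflects growth plus that "withdrawn" water.

```python
# Unit-conversion sketch for the cooling-water figure.
# Assumption (mine): pairing the ~9 L/kWh evaporation rate with the
# ~85.4 TWh/year server estimate quoted earlier; direct cooling only.

LITERS_PER_KWH = 9
KWH_PER_YEAR = 85.4e9                  # 85.4 TWh expressed in kWh

liters_per_year = LITERS_PER_KWH * KWH_PER_YEAR
cubic_meters = liters_per_year / 1_000         # 1 cubic meter = 1,000 liters
print(f"~{cubic_meters / 1e9:.2f} billion cubic meters per year")  # ~0.77
```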
Right now, all of this is kept behind a curtain, out of view of the average AI user. But if we are going to use computer magic to answer prompts like "Write me a five-page paper about Hamlet" or "Whip up my lesson plans for next week," we really ought to understand the cost.
It's not just that generative AI doesn't produce magic results--it doesn't use magical techniques to get those results, either.