Wendell Berry was born in 1934 and grew into a writer working across a wide range of forms, as well as working as an activist and farmer, mostly in rural Kentucky. He opposed the Vietnam War, debated then-Secretary of Agriculture Earl Butz, and published a critique of George W. Bush's post-9/11 strategy. When he was 76 years old, he and 14 other protestors locked themselves in the Kentucky governor's office to protest mountaintop removal coal mining (strip mining on steroids). And he's still at it, delivering hearing testimony in 2022.

Berry came up with rules for things; you may very well have seen some over the years. There are his 17 rules for a sustainable local community, and his 9 rules for consumption, but today I'm looking at his 9 rules for technology. Blogger Ted Gioia reminded me of these rules; Berry whipped them up as a response to friends who were trying to convince him that a computer would be a step up from handwritten copy typed up on a thirty-year-old typewriter ("Why I Am Not Going To Buy A Computer," 1987).
The rules have many applications, but they fit very nicely into the conversations we continue to have in education, particularly around the heavily-pushed AI. So let's take a look.
The new tool should be cheaper than the one it replaces.
Part of what is driving the AI love (like many innovations before it) is the dream of replacing expensive teaching professionals with something cheaper. Curriculum in a box appeals to those who want to de-professionalize education, doing for teaching what McDonald's did for cheffing.
AI promises these same folks something even more exciting-- replacing teachers with software that will be cheap and, better yet, never talk back or unionize.
Is AI really cheaper? We don't know yet; right now, AI companies are trying to conquer the market Amazon-style, forgoing making money until after they've planted their flag on the education summit. But at some point they are going to want to make money. Then we'll see the real price.
Probably still cheaper than a human, but then, the price paid to the company will be only part of the cost. There's the giant sucking up of electricity, and the blowing through of a gazillion gallons of water to cool servers. Plus the cost of under-educated students, because while Musk and Gates can insist that AI can do a teacher's job, they make that claim only because they don't understand what a teacher does or how education works.
It should be at least as small in scale as the one it replaces.
Computers have been taking teacher ed tech in this direction for years, from the giant computer set-up of twenty years ago to the run-everything-from-a-tablet tech of today. Students, however, have been pushed in the other direction. A book, a tablet, and a pen or pencil are far more compact than a desktop, and a netbook barely competes, particularly because the netbook needs to be plugged in (and the school's network to be working properly).
Is AI more small scale than a human teacher? I guess they win on that one.
It should do work that is clearly and demonstrably better than the one it replaces.
Hoo boy. Enshittification has meant that even things that used to meet this standard start to fall behind. Is Google better than the card catalog or reference books in your library? Well, it used to be. Now if Google (or dozens of other search engines) even correctly interprets what you have asked, you must scroll past mountains of advertising and paid-for search results.
This is perhaps how AI marketeers keep hope alive, because ChatGPT can do better work than your worst teacher or your worst student (as long as it doesn't present too many flat-out errors), but it cannot keep up with good teachers and students.
But "do work" is doing some Olympic-level weight-lifting here, because, yes, if you think the work is to research and write an essay, ChatGPT can mimic that task. But if you think the work is to acquire and synthesize understandings and insights, then no-- ChatGPT can't do any of those things at all, and its performance of those tasks instead of students studenting means the work wasn't done at all.
It should use less energy than the one it replaces.
Oh, no. AI is gobbling up the power supply and only getting worse and worse.
If possible, it should use some form of solar energy, such as that of the body.
As suggested above, Berry was not a fan of coal burning for generating electricity. But the shift to solar isn't happening in any large scale way, and certainly not with AI.
It should be repairable by a person of ordinary intelligence, provided that he or she has the necessary tools.
We've moved steadily backward on this one in many ways. Computerizing tech creates barriers to repairability, but companies have taken other steps.
John Deere infamously led the way by forbidding its customers to work on the tractors that they had bought with their own money. There's your annoying printer that now won't work unless you buy the company's official more-precious-than-gold ink.
AI adds another level to this problem--not even the people who work with LLMs and generative AI fully understand what exactly the computer is doing, nor can they necessarily fix it--though they do have access to ways to push the tech in one desired direction or another.
It should be purchasable and repairable as near to home as possible.
Berry was writing forty-ish years ago, so I'm not sure how he would have interpreted the ability to order and download stuff when it comes to this rule. AI can, of course, be wherever you want it to be--certainly more so than is possible or desirable with a human teacher. Though the use of platforms has already allowed teachers to extend their "presence" to students 24/7.
It should come from a small, privately owned shop or store that will take it back for maintenance and repair.
Not happening. Wasn't happening back when Berry was writing.
It should not replace or disrupt anything good that already exists, and this includes family and community relationships.
As Gioia writes, "This may be the biggest tech failure of them all." Tech has been built to exploit creators and manipulate users, deliberately and sometimes dangerously. And "disrupt" is of course one of the tech world's imperatives. Why? Maybe they just want to work out long-lived anger that they didn't get to sit at the popular kids' table, or maybe they feel it's their right to rule over the lesser beings whose understanding is so clearly inferior to their own.
Whatever the case, anyone who has taught for more than one week is familiar with the teacher "training" for a new solution where the undercurrent (sometimes not all that "under") is "You guys are doing it wrong and we are here to straighten you out."
"Move fast and break things" is the opposite of what Berry's ninth rule favors, but it's a beloved tech-lord mantra. It would carry a lot more heft if the "things" being broken weren't the parts of the system that delivers education to young humans. Berry's rules might seem a little quaint, but I don't think it would hurt us much to pay attention to them.