
Thursday, November 27, 2025

Warding Off Classroom AI

There's a lot out there from folks trying and suggesting and selling ways for teachers to put their fingers in the dike holding back the allegedly inevitable AI tide. 

But I think playing AI whack-a-mole with computerized detector bots and policies designed for the express purpose of curbing chatbot cheating is not the way to go. Simply forbidding it is about as effective as banning CliffsNotes or Wikipedia was. Numerous bots claim they can catch other bots in action; I am unconvinced, and too many students have already been unfairly and incorrectly accused. Trying to chase the chatbots away is simply not going to work. More than that, it is not going to help students grasp an education.

Cheating has always had its roots in a few simple factors. Students believe that success in class will be either too hard or too time-consuming for them. Students believe the stakes are too high to take a chance on failure. And students do not have a sense of the actual point of education.

I usually explain The Point like this-- education is the work of helping young humans figure out how to be more fully their best selves while working out what it means to be fully human in the world. That's a big soup with a lot of ingredients (some academic and some not), and the required ingredients vary from person to person. 

Because it's human.

As I've now said many times, AI most easily rushes into places where humanity has already been hollowed out. And unfortunately, too often that includes certain classrooms.

We've had chances to work on this before. Nancy Flanagan (and many others) tried hard to bring some attention to using the pandemic to reset schools into something better than either tradition or reform had created. But everyone (especially those in the testing industry) wanted to get back to "normal," and so we passed up that opportunity to reconfigure education. And so now here we are, facing yet another "threat" that is only threatening because we have created a system that is exceptionally vulnerable to AI.

Modern ed reform, with its test-centric, data-driven, outcome-based approach, has pushed us even further toward classrooms that are product-centered rather than human-centered. But if class is all about the product, then AI can produce those artifacts far faster and more easily than human students. 

Carlo Rotella, an English professor at Boston College, published a New York Times piece that argues for more humanity in the humanities. He writes:
An A.I.-resistant English course has three main elements: pen-and-paper and oral testing; teaching the process of writing rather than just assigning papers; and greater emphasis on what happens in the classroom. Such a course, which can’t be A.I.-proof because that would mean students do no writing or reading except under a teacher’s direct supervision, also obliges us to make the case to students that it’s in their self-interest to do their own work.

Yup. Those are the same things I used to make my high school English class cheat-resistant for decades. Writing in particular needs to be portrayed as a basic human activity, a fundamental function with lifetime utility. 

In education, it's important to understand your foundational purpose. It is so easy in the classroom to get bogged down in the millions of daily nuts-and-bolts decisions about exactly what to do-- which worksheet, what assignment, how to score the essay, which questions to ask, how to divide up the 43 instructional minutes today. Planning the details of a unit is hard--but it gets much easier if you know why you are teaching the unit in the first place. What's the point? I hate to quote what can be empty admin-speak, but knowing your why really does help you figure out your what and how.

If you have your purpose and your values in place, then you can assess every possible pedagogical choice based on how it serves that central purpose. The same thing is true of AI. If you know what purposes you intend to accomplish, you are prepared to judge what AI can or cannot contribute to that purpose. And if your purpose is to help young humans grow into their own humanity, then the utility of this week's hot AI tool can be judged.

Ed tech has always been introduced to classrooms ass-backwards-- "Here's a piece of tech I want you to use, somehow, so go figure out how you can work it in" instead of "Think about the education problems you are trying to solve and let me know if you think this piece of tech would help with any of them." 

But I digress. The key to an AI-resistant classroom is not a batch of preventative rules. The answer is to create a classroom with such a thoroughly human context, values, and purpose that AI must either provide something useful for that context or be left out because it doesn't serve a useful purpose. The big bonus has nothing to do with AI, and everything to do with a more deliberately human approach to educating young human beings. 
