Tuesday, January 20, 2026

Setting an ExAImple

Micah Blachman is a twelve-year-old seventh-grade tech blogger, and in a couple of posts he offered pocket-sized reviews of some of the tech products he encounters in school. The posts are worth a read, but I want to focus on one particular paragraph:

AI. This one is a bit of the odd one out. As students, we are forbidden from using AI for school-related purposes. However, I see my teachers using it to create assignments, mistake-filled example essays, lesson plans, and class discussion questions. some more than others. It makes me wonder: why am I spending so much time doing this assignment that was obviously created by ChatGPT or Claude (there’s literally a tab with a ChatGPT icon in the teacher’s browser!)? I wouldn’t say that AI is necessarily bad at creating assignments. But there’s sometimes factually incorrect things, or questions that don’t make sense, or analysis that feels far-fetched in a class discussion. If the students can’t use AI, why is there a double standard for the teachers?

If you are someone who spends time around young humans (parent, teacher, camp counselor, etc.), I think it's good practice to ask yourself regularly, "What do I want them to see me doing?" Because what they see you doing generally has at least as much influence as what you say, and often more.

If you are in a classroom saying, "Don't use AI to do your work" to your students while simultaneously using AI to do your work, you are, to use an earlier age's terminology, setting a bad example. Or, if you prefer something more contemporary, you are full of it. I cannot think of any argument you can use to forbid student use that does not also apply to teacher use, or conversely, any argument in favor of teacher use that students could not also make. I suppose you could go with, "It's important for students to show their own individual work that comes from their own brain, but not for the classroom teacher," but be aware that's also a good argument for replacing you with ChatGPT and a box of old lesson plans.

But beyond the hypocrisy problem, there's another bad message embedded in this kind of behavior. It says, "Using chatbots to do your work is what adults do in the real world, but you aren't an adult in the real world." In other words, students, what you do for class has nothing to do with the real world. It's just school stuff. In which case, why shouldn't they cheat by whatever means is handy?

If your argument against student use of AI is that it is cheating, then don't cheat. If your argument is that it's a grownup tool that requires certain knowledge and care, then teach them the necessary knowledge and care. I'd rather nobody in your classroom touched it ever, but I recognize that some folks are wrong, er, disagree with me on this.

But if you are in a classroom like Blachman's, do not kid yourself that the students haven't noticed there's an ethical problem here. And do not kid yourself that they haven't noticed your AI-generated materials are not particularly great.

Lord knows, I'm aware that teaching comes with a massive cognitive load and a tremendous under-supply of time. But the choices you make as a teacher are part of your influence. Your students are carrying a cognitive load and the challenge of finite time, too, and you are modeling how to deal with those burdens. The issue is not new; out there somewhere are the teachers who got their literature lesson plans from CliffsNotes or Dr. Google and whose students figured out that the whole class was just a game where you look for the easy button. If your model is "find a way to offload your mental load to a bot," do not imagine for a moment that your students do not see you.

