Friday, June 20, 2025

Should AI Make Students Care?

Over the years I have disagreed with pretty much everything that Thomas Arnett and the Christensen Institute have had to say about education (you can use the search function for the main blog to see), but Arnett's recent piece has some points worth thinking about. 

Arnett caught my attention with the headline-- "AI can personalize learning. It can’t make students care." He starts with David Yeager's book 10 to 25: The Science of Motivating Young People.

Yeager challenges the prevailing view that adolescents’ seemingly irrational choices—like taking risks, ignoring consequences, or prioritizing peer approval over academics—result from underdeveloped brains. Instead, he offers a more generous—and frankly more illuminating—framing: adolescents are evolutionarily wired to seek status and respect.

As someone who worked with teenagers for 39 years, I can say the second half of Yeager's thesis feels true. I'd argue that both ideas can be true at once-- teens want status and respect, and their underdeveloped brains lead them to seek those things in dopey ways. But Arnett uses the status-and-respect framing to lead us down an interesting path.

[T]he key to unlocking students’ motivation, especially in adolescence, is helping them see that they have value—that they are valued by the people they care about and that they are meaningful contributors to the groups where they seek belonging. That realization has implications not just for how we understand student engagement, but for how we design schools…and why AI alone can’t get us where we need to go.

This leads to a couple of other points worth looking at.

"Motivation is social, not just internal." In other words, grit and growth mindset and positive self-image all matter, but teens are particularly motivated by how they are seen by others, particularly peers. Likewise, Arnett argues that it's a myth that self-directed learning is just for a handful of smarty-pants auto-didacts. He uses Bill Gates and Mark Zuckerberg as examples, which is interesting as they are both excellent examples of really dumb smart people, so maybe autodacting isn't all it's cracked up to be. But his point is that most students are autodidacts-- just about things like anime and Taylor Swift. And boy does that resonate (I have a couple of self-taught Pokemon scholars right here). I'll note that all these examples point to auto-didactation that results in a fairly narrow band of learning, but let's let that go for now.

Arnett follows this path to an observation about why schools are often motivational dead zones:

The problem is that school content often lacks any social payoff. It doesn’t help them feel valued or earn respect in the social contexts they care about. And so, understandably, they disengage.

And this:

Schools typically offer only a few narrow paths to earn status and respect: academics, athletics, and sometimes leadership roles like Associated Student Body (ASB) or student council. If you happen to be good at one of those, great—you’re in the game. But if you’re not? You’re mainly on the sidelines.

Students want to be seen, and based on my years in the classroom, I would underline that a zillion times. 

The AI crew's fantasy is that students sitting in front of a screen will be motivated because A) the adaptive technology will hit them with exactly the right material for the student and B) shiny! Arnett explains why any dreams of AI-aided motivation are doomed to failure.

AI won't fix this

Arnett's explanation is not exactly where I expected we were headed. Human respect is scarce, he argues, because humans have only so much time and attention to parcel out, and so it's valuable. AI has infinite attention resources and can be programmed to be always there and always supportive. Arnett argues that this makes its feedback worthless in terms of status and respect.

I'm not sure we have to think that hard about it. Teens want status and respect, especially from their peers. The bot running their screen is not a peer, nor even an actual human. It cannot confer status or respect on the student, and it is not part of the larger social network of peers.

Arnett argues that this might explain the 5% problem-- software that works for only a few students, in part because 95% of students do not use the software as recommended. Because why would they? The novelty wears off quickly, and truly, entertainment apps don't do much better. I don't know what the industry figures say, but my anecdotal observation was that a new app went from "Have you seen this cool thing!" to "That old thing? I haven't used it in a while" in less than a month, tops.

What keeps students coming back, I believe, isn’t just better software. It’s the social context around the learning. If students saw working hard in these programs as something that earned them status and respect—something that made them matter in the eyes of their peers, teachers, and parents—I think we’d see far more students using the software at levels that accelerate their achievement. Yet I suspect many teachers are disinclined to make software usage a major mechanism for conferring status and respect in their classrooms because encouraging more screen time doesn’t feel like real teaching.

From there, Arnett is back to the kind of baloney that I've criticized for years. He argues that increasing student motivation is super-important, and, okay, I expect the sun to rise in the East tomorrow. But he points to MacKenzie Price's Alpha School, the Texas-based scam that promises two-hour learning, and Khan Academy as examples of super-duper motivation, using those companies' own highly inflated results as proof. And he compares software to "high dosage tutoring," which isn't really a thing.

Arnett has always been an edtech booster, and he's working hard here to bend the end of a fairly logical path into some hope for the AI edtech market.

But I think much of what he says here is valuable and valid-- that AI faces a major hurdle in classrooms because it offers no social relationship component and little opportunity to provide students with status or respect. Will folks come up with ways to use AI tools that have those dimensions? No doubt. But the heart of Arnett's argument is an explanation of one more reason that sitting a student in front of an AI-run screen is not a viable future for education.
