Personalized Learning continues to gather steam as a way to privatize and standardize education. For educating yourself about the movement and what it really means, I recommend the latest episode of Jennifer Berkshire and Jack Schneider's podcast, Have You Heard. Berkshire and Schneider talk to guest Bill Fitzgerald about some of the more important aspects of this magical "algorithmically mediated learning."
Schneider, an actual education historian, notes that this new tech-based movement echoes similar revolutionary predictions once made about television, movies and, yes, radio.
Fitzgerald addresses the problem of multiple interpretations of personalized learning, focusing on the issues of the tech-centered model currently prevalent.
Yeah, so I mean I think that there is not a single definition of personalized learning that satisfies everybody who actually works in education technology. When we talk about algorithmically mediated learning, we're talking about a very specific type of interaction, which even though it's actually called personalized learning, actually cuts people out of the project, cuts people out of the equation. So we have the semantics of the term, which actually sound very human, but in some implementations, not all, but in some, we actually have a process where what we call personalized learning is actually less personal and less human.
The conversation deals with the related issue of assessment, because as always, in bad teaching design, assessment drives everything else. And that's not the only important related issue. Here's Fitzgerald again:
So when we talk about our education system and one size fits all, I think we need to acknowledge that there's a context within which our education system has never worked in an equitable way. So when we talk about one size fits all, I think we need to be very careful about how we're defining all.
Schneider on the allure of technology:
The challenge of desegregating schools is a political and moral problem, not a technical problem, and part of the allure of technology is that it suggests that whatever is holding our schools back in terms of delivering an equally excellent education for all children is simply a matter of a technological fix; that it's not going to require us to make a difficult trade-off; that it does not pose a dilemma that is unsolvable and that will require us to collectively make a decision about what we value and for some of us to give things up.
Which brings us to one of Fitzgerald's most important insights:
Yeah, we have these social and ethical and moral issues, and algorithms can effectively embed those and make them less visible, and because it's an automated process, we're trained to think that it's more objective, when the reality is its lack of objectivity just gets done automatically every single time. So I think there's some large-scale misunderstanding of actually what algorithms do and how they can embed biases, and we have examples of this all over the place.
It's a valuable and informative podcast. Take the time to give it a listen: