However, just like any tool and technology, AI can be used as a force for good in education.
Yessir-- any tool and technology can be a force for good in education. Staplers, rocket boosters, chicken de-featherers, aglets, AK-47s-- all forces for good in education. Thanks for that insight, "Carl."
But at the top of our ed tech Krapatoa, we must make room for pieces like this one-- "Could Elon Musk’s AI Robots Save A Troubled Education System?"
Classrooms where routine tasks are handled by a humanoid robot could soon be a reality. With 44% of K-12 teachers in the U.S. feeling burned out “often” or “always,” advanced AI robots could offer much-needed support.
This is one of many reactions to Tesla's "We, Robot" robopallooza, which featured the humanoid robot Optimus.
These robots, along with similar technologies, have the potential to integrate into various aspects of daily life, including educational settings, potentially revolutionizing how we approach tasks and combat issues such as teacher burnout.
The robot will cost between $20K and $30K, says Musk. "It'll basically do anything you want. So it can be a teacher or babysit your kids. It can walk your dog, mow your lawn, get the groceries, just be your friend, serve drinks, whatever you can think of, it will do." Sure. Because mowing the lawn and teaching a human child are pretty much in the same class of activities. Writer Dan Fitzpatrick shows how little somebody (maybe himself, maybe Tesla's copywriter, maybe Musk himself) understands about teaching.
Could Optimus change how classrooms operate? As a teaching assistant, it could handle tasks like preparing materials and supervising students during activities. This could reduce the administrative burden on teachers, allowing them to engage more with students. In special needs education, Optimus could provide personalized instruction and physical assistance, improving the learning experience for students requiring extra support.
Yes, because if there's anything that would be easy to script AI programming for, it's teaching students with special needs. There is no reason to think that LLMs have cracked the code of authoring teaching materials, nor is there any reason to think that housing an LLM in a human-ish body would somehow improve that capability and not add a whole other series of potential failure points to the tech. But I admit to wishing just a little bit that I could watch all the hilarity and chaos that would come from a Tesla robot trying to supervise a roomful of students, even the ones too young to consciously conclude, "If no human being thinks I'm important enough to stay here and work with me, then why should I bother to work-- or behave-- at all?"
Fitzpatrick says he knows of a school that has the stack of money set aside to spend on this monstrosity should it ever appear on the market. Fitzpatrick also says that it's "important to note that the actual implementation of Optimus in classrooms is still theoretical." Rather like Tesla's self-driving cars.
You may remember that the last time Musk trotted out a "robot," it was a guy in a suit (as hilariously lampooned by John Oliver). The staged roll-out of Optimus has many folks saying it sure looks as if the robots are just remote-controlled cyber puppets.
Look, sometimes Musk's people do good work; the space-geek child within me is pretty squeed out that SpaceX managed to catch a booster rocket. But that strikes me as a hell of a lot simpler than programming Robby the Robot to teach American literature to sixteen-year-olds, let alone perform an imitation of human movement while doing it. It speaks to one of the most common issues of ed tech-- it's created by people who have absolutely no clue about what teaching actually involves. And this idea layers on other questions-- people mostly hated learning in isolation via screens during COVID, but would they like it better if the screen could read itself to them and get up and walk around at the same time?
It's the modern ed tech pitch. Instead of "Here's a thing that the tech can actually do, right now, that will help you do your job" or even the old "Here's some technology that will help you do your job, if you will just go ahead and change the way your job is done," this pitch is the hard sell-- "This technology is inevitable, so you might as well get on board now."
It's a display of childlike faith that tech execs think the threat of inevitability still carries weight. Musk alone has a long list of unmet promises ("Autonomous robotaxis for Tesla next year," he said in 2019).
Fitzpatrick, however much he salts his cheerleading with "could" and "potentially," needs some serious restraint in his speculation.
These robots could significantly alter how we educate our children in the next decade or two. They could support teachers and provide personalized learning experiences, potentially leading to higher success rates and improved student well-being. Learning to live and work in collaboration with such technology will be an essential skill and introducing this at school could better prepare younger generations.
Sure. They could also perform surgery and pilot jumbo jetliners and become sandwich artists at Subway. But maybe, possibly, those highly complex activities will turn out to be too much for them, and also, why? Other than the possible chance for businesses to save money by firing humans, what problems does this solve? Also, "could better prepare younger generations" for what? Having to work with software rather than with other human beings? Is too much human interaction some sort of problem that needs to be solved? Having reached a new near-consensus that smartphones should be kept out of the classroom, should we now insert a smartphone that can walk and talk?
Educators, policymakers and tech developers need to collaborate to thoughtfully integrate robots like Optimus into educational frameworks. As this technology advances, the question remains: Are we ready to embrace a classroom where robots and teachers work together to inspire the next generation? What do you think about the potential of robots like Optimus in education? Are we ready for this next technological leap in our classrooms?
Educators, policymakers and tech developers do not need to "collaborate." Tech developers need to stand back, shut up, and try to learn about teaching and see if they can identify some problems actual teachers actually have that tech could actually solve instead of showing up with their favorite solution and demanding that educators "collaborate" with them to help them crack open a market for their product. Want me to "embrace" a human-robot classroom partnership? Then tell me what problems in education a robot would solve other than the problem of "How do we sell more of these robots?"
Of all the promises one could make about one's pretend hypothetical AI robot, why bring up education? What is it about education that is constantly leading amateurs to imagine that teaching is a simple process, easily reduced to simple algorithm-friendly steps and measurements? As long as there have been schools, there has been a steady parade of people who are sure they have invented--or at least come up with the concept of a plan-- a device that will revolutionize education by solving problems that they imagine need to be solved.
What do I think about the potential of robots like Optimus in education? First, I think that so far even Optimus is not a robot like Optimus. It's an idea, unrealized and poorly defined and lacking any specific capabilities that could be useful. The idea itself seems to me like a nightmarish, expensive, non-useful idea, but maybe when there's an actual robot that actually does teaching stuff, we can renew this conversation.
In the meantime, if the tech sector could create a printer that works the way it's supposed to even 95% of the time, that would be way more classroom help than an imaginary robot.
I'm the lead mentor of my high school's FIRST Robotics team. We build competition robots that are about the mass and size of an Optimus. Our programmers do actually have them perform a number of reasonably complex autonomous or semi-autonomous actions on the field while also guiding them from the end of the field part of the time. And we would never allow a human to be on the field while it's operating. Nor would we expect of it independent and/or creative thought (or, indeed, any thought at all) or action. Robots are fantastic machines that can do many things for us, as a well-programmed assembly robot can demonstrate, but they are not a substitute for a human and are simply incapable of doing a vast range of things that a human does, sometimes even without conscious thought.
My favorite parts (but there are more):
Tech developers need to stand back, shut up, and try to learn about teaching and see if they can identify some problems actual teachers actually have that tech could actually solve instead of showing up with their favorite solution and demanding that educators "collaborate" with them to help them crack open a market for their product.
"the concept of a plan"
It's an idea, unrealized and poorly defined and lacking any specific capabilities that could be useful. The idea itself seems to me like a nightmarish, expensive, non-useful idea.
If the tech sector could create a printer that works the way it's supposed to even 95% of the time, that would be way more classroom help than an imaginary robot.
Rebecca deCoca
And all of the money that goes into this tech crap is money that can't be spent on actual human beings to help actual human students, while my class sizes average 36 ninth graders in a core class.
This kind of thing always reminds me of the great "trouble here in River City!" song in The Music Man, with some tech grifter as Harold Hill, and some flustered administrators as the crowd waving their hands in panic about test scores, the need to be Up To Date, etc.