It will be seamless and swift. AI will develop syllabi and lesson plans. AI will design and assign all the work to be completed. Then AI will complete all the assignments and send them to AI for assessment. (AI can then send personalized assignments to address the AI's weak areas, but it probably won't have to.)
All the teachers will be fired. All the students will stay home. Building repairs will be unnecessary as long as the computer hub at its heart is preserved.
Leaders and Ed Tech companies will survey the empty building, buzzing with electricity whizzing up and down the wiring in the hollow walls, and congratulate themselves on its modern efficiency.
The school year will last about half an hour, depending on how many AIs are enrolled.
You can say that this is extreme hyperbole, that of course things will never progress this far. My question, then, is where will the line be drawn? At what point will Important People step up and say, "This has gone far enough"? At what point will Important People say that we can't remove any more of the human element from the process?
Maybe at this point we're just too overwhelmed by the gee whizzakers of it all, like the guy who showed up on Bluesky announcing he was "So excited to publicly launch All Day TA," a teaching assistant that would work 24/7 and coincidentally free a college from having to hire one more live human.
Maybe some of us are just so amazed that we aren't ready to ask questions like "What problem is this supposed to solve?" or "Does it actually solve that problem?" or even "Are the costs worth the results?"
I can remember the days decades ago when my students discovered personal computers and printers. They were so amazed that they could print their work in any font, in any size, in any color, that they absolutely never stopped to ask whether printing their paper in, say, 8-point French Script rendered in yellow ink might not be a great choice.
That's the initial moment of technological exuberance--so excited you can do it that you don't stop to ask if you should.
For the current AI irrational exuberance, add this--so excited at what you've been promised you can do that you don't stop to check whether you can really do it.
As with the pandemic, we are being challenged to think about what, exactly, we believe the point of education and schools is supposed to be, and to make deliberate choices to build schools around that vision--not some higgledy-piggledy attempt to incorporate every shiny thing that attracts our attention, whether or not it furthers the actual purpose of school (and whether or not it can deliver its promised product).
Too many AI-in-education boosters seem to think that the whole purpose of school is to produce and assess schoolwork, resulting in grades that lead to a credential. And if you think the purpose of school is to crank out these various products, then sure--computerizing those processes makes perfect sense.
But if you think the purpose of education is something like helping each individual human being become their best self, to be fully themselves, to grasp what it means to be fully human in the world--well, then, we need at a minimum to remember that it is the AI, and not the humans in the loop, that is the tool.