Wednesday, December 5, 2018

Real Stupid Artificial Intelligence (Personalized Learning's Missing Link)

Good lord in heaven.

Intel would like a piece of the hot new world of Personalized [sic] Learning, and they think they have an awesome AI to help. And they have concocted a deliberately misleading video to promote it.

In the video, we see a live human teacher in a classroom full of live humans, all of whom are being monitored by some machine algorithms "that detect student emotions and behaviors" and they do it in real time. Now teachers may reply, "Well, yes, I've been doing that for years, using a technique called Using My Eyeballs, My Ears, and My Brain." But apparently teachers should not waste time looking at students when they can instead monitor a screen. And then intervene in "real time," because of course most teachers take hours to figure out that Chris looked confused by the classwork and a few days to respond to that confusion.

Oh, the stupid. It hurts.

First, of course, the machine algorithm (copywriters will be damned if they're going to write anything like "students will be monitored by computers") cannot detect student emotions. They absolutely cannot. They are programmed to use certain observable behaviors as proxies for emotions and engagement. How will Intel measure such things? We'll get there in a second. But we've already seen one version of this sort of mind-reading from NWEA, the MAP test folks, who now claim they can measure engagement simply by measuring how long it takes a student to answer a question on their tests. Because computers are magical!

Turn it around this way-- if you had actually figured out the secret of reading minds and measuring emotions just by looking at people, would your first step be to get in on the educational software biz?

In fact, Intel's algorithm looks suspiciously unimpressive. They're going to measure engagement with three primary inputs-- appearance, interaction, and time to action. A camera will monitor "facial landmarks," shoulders, posture. "Interaction" actually refers to how the student interacts with input devices. And time to action is the same measurement that NWEA is using-- how long do they wait to type. Amazing. And please notice-- this means hours and hours of facial recognition monitoring and recording.
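Strip away the marketing, and "detecting emotions" from those three proxies amounts to thresholding a few numbers. Here's a deliberately crude caricature in Python-- every input, weight, and threshold below is invented for illustration, since Intel discloses none of its actual model:

```python
# Toy caricature of "engagement detection" from behavioral proxies.
# All thresholds and weights here are made up -- nothing reflects
# Intel's actual (undisclosed) algorithm.

def engagement_score(head_tilt_deg, keystrokes_per_min, secs_to_first_action):
    """Mash three crude proxies into a single 0-1 'engagement' number."""
    posture = 1.0 if abs(head_tilt_deg) < 20 else 0.3       # "appearance"
    interaction = min(keystrokes_per_min / 30.0, 1.0)       # "interaction"
    promptness = 1.0 if secs_to_first_action < 10 else 0.4  # "time to action"
    return round((posture + interaction + promptness) / 3, 2)

def dashboard_label(score):
    """What the teacher's dashboard would report."""
    return "engaged" if score >= 0.7 else "flag for intervention"
```

Notice what this caricature makes obvious: a student who sits still and thinks hard before typing scores exactly like a student who has checked out. The proxies can't tell the difference, because they never measured emotion in the first place.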

Intel is ready to back all this up with some impressive argle-bargle:

Computers in the classroom have traditionally been seen as merely tools that process inputs provided by the users. A convergence of overlapping technology is making new usages possible. Intel and partners are enabling artificial intelligence at the edge, using the computing power of Intel CPUs to support artificial intelligence innovations with deep learning capabilities that can now know users at a higher level – not merely interpreting user commands but also understanding user behaviors and emotions. The new vision is a classroom PC that collects multiple points of input and output, providing analytics in real-time that lets teachers understand student engagement.

This just sounds so much more involved and deep than "the computer will watch how they hold their lips and tell the teacher what the algorithm says that lip position means."

Who is the market for this? I want to meet the teacher who says, "Yeah, looking at the students is just too challenging. I would love to have a software program that looked at them for me so I could just keep my eyes on my screen." Who the hell is that teacher, standing in front of a classroom looking not at her students, but at her tablet? Who is the administrator saying, "Yes, the most pressing need we have is a system to help teachers look at students."

Of course, there are applications I can think of for this tech.

One would be a classroom with too many students for a teacher to actually keep eyes on. Monitoring a class of 150 is hard for a human (though not impossible-- ask a band director) but easy for a bank of cameras linked to some software. Another would be a classroom without an actual teacher in it, but just a technician there to monitor the room.

Here's Intel's hint about how this would play out:

Students in the sessions were asked to work on the same online course work. Instructors, armed with a dashboard providing real-time engagement analytics, were able to detect which students required additional 1:1 instruction. By identifying a student’s emotional state, real-time analytics helped instructors pinpoint moments of confusion, and intervene students who otherwise may have been less inclined or too shy to ask for help. In turn, empowering teachers and parents to foresee at-risk students and provide support faster.

In a real classroom, teachers can gauge student reaction because the teacher is the one the students are reacting to. But if students are busy reacting to algorithm-directed, mass-customized content delivered to their own screens, the teacher is at a disadvantage-- particularly if the teacher is not an actual teacher, but just a tech there to monitor for student compliance and time on task. Having cut the person out of personalized [sic] learning, the tech wizards have to find ways to put some of the functions of a human back, like, say, paying attention to the student to see how she's doing.

The scenario depicted in the video is ridiculous, but then, it's not the actual goal here. This algorithmic software masquerading as artificial intelligence is just another part of the "solution" to the "problem" of getting rid of teachers without losing some of the utility they provide.

Intel, like others, insists on repeating a talking point about how great teachers will be aided by tech, not replaced by it, but there is not a single great teacher on the planet who needs what this software claims to provide, let alone what it can actually do. This is some terrible dystopian junk.


  1. That is horrifying. Not least because the next step is to aggregate all that juicy data and score teachers based on it. So if you teach English learners (who need more time to think before responding) or kids with cultural, behavioral, or physiological differences, you're in big trouble (as usual).

  2. Don't worry people; really BAD ideas are the first to fall.

    1. Sorry...but NO! We still have Common Core and we still have standardized tests. There is still VAM and school/teacher ratings based on test scores. Bad ideas don't fall (or fail)....they just keep piling up and the a__hat reformers just keep going. It will never end until ALL the teachers make it end.

    2. Are you kidding me?! Ever heard of Reaganomics? Known also as supply-side economics, privatization, neoliberalism, etc.? Been around for 30+ years and going strong, no matter how many lives it destroys.... Education rephorm is just one aspect of that, likewise going strong at least since "A Nation at Risk".

    3. This article is not about political/economic policy. AI in this format is about as BAD an education product as could be imagined.
      AI is DOA

    4. BAD ideas in the classroom fail faster than you can blink. Every teacher on the planet develops a survival instinct regarding ideas, policies, products, programs, methodologies, and especially, obvious fads. If it doesn't work with students - STOP doing it.

  3. There is also a PR push by Google to label AI as creative...

  4. Common Core standards, PARCC/SBAC tests, VAM/SLOs have ALL FAILED.
    The problem with this group of conjoined triplets is that they were leveraged into place through federal LAW (NCLB). ESSA has given states a little more wiggle room, and sooner rather than later they will be relegated to the smoldering ash heap of edu-fails.

    The CCSS have been re-branded and tweaked by most states, yet if your kid is fortunate enough to have a fierce and fearless ELA teacher, all the bad will be conveniently ignored.

    The CC tests are only as distracting as your school district will allow (or encourage). Attend some BOE meetings and become a vocal advocate against any and all misuse of your kid's time in school related to the tests. Do not relent.

    VAM and its evil twin, the SLO, have been neutered by the realities of the system. There isn't a building principal alive who would trade the devils they know for the unknown. And that secret supply of highly effective teachers in hiding has yet to be found. VAM and SLOs - they are the true ghosts of the reform movement.

    The best way to render this CC, test-and-shame debacle harmless is to ignore it and advocate against it if you have educators running the show who still haven't been able to stop drinking the Kool Aid.

    As far as the subject of this article goes, AI is a truly STUPID, BAD idea and it will fall/fail faster than you can say "silicon snake oil".

  5. The "personalized" sales pitch is now used to market almost every service--banking, insurance, car care, your online interests (especially ads), and now learning. The educational misrepresentation of "learning" as the outcome of computer programs with screen-based images is a fraud. Computers are an instructional delivery system, with no guarantee of learning. There is nothing personalized about algorithms "trained" to make predictions based on massive, hidden-from-view mining of biometric, biographic, and psychographic data from thousands of individuals. The worst part of this sham is the marketing of "blended learning," where computers do most of the delivery of instruction with teachers and students "permitted" to have some live interaction free of surveillance by technologies.