Monday, March 26, 2018

Big Brother Wants To Read Your Face

Imagine if you were presenting in front of a large group, and you could instantly get feedback on how you were doing. Perhaps you could read the body language of your audience or notice the expressions on their faces. Suppose you could check to see if your audience was understanding you, or following you, or happy or sad about what you were saying. Imagine that you-- oh, no, wait. You don't have to imagine that because you are a semi-intelligent functioning human being.

Let me start over.

Imagine you had computer software that could do all that for you.

A month ago, Inside Higher Ed reported on just such a chunk of software.

With sentiment analysis software, set for trial use later this semester in a classroom at the University of St. Thomas, in Minnesota, instructors don’t need to ask. Instead, they can glance at their computer screen at a particular point or stretch of time in the session and observe an aggregate of the emotions students are displaying on their faces: happiness, anger, contempt, disgust, fear, neutrality, sadness and surprise.

Well, that sounds... creepy? Unnecessary? Unlikely to be successful? Could you just lean in to your webcam for a second so I can see how this is going over?

[Image: I show the computer my finger and it doesn't know what the hell is happening]

Maybe it's all the years I've spent as a hack musician, performing in front of all sorts of crowds, on top of all my years in a classroom, but I'm thinking that if you can't read the room then A) you might very well be a lousy teacher and B) you probably aren't nimble enough to respond to software that reads the room for you.

Python code captures video frames from a high-definition webcam and sends them to the Emotion interface, which determines the emotional state. Then that analysis comes to the Face interface, which returns the results, and draws a bounding box around the faces, along with a label for the given emotion. The Python code also sends the results to the PowerBI platform for visualization and retention.

Right. And just in case you think this could be helpful for someone teaching a huge class of 500, I'll note that right now the software tops out at 42.
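For the curious, the pipeline the article describes might look roughly like the sketch below. This is a guess at the shape of the thing, not the actual St. Thomas code: the cloud API call is stubbed out with fake scores, and every name here (detect_emotions, aggregate) is my own invention. What it does show is the core move -- each detected face gets reduced to its single strongest emotion, and the instructor sees only the class-level tally.

```python
# Hypothetical sketch of the described pipeline: capture a frame, send it
# to a cloud emotion-recognition service, then aggregate the per-face
# scores into the classroom-level summary an instructor would see.
# The service call is stubbed; function names are assumptions.

from collections import Counter

EMOTIONS = ["happiness", "anger", "contempt", "disgust",
            "fear", "neutral", "sadness", "surprise"]

def detect_emotions(frame):
    """Stand-in for the cloud API call. A real version would POST the
    webcam frame to an emotion service and parse per-face scores
    from the JSON response."""
    # Fake response: one dict of emotion scores per detected face.
    return [
        {"happiness": 0.7, "neutral": 0.2, "surprise": 0.1},
        {"neutral": 0.6, "sadness": 0.3, "anger": 0.1},
        {"happiness": 0.5, "neutral": 0.4, "contempt": 0.1},
    ]

def aggregate(faces):
    """Reduce per-face scores to a class-level tally: each face counts
    once, for whichever emotion scored highest on it."""
    dominant = (max(scores, key=scores.get) for scores in faces)
    return Counter(dominant)

summary = aggregate(detect_emotions(None))
print(summary.most_common())
```

Note what the aggregation throws away: a student who is 51% "neutral" and 49% "disgust" counts as simply neutral, which is part of why a class-level readout tells you so much less than a glance around the room.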

There are technical issues, like "training" software to read emotions on human faces. Then there's the huge leap of logic that assumes that Pat's expression of disgust is a reaction to the lesson and not, say, the cafeteria lunch. Then there's the well-documented problem of facial recognition software that can't recognize non-white faces. Then there's the problem of storing all that data and using it for other purposes, like evaluating teachers ("Sorry, Professor Bogswaddle, but your class turned up too many 'yuck' faces this semester").

Like many software developers of this ilk, the folks at St. Thomas's E-Learning and Research Center (STELAR) have done some piloting, which itself raises questions. We're talking here about Eric Tornoe, associate director of research computing in the information technology services department and team leader on this project:

Tornoe, his assistant and a part-time student employee have served as the software's three main guinea pigs thus far. In the process of "pulling faces" to test different emotions, the team found that surprise and anger were the easiest to perform and detect, while contempt was the trickiest.

The team also road tested the technology with unsuspecting audiences at staff meetings and presentations, according to Tornoe. (A spokesperson for the university said the staff meeting audiences were prepared to see a presentation about sentiment analysis, so they weren't caught entirely off guard.)

I'm worried about a class in which surprise, anger and contempt are the prevailing emotions. And there is a huge creepiness factor in trying this out on "unsuspecting" audiences. But then, I'm betting that unsuspecting audiences are the only ones for which this would have a hope in hell of working.

Because what do you suppose happens when you tell a bunch of students that software will be monitoring, analyzing, and reporting on their facial expressions to the teacher? What happens when we tell a student, "The computer will be watching you the whole time to see if you understand." I am willing to bet that close-to-zero students respond, "Fine. I'll just carry on and behave as if the computer was not watching my every move."

I remember performing a simple experiment back when I was in high school. We decided that one side of the room would look at the teacher, looking engaged, smiling and nodding, while the other side of the room would act bored and disengaged. After a while, the teacher slowly moved over to work the side of the room that was giving her positive vibes. I can't imagine what fun students could have trying to mess with software. Actually, I can imagine some of it-- fake faces, playing Stump the Software, trying to manipulate the speed or direction of the class.

And that's still better than the other scenario I can imagine, which is the one in which the use of this software gives students blanket permission to be inert lumps. No need to be active and ask questions or join in discussion-- just passively let the software decide what the student is thinking, and she doesn't have to actually communicate anything.

The guys from Stanford working on a similar program suggest that it might be useful for students to see the readouts for the whole class, because... why? For students trying to game the system, that would be a great piece of real-time feedback. But otherwise, do students really need one more way to check and see if they're "normal"?

I have a hard time spotting the upside to any of this, other than I suppose it could help instructors who lack skills in dealing with carbon-based life forms. But mostly it seems intrusive and creepy and enabling of all sorts of poor behavior. And, as is too often the case with ed tech, it appears to be in the hands of people who really haven't thought through even the most basic implications. Some folks get it:

George Siemens, executive director of LINK Research Lab at the University of Texas at Arlington, applauds the institution for its focus on students' emotions, but he says he doesn't see why technology is necessary to perform a task at which humans are intrinsically more capable.

"I think we’re solving a problem the wrong way," Siemens said. "Student engagement requires greater human involvement, not greater technology involvement."

But other folks-- the folks pioneering this stuff-- don't get a lot of things:

Instructors won’t be able to see individual students’ emotions, either in real time or after the fact -- heading off any immediate complaints that the technology is invasive on a personal level.

Nope. The technology is still invasive on the personal level. Just because the instructor isn't seeing that level of response (other than by, you know, looking and using her human brain) doesn't mean the software isn't collecting it. Still invasive.

Tornoe and his colleagues haven’t yet decided how much students will know about sentiment analysis before they're subjected to it; university administrators have final say over that decision, according to a spokesperson.

Nope. This is not even a question. Failing to tell students that everything down to their facial expressions is being monitored and potentially recorded-- that's just flat out wrong. The fact that Tornoe and his colleagues even think there's a decision to be made shows how far divorced they are from the reality of what they're proposing. This is all kinds of a bad idea.


  1. And what about the people who have RBF (as my teen daughter has informed me....this is resting _ITCH face). What will be read on their faces?

  2. And how will it read students smiling, because they are on their phones?

    It's a joke.

    Actually useful, although still intrusive, would be software that scans to catch students passing notes, on their phones, talking when the teacher is talking etc. Physical behaviour monitoring rather than emotional.

  3. Just another weapon for use in the War on Teachers.

    I can just imagine how the conversation would go with a data-obsessed, hack administrator: "Mr. X, according to my software, only 6 out of 34 students were engaged in your lesson...ineffective!"

    Jolly good show.

  4. The parade of ed-tech failures seems endless. This may rank as the most stupid idea to ever escape the human mind.