
Tuesday, April 6, 2021

Can You Fool An AI Emotion Reader?

As we have seen numerous times, there are software packages out there that claim the ability to read our emotions. Folks are lined up around the block to use this stuff, but of course one of the promised applications is reading student emotions, the better to "personalize" a lesson.

Does this sound as if the ed tech world is overpromising stuff that it can't actually deliver? Well, now you have a chance to find out. Some scientists have created a website where you can practice having your own face read by software.

The team involved says the point is to raise awareness. People are still stuck on all the huge problems with facial recognition, but meanwhile, we're being surrounded by software that doesn't just recognize your face (maybe) but also reads it (kind of). Here's the project lead, Dr. Alexa Hagerty, from the awesomely-named University of Cambridge Leverhulme Centre for the Future of Intelligence and the Centre for the Study of Existential Risk:

But Hagerty said many people were not aware how common emotion recognition systems were, noting they were employed in situations ranging from job hiring, to customer insight work, airport security, and even education to see if students are engaged or doing their homework.

Such technology, she said, was in use all over the world, from Europe to the US and China. Taigusys, a company that specialises in emotion recognition systems and whose main office is in Shenzhen, says it has used them in settings ranging from care homes to prisons, while according to reports earlier this year, the Indian city of Lucknow is planning to use the technology to spot distress in women as a result of harassment – a move that has met with criticism, including from digital rights organisations.

The Emotion Recognition Sandbox lets you play with the software through your own computer camera. The site assures us that no personal data is collected and that all images stay on your own device. The site offers two games. One is only sort of a game, really a quiz that drives home the point that one thing the software can't do is use context to decipher whether a human just winked or blinked.

But in the other, the Fake Smile Game, you pull up your own camera and try to "register" all six basic emotions: happiness, sadness, fear, surprise, disgust, and anger.
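For what it's worth, the "stays on your device" claim is plausible, because this kind of expression classifier can run entirely in the browser. Here's a rough sketch of the general technique using the open-source face-api.js library (my choice purely for illustration; I have no idea what the sandbox itself runs):

```typescript
// Sketch only: illustrates in-browser expression scoring with face-api.js,
// which is how a site can honestly claim your images never leave your device.
import * as faceapi from 'face-api.js';

async function readMyFace(video: HTMLVideoElement): Promise<void> {
  // Load pretrained models, served as static files alongside the page.
  await faceapi.nets.tinyFaceDetector.loadFromUri('/models');
  await faceapi.nets.faceExpressionNet.loadFromUri('/models');

  // Find one face in the current video frame and score its expression.
  const result = await faceapi
    .detectSingleFace(video, new faceapi.TinyFaceDetectorOptions())
    .withFaceExpressions();

  if (!result) {
    console.log('No face detected');
    return;
  }

  // `expressions` holds a probability per label (happy, sad, angry, fearful,
  // disgusted, surprised, plus neutral). Note what's missing: any notion of
  // context. A sarcastic grin and a genuine smile score exactly the same way.
  const [label, score] = Object.entries(result.expressions)
    .sort((a, b) => b[1] - a[1])[0];
  console.log(`Best guess: ${label} (${Math.round(score * 100)}%)`);
}
```

The model is just matching pixel patterns to labels, which is worth keeping in mind for everything that follows.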

I found it surprisingly difficult; I only got four out of six, missing fear and disgust. I later got disgust by accident when I was doing something better described as "trying to look at my keyboard when my bifocals were askew." 

I can't overstate how bad the software was. It had no filter for sarcasm or for expressions that were obviously (to a human, anyway) fake. I can't swear that they didn't purposely use bad software to make their point, and there's always the possibility that I'm just not British enough for it to work well. But watching that computer try and try, slowly, to decipher my face and mostly fail, I had to wonder how in the world such a thing could, as some have promised, keep up with an entire classroom and give a teacher nuanced, useful, real-time readings of the emotions of the students in the room.

Go take a look and try your hand. Perhaps your face will work better than mine. At any rate, it's astonishing, and not in a good way.

1 comment:

  1. It is amazing how tech companies will spend billions of dollars trying to design software to do something badly that humans naturally do well.
