I wanted it to be clever.
I wanted it to be surprising, enticing, well, at least a little bit human.
After all, AI companies are always telling us how much better their creations are than their human equivalents.
So when McDonald's revealed it was testing the idea of replacing humans at the drive-thru with robots, I was filled with cautious optimism.
Would customers be greeted with a surprisingly chirpy voice, redolent of a young person who really enjoys high school?
None of that. You can watch the clip here. The AI sounds like nothing so much as HAL 9000's sister; it is not a voice you would ever, ever mistake for a human.
But does it work? Well, McDonald's CEO noted that staff would have to be retrained not to do their jobs, because they kept interrupting Discount Siri to try to help. The humans can't all be fired yet, though: even working from a limited menu, the system is only about 85% accurate.
It also, apparently, gets the company sued. One customer has sued for a violation of Illinois's Biometric Information Privacy Act (BIPA). Passed way back in 2008, BIPA says you can't record information like voiceprints, facial features, or fingerprints without getting permission first. The AI ordering system records the customer's voice in order to be sure it gets the order right.
Well, not just to get the order right. The voice recording, according to the lawsuit, is collected "to be able to correctly interpret customer orders and identify repeat customers to provide a tailored experience." That fits, because the personalization company McDonald's bought is in the business of making AI menu boards "that can change the offerings based on your personal ordering history, the weather, and trending menu offerings."
Just imagine this model applied to a classroom, complete with less-than-100% accuracy, a massive violation of privacy, and the collection of all that data that can be so valuable to a company. One more batch of reasons why classroom AI is a terrible idea.