Tuesday, December 20, 2022

Today In Surveillance State News...

Kelly Conlon wanted to go see the Rockettes with her daughter and the rest of her daughter's Girl Scout troop. She did not get to. All of the following was reported by CNBC.

She did not get to, because Madison Square Garden Entertainment recognized her via facial recognition software scanning, apparently, everyone passing through security. She was flagged because she is an attorney at a firm that is working on a lawsuit against a restaurant currently under the MSG umbrella. They just loaded the names and faces of all the folks at that firm into their facial recognition software, and then barred them from entering any of their properties. Note that she does not work on their actual case or practice in their actual city--she just works at the same firm.
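CNBC's reporting doesn't describe the internals of MSG's system, but the basic mechanics of a face-recognition blocklist like this are not exotic. Here's a minimal Python sketch, assuming the common approach of comparing face embeddings by cosine similarity--the names, vectors, and threshold below are all made up for illustration:

```python
import numpy as np

# Hypothetical blocklist: names mapped to face-embedding vectors that a
# vendor's model might have produced from, say, law-firm headshots.
BLOCKLIST = {
    "Attorney A": np.array([0.12, 0.85, -0.33, 0.41]),
    "Attorney B": np.array([-0.52, 0.14, 0.68, -0.09]),
}

MATCH_THRESHOLD = 0.9  # cosine-similarity cutoff; real systems tune this


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def check_visitor(embedding: np.ndarray) -> str | None:
    """Return a blocklisted name if the camera's embedding matches anyone."""
    for name, listed in BLOCKLIST.items():
        if cosine_similarity(embedding, listed) >= MATCH_THRESHOLD:
            return name
    return None


# Simulate a camera frame at the security gate producing an embedding
# close to Attorney A's stored vector.
frame_embedding = np.array([0.13, 0.84, -0.30, 0.40])
hit = check_visitor(frame_embedding)
print(f"Deny entry: {hit}" if hit else "Admit")
```

The point of the sketch is how little it takes: once you have a list of faces and a camera at the door, barring anyone on that list from every property you own is a for-loop.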

MSG says, hey, it's just policy that nobody involved in a case against them can be on any of their properties. A partner at Conlon's firm says, “This whole scheme is a pretext for doing collective punishment on adversaries who would dare sue MSG in their multi-billion dollar network.” And a court has reportedly already decided this issue in a case brought by a different blacklisted firm, making it clear that MSG can't do this.

The possibilities here are endless. Imagine if you just loaded the information of every LGBTQ person you could find to keep them from entering your building and thereby infringing on your right to freely exercise your religion (by, I don't know, filling the air you breathe with LGBTQ cooties).

And let's put this together with Kansas City schools, where the board is contemplating spending its COVID relief funds on putting a camera in every classroom. This is a district with a teacher shortage, where multiple classes are taught by distance teachers over livestream, which is the story they're using here--we just want to tape lessons so that we can play them back in classrooms that don't have teachers.

Which is its own kind of nuts. My first question is, will teachers whose recorded lessons are used be paid some sort of royalty for the use of their likeness and recorded work? 

Teachers feel disrespected and students feel policed. Because, well, they are.

“A lot of us, maybe a lot of us minorities because we come from Black and Mexican households, we’re going to feel like, even though they’re telling us this is to learn, they’re actually trying to watch us. They’re trying to monitor our behavior,” said Damarias Mireles, a 2020 Wyandotte High School graduate. “So it kind of adds to that stigma, even if it’s not the intention.”

While Stubblefield, a district official, said surveillance could be a “byproduct” of having the cameras, she said the purpose would be for learning. She said video footage is currently only reviewed when a specific incident is reported.

Does byproduct surveillance feel less intrusive than surveillance that is the primary objective? Is it reassuring that the school says the footage is only reviewed when the administration feels the need?

Place your bets now on how long it will be until some authority shows up at school saying, "We have some footage of a teenaged suspect, and we'd like to run your videos through some facial recognition software to see if we can find the suspect in one of your classes." How long until somebody says, "You know, as long as we've got this video feed going, let's attach it to one of those cool software programs that assesses potential threats by measuring eyebrow twitches."

How long until someone says, "Heck, let's just use facial recognition to run all outstanding warrants or people on our Suspicious Person list against everyone who sets foot in the school--not just students, because we might catch a miscreant picking up their kids at school." How long until some kid ends up in serious trouble only because the facial recognition software screws up?
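That last worry is just base-rate arithmetic. A quick sketch with entirely made-up numbers shows how a matcher that sounds accurate, run against everyone who sets foot in a building, produces alerts that are mostly wrong:

```python
# Illustrative base-rate arithmetic with hypothetical numbers: even a
# 1-in-1,000 false-match rate buries the rare genuine hit in noise.
daily_scans = 2_000        # everyone entering a school on a given day
actual_matches = 1         # genuine watchlist hits among them
false_match_rate = 0.001   # software falsely flags 1 in 1,000 innocents

false_alarms = (daily_scans - actual_matches) * false_match_rate
total_alerts = false_alarms + actual_matches  # assume the real hit is caught
print(f"False alarms per day: {false_alarms:.1f}")
print(f"Share of alerts that are wrong: {false_alarms / total_alerts:.0%}")
```

With these numbers, roughly two out of every three alerts point at an innocent kid--and that's granting the software generous accuracy.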

It is easy to dismiss this kind of thing with sentiments like "Yeah, now parents will see their kids messing around in class," but the sheer power of surveillance software--the many, many things that can be done once your privacy is violated--is too scary. And how far does this road stretch? "Sorry, but we can't offer you the job. Facial recognition connected you to some shenanigans at school when you were 15."

The security team at MSG knew Conlon's name, where she works, and presumably everything else connected to her record, and they made choices about her life and ability to move freely based on what the facial recognition software kicked up. I shudder to think what this could do in the wrong hands, and I struggle to imagine what the right hands could even be.

