Monday, August 17, 2020

Silicon Valley and the Surveillance State

Peter Schwartz is an American futurist, innovator, author, and co-founder of the Global Business Network, a corporate strategy firm. He's done sexy things like consult for futury movies, including WarGames (ew), Minority Report, and Sneakers (an under-appreciated gem). He's written an assortment of books; he also wrote the 2004 climate change report that predicted that England would be a frozen wasteland by, well, right now. (This Peter Schwartz should not be confused with the other Peter Schwartz, the Ayn Rand-loving writer.)

Schwartz was the subject of an interview in yesterday's San Francisco Chronicle, reminding us that there's an entire sector of future-looking tech-loving folks who think the advent of the surveillance state is pretty swell.

Schwartz is not in Silicon Valley-- he's a Beverly Hills guy. And not everything he says is alarming. For instance:

Every single time, with no exceptions, that I’ve gotten the future wrong, it’s because there was an inadequate diversity of people in the room. It was not that it couldn’t be seen; it was that we were just talking to ourselves.

Technocrats desperately need to hear that, but the prevailing ethos is the idea of a single visionary CEO without other voices to hold him back. As in Zuckerberg's unwillingness to let go of control of his company or his money, or Reed Hastings' belief that school boards should be scrapped because they just get in the Visionary Leader's way.

But then the interviewer asks how our feelings about surveillance are "evolving," and, well, Schwartz doesn't dig very deep.

There will be times when it’s abused, when data is stolen, when people are harmed by it. But for 99% of the people, 99% of the time, it will mean that you didn’t have to show your ticket to get on BART; it means you didn’t have to check out at the supermarket; it means that when somebody stole your kid’s bike, it will have been seen. Oh, and that unhealthy people will be detected before I get on the airplane.

There's no question that folks have shown that they are more than willing to fork over huge amounts of personal data for a smidgen of convenience. Hell, people still insist on giving away tons of data just so they can take a "Which kind of exotic cheese are you" quiz on Facebook. But look at how quickly he skips past the downside, characterizing it as the occasional bad actor rather than a dystopic system of surveillance and control. And then we arrive at an article of misplaced faith.

We’re now in a global village where the truth is everything can be known about everybody.

The truth is that we can collect a great deal of data and factoids about anyone, but that's not everything. This is like believing that if you know your spouse's height, weight, shoe size, favorite color, previous addresses, well, you know everything you can (or need to) know. This is exactly like believing that if you have collected a bunch of standardized test scores from a student, you know that student.

If we could just collect all the observable, quantifiable data, we would know everything about everything. So let us collect it all. Because it's going to happen anyway.

That's the Silicon Valley ethic, and it's wrong on several levels.

First, collecting all the data doesn't make one all-knowing. I'm not just talking about the whole "difference between knowledge and wisdom" thing, or pointing at romantic odes to human complexity and depth (though those things are true, too). Read up on information theory and chaos science--complex systems defy specific, linear predictability. It doesn't matter how many facts you collect--you still can't predict exactly what comes next.
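
To see what chaos science means by that, here's a tiny sketch (my illustration in Python, with made-up starting numbers, not anything from Schwartz) of the logistic map, a textbook chaotic system. Two starting points that differ by one part in a billion end up in completely different places after a few dozen steps--perfect measurement of the present still doesn't buy you the future.

    # The logistic map x -> r * x * (1 - x) is chaotic at r = 4.0.
    def logistic(x, r=4.0):
        return r * x * (1 - x)

    # Two trajectories that start one billionth apart.
    a, b = 0.400000000, 0.400000001
    for step in range(1, 51):
        a, b = logistic(a), logistic(b)
        if step % 10 == 0:
            print(f"step {step:2d}: a={a:.6f}  b={b:.6f}  gap={abs(a - b):.6f}")

By around step 30, the gap between the two trajectories is as big as the whole range of the system. That's the point: more decimal places of data just delay the divergence; they never eliminate it.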

Second, get your paws off our data. Better yet, if you want it, pay for it. If we're imagining our favorite futures, I'd like to imagine one in which customers don't pay for the privilege of being data mined. It's not just that data mining is invasive and obnoxious--the current practitioners are still really bad at it, and feeding badly-harvested data into systems yields bad results.

Third, it's not inevitable. Tech folks--especially ed tech folks--invariably present sales pitches in the guise of future predictions. They are wrong, a lot, in part because they are the techno version of the used car salesman saying, "I can just see you driving this baby out of here." No. You hope to see that, but right now, it's just a sales pitch.

The surveillance state won't be a happy utopia occasionally interrupted by the blip of isolated bad actors. The big use of data is to help mold and direct the behavior of the masses, and the two big motivators for that kind of nudging are 1) the desire to make money and 2) the desire to acquire political power, and we've already seen both in action.

The surveillance state will continue to come after schools, because how else do you gather All The Data except by starting early? Schools are easily seduced partners because too often some folks in charge (and some, sadly, in the classroom) are attracted to the idea that they could get so much more done if they had more data, more control (this is true for public, charter, and private schools). One of the decisions that educational institutions must make, late as it is in the game, is whether they want to help the data miners or become protectors of student data.

They should be protectors. When buying a program, they should require that everything collected will stay within the district's system, to be easily scrubbed when the students leave or graduate. When subscribing to an online service, they should demand ironclad assurances that student data will not be shared (not even with "trusted partners") or kept after the student graduates. Schools are where the foundation of the surveillance state will be laid; schools should be actively and deliberately making sure that foundation doesn't get built.

