Yes, every time you take a survey on Facebook, you open up access to your data. The range of reaction emojis helps Facebook track your mood and emotional reactions more accurately, making its data more detailed. Use Gmail? Google reads your emails to better target you.
Now, imagine this same stuff applied to school. Imagine that school is redesigned so that every skill is developed and measured via computer, and so every data point is stored and added to a massive digital dossier on each child. Imagine that the school expands its curriculum to include social and emotional education, also managed by computer, so that the dossier stores information about what sort of person the student is.
The implications go beyond advertising. What would corporations pay to be able to say, "We need to hire ten left-handed white men who are good with simple computations, have good reading comprehension skills, and are very emotionally stable without any tendencies to challenge authority. Oh, and if they could be without any markers for possible major illness, that would be great. Send us a list." And, of course, the government could find ways to use this stuff as well. More efficient education ("Pat, your data so far indicates that you will be entering Sixth Grade for Plumbers next year") and advanced safety for communities ("Station a cop by Pat's apartment every day-- his data shows he's likely to blow up soon").
Over the past decade, we have adjusted to a new normal when it comes to privacy. The trade is not without appeal-- for a little less privacy, we get better service. Facebook doesn't show me ads for feminine hygiene products or recommend news stories from the Far Right. We give up some privacy to get more ready access to things we want. And we give it up in ways that are not obvious, so that we can remain pleasantly unaware of just how much privacy we are sacrificing. Big Brother, it turns out, is pretty warm and fuzzy and comforting.
But there are still places where we expect privacy to remain unbroken. If we logged on to Facebook and found our child's reading and behavior problems being discussed by our child's teacher, or if we found our doctor publicly laying out our health issues, we would be outraged-- and rightly so. I have my students write one-draft essays about personal topics-- not, as I tell them, because I want to know about their personal lives, but because it's a topic on which they are already experts. But because the topics are personal, I promise them that I will never show those essays to anyone.
But if I were requiring them to write those personal essays as, say, a Google doc, I don't think I could make or keep that promise.
Aren't there rules and laws that protect student privacy? Well, there used to be. The Family Educational Rights and Privacy Act was passed in 1974, but in 2008 and 2011, it was re-written by the USED to broaden the list of people with whom school data could be shared. And they aren't done-- right now, the Data Quality Campaign and a laundry list of reformy researchers are calling for a further expansion of the holes in the FERPA privacy shield. The call, as is often the case, is in the name of research-- which is a hugely broad term. "Can I use this data to figure out which students will make the best targets for advertising with a bandwagon approach?" is a research question. This list of four specific areas includes using data across the education and workforce pipeline, a concerning approach indeed. A call for looking at better capacity and security makes a certain amount of sense, now that school districts are recognized by hackers as soft targets. But it takes only a little bit of cynicism and paranoia to see it as a call for more foxes to perform tests on henhouse security.
These are not issues with simple solutions. Well, "nobody ever use any computers for anything ever again" or "take down the internet" are simple solutions-- just not plausible ones. We live in an age of technological miracles, and there is no going back. Nor would I necessarily want to. But I'm not ready to jump heedlessly into the Surveillance Society, either.
We need to make thoughtful choices. I teach at a 1-to-1 school; all of my students have school-issued computers, and I would never go back-- but I also don't make those computers the center of my classroom or instruction. And you're reading my blog, which is housed on a Google-owned platform and which I promote on Facebook and Twitter. I'm guessing my digital dossier knows a thing or two about me.
I use technology, and I pay a price for it, and as with any ongoing shopping spree, I work to pay attention to how large the bill is getting. I use tech tools myself, and I use them with my students, and I make sure that they don't use us. (The correct approach is "Here's what I want to teach. Are there any tech tools that would help accomplish that?" and never, ever, "Here's a cool tech tool-- how can I build a whole lesson around it?")
But there are levels beyond my control, and when I see things like another FERPA-weakening attack, I am beyond concerned. And if my school district were to jump onto the computer-centered competency-based personalized-learning bandwagon, I would take a vocal stand against it.
This is yet another area of education where you have to pay attention, do your homework, and pay attention some more. The new FERPA push is coming because Congress will be re-introducing the Student Privacy Protection Act, an oxymoronic title for an act that is about reducing privacy protections under FERPA. This peacekeeper missile of privacy is aimed at our children, but it's just arcane and obscure enough that most Americans will sleep right through this whole business. Now would be a good time to renew the effort to wake them up and explain the fuss.
Maybe it's because I don't really know what FERPA says to begin with, but it's hard to see from reading the summary that this new act reduces privacy.
There's a sci-fi series written by Aaron Pogue called "Ghost Targets" that depicts a near-future America (and world) where a new form of relational database named Hathor has been built to handle all our information. The blurb for the series runs like this:
ReplyDelete"We abandoned privacy and turned databases into something like gods. They listened to our prayers. They met our needs and blessed us with new riches. They watched over us, protected us, and punished the wicked. We almost made a paradise."
It's basically a universal surveillance state where the databases provide everything tailored to your needs in seconds, but privacy is a thing of the past unless you can "ghost" and escape the surveillance, which is a crime of its own.
Peter, are you familiar with http://www.digitalpedagogylab.com? They have some excellent articles along these lines, especially related to turnitin.com.