Yesterday the Atlantic published an exceptionally helpful piece in the Science section by Robinson Meyer and Alexis C. Madrigal that offers some excellent explanation of why the nation has dropped the data ball for this pandemic. It's a good read from that perspective. But for education folks, there's more.
In the body of the article, Meyer and Madrigal share some observations about data, and the problems with data-driven anything; these points are important, and should be emblazoned on the office door of every data-driven follow-the-science policy maker and administrator in the country.
1. All data are created; data never simply exist.

Before March 2020, the country had no shortage of pandemic-preparation plans. Many stressed the importance of data-driven decision making. Yet these plans largely assumed that detailed and reliable data would simply … exist. They were less concerned with how those data would actually be made.
[Image: Here come the data]
But in our high-stakes testing era, that has not happened (nor is it happening now). When the state says, "22% of your students are below basic in reading non-fiction," that's not a figure that descended from heaven in a burning memo. It's a number that was created, and everyone ought to be asking how it was created. It starts with a faulty instrument; the raw score is converted to a reported score somehow; then the results are sorted by cut scores that are determined only after the test has been scored. Those are just a few of the ways this can go wrong.
And right now, when folks are hollering that students have lost 57 days of learning during the pandemic, everyone should be asking how those data were created (spoiler alert: they were totally made up).
2. Data are a photograph, not a window.
This one most people in education get, sort of. That the Big Standardized Test "is a snapshot of one particular moment" is a well-worn cliché, even among people who will then go on to argue that, for some reason, the snapshot should carry far more weight than all the other moments that didn't make it into the photo.
3. Data are just another type of information.
There is some great, poster-ready, put-it-on-a-t-shirt stuff in this section.
Data seem to have a preeminent claim on truth. Policy makers boast about data-driven decision making, and vow to “follow the science.” But we’ve spent a year elbow-deep in data. Trust us: Data are really nothing special.
Meyer and Madrigal offer my new favorite definition of data:
Data are just a bunch of qualitative conclusions arranged in a countable way.
And add to that this important note:
Data-driven thinking isn’t necessarily more accurate than other forms of reasoning, and if you do not understand how data are made, their seams and scars, they might even be more likely to mislead you.
Meyer and Madrigal lay out some pandemic examples of when the data contradicted what scientists "knew" through other reasoning, based on their own expertise. In those times of contradiction, it was the data that were wrong. Teachers, of course, are regularly told in so many ways that their own assessments of students mean nothing when set beside the test-based data reports.
Would you like a nice analogy to wrap all this up?
Data are alluring. Looking at a chart or a spreadsheet, you might feel omniscient, like a sorcerer peering into a crystal ball. But the truth is that you’re much closer to a sanitation worker watching city sewers empty into a wastewater-treatment plant. Sure, you might learn over time which sewers are particularly smelly and which ones reach the plant before the others—but you shouldn’t delude yourself about what’s in the water.
Education has been overrun by the Cult of Data, and it's not unusual to feel intimidated by it. But I'll reiterate that I pulled these ideas about data from an article nominally about systemic failures in the federal response to a massive pandemic. Data are not magic, and educators should not bow at the data altar.