Saturday, February 24, 2018

Bogus Measures of Learning

There may be more ridiculous ways to measure education than "days of learning," but this bogus measure remains popular, particularly among charter cheerleaders.

CREDO studies often attribute, say, an extra 26 days of reading to charter programs. What the heck does that even mean? Which 26 days would that be? 26 days in September, because noticeably less learning gets done in that first month of school? Is it 26 Wednesdays? Mondays? Fridays? Because each of those days looks a little different in my classroom. And is that a day of First Grade or Tenth Grade? Is that a day for some sort of standardized student, or an average student? Do we correct for distractions, like a day on which a student is upset about some problem at home? How do we arrive at that metric for a single day-- take the gains that somebody somewhere says students are supposed to make from one year's test to the next and divide it by 180? Because, of course, there are some days on which no learning takes place at all (for instance, the days we spend taking that Big Standardized Test). Can we keep breaking this down-- can I talk about hours of learning or minutes of learning? Seconds of learning?

Most importantly, has anybody ever provided any validation of this kind of measure at all?

The answer, of course, is no. If you want the real researcher's explanation of what that means, I recommend this piece by Mark Weber (Jersey Jazzman), which as always makes the complicated sciency stuff clear.

So why should you call bullshit when somebody starts throwing around "days of learning"?

Well, first, per Weber's piece:

The consistently small effect sizes have been pumped up by an unvalidated conversion into "days of learning" which has never been properly justified by the authors.

In other words, "days of learning" is a way of making a tiny little effect look like a big one, like saying you doubled my pay when you raised me from $0.10 an hour to $0.20. I recommend Weber's piece for a clearer sense of how much nothing is being converted to the illusion of something. But it's also worth noting that besides inflating the size of the effects, the "days of learning" dodge allows us to skip right past the question of whether the Big Standardized Tests are a valid measure of anything at all.

But there's another problem. The "days of learning" framing plays straight into the engineering model of ed reform, as laid out by semi-repentant reformster Larry Berger (CEO, Amplify):

You start with a map of all the things that kids need to learn.

Then you measure the kids so that you can place each kid on the map in just the spot where they know everything behind them, and in front of them is what they should learn next.

Then you assemble a vast library of learning objects and ask an algorithm to sort through it to find the optimal learning object for each kid at that particular moment.

Then you make each kid use the learning object.

Then you measure the kids again. If they have learned what you wanted them to learn, you move them to the next place on the map. If they didn't learn it, you try something simpler.
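
Stripped of the edu-speak, that model is just a loop. Here's a toy sketch of it in Python; every name, number, and "skill" in it is mine, invented purely for illustration, and it is nobody's actual adaptive-learning product:

```python
import random

# Toy sketch of the "engineering model" loop described above.
# The skill map, the fake "measurement," and the odds are all invented
# for illustration; this is not Amplify's (or anyone's) real system.

SKILL_MAP = ["counting", "addition", "subtraction", "multiplication"]

def measure(student, skill):
    """Pretend assessment: did this standardized student 'learn' the skill?"""
    return random.random() < student["odds_of_learning"]

def engineering_model(student):
    position = 0                                  # place the kid on the map
    while position < len(SKILL_MAP):
        skill = SKILL_MAP[position]
        lesson = "optimal"                        # the algorithm's pick of learning object
        while True:
            print(f"{student['name']} gets the {lesson} {skill} lesson")  # make the kid use it
            if measure(student, skill):           # measure the kid again
                position += 1                     # learned it: next place on the map
                break
            lesson = "simpler"                    # didn't learn it: try something simpler

engineering_model({"name": "Standardized Kid", "odds_of_learning": 0.7})
```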

"Days of learning" assumes that education is like tofu-- any slice is like any other slice. It assumes that education is a steady journey down a fixed track in a constant-speed train-- X number of hours traveled always equals Y number of feet traversed. It's a comforting enough model and gives the technocrats of ed reform the comforting feeling that everything about education easily measurable and therefor technically manageable. The only problem with the model is that it doesn't have anything to do with how actual live humans function.

Live humans progress in fits and starts, plateaus and step climbs. Live humans travel on a million different paths to a million different destinations. Trying to talk about how many days of learning you gained in your education program is like trying to talk about how many ounces of love you added to your marriage or how many tubs of anger you emptied out of your basement. It's like trying to use a yardstick to measure history.

In other words, when you hear someone talking seriously about "days of learning," you can be sure you're listening to someone who doesn't know what the hell they're talking about.


4 comments:

  1. The attempt to quantify and compare teaching quality or effectiveness using standardized tests has always been a fool's errand. It's no more ridiculous than claiming I can compare the quality of my Caribbean vacation with my friend's European vacation using . . . thermometers! ("Hey Dave, my temperature readings in Aruba proved that I had 26 more minutes of fun per day than you did in France!"). The reformsters' laundry list of ignored false assumptions is long enough to invalidate every conclusion they have ever made. The obsession with putting a number on the immeasurable has made the last 18 years of test-threaten-and-punish reform just one very expensive exercise in futility. Sad.

  2. None of these people have ever met kids.

    1. I think their problem is that they think that the 15 minutes they spent having an engaging conversation with their niece on Thanksgiving is all they need to understand children as students in schools. Heck, even administrators and counsellors have this problem. It is their absolute ignorance regarding group dynamics, the grind of the 180-day school year, and familiarity fatigue that will always be their undoing.

  3. Can we extend this to the learning efficacy of “block” scheduling?
