Thursday, May 11, 2023

The Pandemic and The Testing Red Herring

The pandemic exacerbated some of US education's major problems, and that's reflected in the effects on children. Unfortunately, it's also reflected in how we respond to those effects; policies that were a bad idea in "normal" times become terrible ideas in a post-pandemic world.

I have no doubt that learning loss is a real thing, and we need to address it. But I'm afraid that we're throwing our weight behind Learning Loss instead.

Look, we all know there was learning loss, that students did not get the usual amount of learning and growth done during the pandemic, hampered by cobbled-together distance learning set-ups, general stress, and a disconnection from education in general. 

But Learning Loss is something else; it's a slightly panic-stricken doubling down on the Big Standardized Test, the single biggest policy failure in education of the last thirty years. 

The panic is predictable for many reasons, not the least of which is that the pandemic suspended the Big Standardized Test, which threatens the income of lots of folks in the testing industry. 

There are people who really believe in the importance of the BS Test, who have always believed that we can make the pig fatter by weighing it repeatedly. In the economism-dominated world of ed research, we get the regular assertion that testing results are linked to future life outcomes. Well, not exactly. Test results correlate with future life outcomes. This is not a surprise. Test results correlate with the socio-economic background of the students. Future life outcomes also correlate with the socio-economic background of students. 

What's missing--what has always been missing-- is any research showing a relationship between changing test scores and changing life outcomes. We know that if Pat has a high BS Test score, Pat is probably going to have swell life outcomes. What we don't know is this--if Sam was going to get a lousy BS Test score, but the school gets Sam to score higher, will Sam's life have more swell outcomes than it would have otherwise? Probably impossible to prove, but given the fact that we can accurately predict a school's test scores with just demographic data, it's the only question that matters.

Other arguments--like the claim that dropping test scores mean a loss of millions of dollars of income--are based on even flimsier reasoning.

The testocrats have pulled off the neat trick of getting people to debate and clutch pearls about BS Test scores without even knowing they're doing it. They do this by using some mathy prestidigitation to turn test scores into days/weeks/months/years of learning. When test fans Tom Kane and Sean Reardon get space in the New York Times to push the panic button over Learning Loss and print charts showing how many years or months students are "behind," all they're talking about is test scores. When they and others talk about "catching up" and "making up lost time," all they're talking about is getting test scores back up.

This is a lousy focus, a misguided response that uses a made-up crisis to take attention away from a real one.

I'm partway through Anya Kamenetz's new book The Stolen Year, which catalogs the many ways in which students were pummeled, hurt, beat up, deprived, cut off and generally batted about by the pandemic and our responses to it. 

On the long list of things that students need to deal with in the aftermath, both educational and non, "get test scores back up" doesn't even make the top ten.

We've got children being carted into the ER, battered, bruised, and bloody, and a bunch of folks are hollering, "First, we've got to get these kids some clean shirts. And maybe a nice hat." 

I get the whole "it's the only concrete data we have, so what else are we going to use" argument. Pro-test folks have been using this argument forever, just like the guy who searches for his car keys at night under the street light that is 100 feet from his car because that's where the light is best. Is testing data really better than nothing at all? Probably--but only if we approach it as an only-sort-of-accurate tiny slice of a larger picture. In other words, if we discuss the results as scores on a single standardized test given to students who are out of practice at taking standardized tests, and not as some magical measure of the complete state of student learning. 

The current state of learning loss (not Learning Loss) is complicated. Beyond reading and math, there are subjects that took an extra hit, like those that require group work (chorus, band) or hands-on work (the CTE stuff). And we're seeing widely that many students simply lost the knack for (or interest in) "doing school." Every community was hit differently, with some fielding far more trauma than others.

It's the people on the ground who know the most about what the students in their school need, and once again we run into one of the problems of the testocratic approach-- an attitude of "we don't need to talk to people who are there because we have all these numbers we can look at instead." Which is exactly backwards from what students and schools need right now. 

I'm doubtful that we'll prioritize the needs of students or the parents and teachers who work directly with them, given our failure to do so when the pandemic was officially on. But "get those test scores back up" is a red herring, a beside-the-point exercise that will make far too many people feel as if The Problem is being addressed and they can stop worrying about it. Meanwhile, resources will be directed toward some big herring hunt. 


2 comments:

  1. If we are going to split hairs on LL vs ll, then it is worth pointing out that the recent Harvard/Stanford study is largely predicated on MAP and not state tests. So I guess that just makes it a BS test and not a Big BS test, when used in the way those researchers did.

  2. "There are people who really believe in
    the importance of the BS Test."

    and by default . . .

    These same people also believe in the importance of the small handful of brain-deadening Common Core language arts standards that are actually tested--and the validity of the BS Test scores produced by *psychomagicians with their arbitrary cut points and other mysterious machinations that can provide a parent here in NYS with this convoluted ELA score report for Molly:

    Reading: 23 pts./28 pt max
    Writing: 15 pts./16 pt. max
    which converts to an overall
    numerical SCORE of 625/658
    which places Molly at Level 4

    Sample report:
    https://www.nysed.gov/sites/default/files/programs/state-assessment/ela-3-8-score-report-english-2022.pdf

    (*A sub-species of the psychometrician)


    What I found just so astonishing about these true believers, these people of bottomless faith, is that very few of them have ever taken the time to peruse the tests that they hold in such reverence. If only they took the time to pull the curtain back.



