Let me remind you why you can still afford to be unimpressed.
First, even if you do not have firsthand experience with NWEA's MAP test (I do--it's a lousy computer-delivered multiple choice test), you should always remember that one of their attempts to make some edu-bucks centers on a program they promised could read minds by measuring how long it takes students to pick a multiple choice answer.
Second, you know that this "report" is baloney because it leans on that great imaginary measure, the "days of learning." Students during the pandemic will "lose" X number of days of learning. "Days of learning" is actually a measure that CREDO made up themselves, based on some "research" in a 2012 paper by Erik Hanushek, Paul Peterson, and Ludger Woessmann. And if "days of learning" seems like a bizarro world way to measure education (Which days? Days in September? Days in March? Tuesdays? Instructional days, or testing days, or that day we spent the afternoon in a boring assembly? And how does one measure the amount of learning in a day, anyway?)--well, here's the technical explanation from CREDO's report:
To create this benchmark, CREDO adopted the assumption put forth by Hanushek, Peterson, and Woessman (2012) that “[o]n most measures of student performance, student growth is typically about 1 full standard deviation on standardized tests between 4th and 8th grade, or about 25 percent of a standard deviation from one grade to the next.” Therefore, assuming an average school year includes 180 days of schooling, each day of schooling represents approximately 0.0013 standard deviations of student growth.
In other words, "days of learning" is really just one more way to whomp on student Big Standardized Test scores with some fancy maths.
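And if you want to see just how little machinery is under the hood, here is the entire "days of learning" apparatus as a few lines of Python. This is my own toy reconstruction built from the quoted assumptions, not CREDO's actual code, and the 0.1 SD gap at the end is a made-up example:

    SD_PER_GRADE = 0.25        # Hanushek/Peterson/Woessmann assumption: ~0.25 SD of growth per grade
    DAYS_PER_YEAR = 180        # assumed length of a school year
    SD_PER_DAY = SD_PER_GRADE / DAYS_PER_YEAR   # ~0.00139 SD per day (CREDO rounds to 0.0013)

    def days_of_learning(effect_size_sd):
        """Relabel a test-score gap (in standard deviations) as 'days of learning'."""
        return effect_size_sd / SD_PER_DAY

    # A 0.1 SD gap in test scores becomes "72 days of learning"--same number, fancier wrapper.
    print(f"{days_of_learning(0.1):.0f} days")   # -> 72 days

That's the whole trick: take a test-score gap, divide by 0.0013, and announce the result in "days."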
Third, NWEA's "report" didn't measure a damn thing. They guessed. They took some results from their own testing product, the above-mentioned MAP, squeezed out some estimates of how much scores slip over the summer, and then they just, well, guessed. They hid that guesswork behind some fancy argle bargle like this:
To provide preliminary estimates of the potential impacts of the extended pause of academic instruction during the coronavirus crisis, we leverage research on summer loss and use a national sample of over five million students in grades 3–8 who took MAP® Growth™ assessments in 2017–2018.
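For the record, nobody followed actual students through the shutdown to produce those "preliminary estimates." The move is, at bottom, to take a summer-slide rate and stretch it across a longer calendar. Here's a deliberately naive sketch of what that kind of projection amounts to--the loss rate and closure length below are hypothetical numbers of mine, not NWEA's:

    # Naive projection of a "COVID slide" from summer-loss research: assume
    # scores slip at the summer rate and scale to a longer closure.
    # All numbers here are hypothetical, for illustration only.
    SUMMER_LOSS_SD = 0.10      # hypothetical: typical summer score drop, in SDs
    SUMMER_WEEKS = 10          # length of a typical summer break
    CLOSURE_WEEKS = 26         # hypothetical six-month pause in schooling

    projected_loss_sd = SUMMER_LOSS_SD * (CLOSURE_WEEKS / SUMMER_WEEKS)
    print(f"Projected 'loss': {projected_loss_sd:.2f} SD")   # -> 0.26 SD, by construction

Dress it up however you like; the output is just the input assumptions, scaled.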
Some ed writers at the time called this an analysis of "student achievement and growth data." It wasn't. It was a guess. And really, who can blame them--we were and are in unprecedented times, so guessing was all anyone could do. The problems come when you start trying to dress your guesses up in hard science, and pile that on top of the continued fallacy that student scores on a narrow two-subject multiple choice test are somehow a good proxy for "student achievement."
CREDO's endorsement of this non-research doesn't help; from them we get language about "scientifically grounded estimates," which certainly sounds better than "wild ass guess by someone in a lab coat."
Fourth, while I totally get the general panic over covid downtime, the truth is, nobody is "behind," because the line that students must cross by a certain moment or else, apparently, Very Bad Things will happen--that's an imaginary line. You can tell it's imaginary because we move it whenever it suits us for policy reasons, like a decade or so ago when we suddenly decided that kindergartners who couldn't read, write and compute were doomed to a life of eating cat food off a hot plate while living in a van down by the river. Remember--we are not talking about education; we are talking about test scores, and there is every reason to believe that raising those test scores has zero effect on a child's life prospects.
Likewise, the repeated reference to learning "loss"--what does that even mean? How does learning get "lost," and how do you measure the lostedness? Is it like the time I lost my car keys? Yes, I know there are things I learned in Algebra II that I can no longer call up in my brain, but is that "loss," exactly, or just a natural mental process, like the way I lost my knowledge of what I had for lunch on September 3? Now tell me how you measure any of that with a straight face.
So why has CREDO decided to throw its weight behind this baloney? Well, the testing industry is in a bit of a stir right now. The BS Test was canceled last spring, and nobody is very excited about bringing it back this year, either. So the testing industry and their reformy friends are trying to sell the notion that students and schools and teachers are adrift right now, and the only way anyone will know how students are doing is to break out the industry's products and start generating some revenue, er, data. Here's a line from the CREDO release:
The analysis emboldens policymakers and educators to employ diagnostic and progress assessments to inform how well efforts work and to help students and communities recover and move forward.
In other words, this NWEA "report" should help sell some tests.
Look, there is no question at all that taking a six month break from school has thrown a spanner into the scholastic gears. But as I keep saying-- figuring out where students are, exactly, is what classroom teachers do every single fall. Yes, teachers expect this year's results to fall a little bit outside the usual parameters, but this is the gig--find out where students are, meet them there, bring them forward.
The fact that students' relationship with teachers and school has been strained and is still being strained is all the more reason NOT to start in with the testing right now. But no-- a teacher on Twitter today posted that thanks to having to force students to take the MAP, she got to see a student cry on Zoom.
So here's the bottom line-- when someone feels the urge to bring up the "research" about the Covid Slide, do ask them exactly what that research was, how it was done, and how anyone can determine exactly how many "days of learning" students have lost. Because it is a nothingburger, a new wrapper for the same old test score baloney, and the wrapper is made out of guesswork, not any kind of scientific measurement.