
Wednesday, November 23, 2022

Boston Globe Offers More Testocrat Cries of Anguish

Lost months of learning! Lost future income!! 

What the heck does that even mean?

The NAEP numbers have been used to manufacture all sorts of panic, and the privatization-loving Boston Globe has piled on, using a few of the more dubious arguments, starting right in by Chicken Little-ing the headline.

Don't click on it. You'll only hit a paywall, anyway. The research referenced is the work of Tom Kane, an economics professor at the Harvard Graduate School of Education, who is interviewed for the piece and takes the opportunity to ring the urgency bell about "helping students catch up."

But let's review two of the most misleading pieces of "research" cited by the 3D crowd (disrupt, defund, dismantle).

Measuring learning in months/days/years is bunk

The analyses show that on average, Massachusetts students lost 75 percent of a school year's worth of math learning and 41 percent of a year of reading.

Any time someone measures learning in days, months, years, seconds, eyeblinks, etc., your bullshit detector should go off like a smoke alarm at a forest fire. Let's think about this for a minute.

75 percent of a school year is roughly 135 days. Which 135 days are we talking about? Because days in September are not quite as learning-filled as days in, say, February. The days coming up between Thanksgiving and Christmas are not exactly well-known in education circles for being learning-rich. Likewise, Fridays and Mondays probably aren't as learningful as a Wednesday. So which 135 days of learning are students short?

Of course, the answer is that days of learning (and months, and years) aren't really a thing. They're a made-up way of talking about test scores. Turns out we're back to the gang at CREDO.

The Learning Policy Institute offers an explanation for days of learning. The short form is that a typical year's growth on a standardized test, divided by 180, equals one day of learning. If you want a fancier explanation, LPI points, via CREDO, to a 2012 paper by Erik Hanushek, Paul Peterson, and Ludger Woessmann:

To create this benchmark, CREDO adopted the assumption put forth by Hanushek, Peterson, and Woessman (2012) that “[o]n most measures of student performance, student growth is typically about 1 full standard deviation on standardized tests between 4th and 8th grade, or about 25 percent of a standard deviation from one grade to the next.” Therefore, assuming an average school year includes 180 days of schooling, each day of schooling represents approximately 0.0013 standard deviations of student growth.

So in the end, we're really talking about test scores. Saying that students "lost" thirty days of learning is just another, more compelling way to say, "Test scores are down this many standard deviations from what we expected them to be."

It's smoke, mirrors and misdirection, because "Your child's bucket of learning is empty" is scarier than "Your kid's score on this one reading and math test went down, boogah boogah." Lost instruction time is a thing. Lost days of learning is not. It's just a drop in test scores.
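If you want to see just how thin that conversion is, here's a minimal sketch in Python. The 0.25 standard deviations per grade and the 180-day year are the assumptions quoted above; the size of the score drop is a number I made up for illustration.

```python
# Minimal sketch of the CREDO-style "days of learning" conversion.
# The 0.25 SD per grade and the 180-day year are the assumptions quoted above;
# the score drop at the bottom is a made-up example.

SD_GROWTH_PER_YEAR = 0.25        # assumed student growth per grade, in standard deviations
SCHOOL_DAYS_PER_YEAR = 180       # assumed length of a school year
SD_PER_DAY = SD_GROWTH_PER_YEAR / SCHOOL_DAYS_PER_YEAR  # ~0.0014 SD per "day of learning"

def days_of_learning(score_change_in_sd: float) -> float:
    """Relabel a test-score change (in standard deviations) as 'days of learning'."""
    return score_change_in_sd / SD_PER_DAY

drop_in_sd = -0.19               # hypothetical drop on the test, in standard deviations
print(f"{days_of_learning(drop_in_sd):.0f} 'days of learning'")  # about -137

# No clocks, no calendars -- just the same test-score change, rescaled to
# sound like lost classroom time.
```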

Of course, if your reaction to dropping test scores is "So what," there's another piece of questionable research to answer that.

Test Scores and Future Earnings

Economist Raj Chetty (it's always an economist) made a big splash with research that tried really, really hard to link test scores to future earnings. The sexy headline version you may recall was that a child who has a great kindergarten teacher will make more money as an adult. There's a full explainer of Chetty here, but whenever you see someone trying to link test scores to future earnings, you're usually looking at the work of Chetty or Erik Hanushek or Tom Kane.

And that's what the Globe used for the "state level analysis performed by the Globe"--a paper by Tom Kane. The paper is not long, but it's thick with layers of nearly impenetrable economist argle bargle. But here, to the best of my ability to slog through the language, is how Kane et al. figure that a drop in test scores means less money as an adult.

They used "the mean [NAEP] score of 8th graders in a state as an estimate of the mean achievement of those born in the state 13 years before," then added scores for missing years by "estimating," then did some mathy things to adjust for race/ethnicity and parental education. Then they used census data by state to compute life outcomes (income, teen birth, arrest for violent crimes). Then they looked at changes in each. Then they pretended that there was some sort of connection between them.

In other words, they said that from 1996 to 2019, Pennsylvania's math NAEP score went up 16 points. From 2001 to 2019, Pennsylvanians' mean income rose $1,600. Therefore, a point on the math part of the NAEP equals $100 in future income.
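Here's that caricature spelled out, using only the Pennsylvania numbers from the paragraph above. To be clear, this is my back-of-the-envelope version of the move, not Kane's actual model, which layers regression adjustments on top.

```python
# Caricature of the "NAEP points -> future dollars" move described above.
# The inputs are the Pennsylvania figures cited in the text, not output
# from Kane's model.

naep_gain_points = 16              # change in PA's mean 8th-grade math NAEP, 1996-2019
income_gain_dollars = 1_600        # change in PA's mean income, 2001-2019

dollars_per_naep_point = income_gain_dollars / naep_gain_points
print(f"${dollars_per_naep_point:.0f} of future income per NAEP point")  # $100

# Divide one trend by another and call the quotient an effect -- nothing here
# rules out the two series moving for completely unrelated reasons.
```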

That's an over-simplification, but not by much, and I'm not sure what you could do to that approach to make it anything other than a ridiculous mess of assuming correlation equals causation. Kane and his co-authors add a bunch of math stuff that's supposed to correct for various factors, but it still looks like fried baloney to me.

“We use these state-level differences in achievement gains on the NAEP along with outcomes by year and state of birth in the American Community Survey to estimate the association between past achievement increases and later-life outcomes.” Rather than NAEP scores, one could just as easily look at average height, or common hair colors, or shoe size. This seems like a good time to link to one of my favorite sites, Spurious Correlations.
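If you'd like a feel for why that's a problem, here's a toy demonstration with entirely invented numbers (none of this comes from NAEP, the Census, or Kane's paper): any two series that merely drift in the same direction will "correlate" beautifully.

```python
# Toy demonstration that two unrelated upward-trending series "correlate."
# All numbers are invented; the shared trend does the work, not the subject matter.
import random

random.seed(0)
years = list(range(1996, 2020))

# Series A: a "test score" that drifts upward with a little noise.
test_scores = [270 + 0.7 * i + random.gauss(0, 1.5) for i in range(len(years))]
# Series B: a "mean income" that also drifts upward, for its own unrelated reasons.
mean_incomes = [30_000 + 650 * i + random.gauss(0, 800) for i in range(len(years))]

def correlation(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx ** 0.5 * vy ** 0.5)

print(f"correlation: {correlation(test_scores, mean_incomes):.2f}")  # very high

# Swap in average height, hair colors coded as numbers, or shoe sizes that
# happen to trend over the same years and you get the same impressive number.
```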

Other highlights

Other research is cited, including a finding that spending more time in remote learning meant less math learning (aka lower test scores), except when it didn't. Thanks. Very helpful.

Kane tries to make a case for saying that "underlying these test scores are concrete skills," which matters because "you just can't skip calculus if you want to be an engineer" and "you just can't skip writing if you want to have almost any professional job," and good lord. The connection between test scores and concrete skills is a leap. The assumption that everyone should be tested as if they want to be an engineer is odd. The assumption that the NAEP (or any other standardized test) is a legitimate measure of writing skill is counter-reality.

There's a quote from Stephen Zrike, a school turnaround "expert" and current superintendent who may have the least silly comment in the whole piece:

We have a moral imperative and responsibility to support young people, not just in their academics. That's critical, but [so is] their ability to engage and enjoy their childhoods.

Which actually gets at the central problem with this kind of fear-mongering. It all boils down to an argument that, out of all the things that children experienced (and experience) during the pandemic, nothing is more important than the drop in standardized test scores, and the bulk of our meager resources must be focused on raising those test scores. 

All of this panic-button hammering is just the anguished screams of testocrats who, for a variety of reasons, would like us all to join them in thinking that nothing schools do is more important than getting students to score higher and higher on the Big Standardized Test (and its corollary--believing that schools are failing because the scores are low).

Late in the piece, the Globe returns to this theme:

The analysis sheds light on the progress Massachusetts students had made on the NAEP over the 27 years prior to the pandemic--around two grade levels' worth of gains in math, or $65 billion in future earnings.

This is utter, fabricated nonsense. It's an attempt to stampede the crowd toward the exit marked "Testing" by hollering "Fire." It's not to be taken seriously. 

2 comments:

  1. How about this old story from my 3 nephews' "elite" district? Drug use is very high. And as the article says, the kids claim it doesn't affect their performance. Is this the elephant in the room we are going to ignore?
    https://sudbury.ma.us/police/2009/07/23/parents-help-sought-in-combating-drug-use-at-lincoln-sudbury-regional-high/
    I wonder what the stats are for the whole state? Do we have to be concerned about this or not?

  2. My husband is living proof that test scores don't matter much when becoming an engineer. High school education, low test scores, Air Force electronics training. Self-taught on the job. Did an engineer's job for 20 years and taught others.
