Monday, February 28, 2022

Study: Test Data Does Not Help Students Raise Test Scores

Today in "Things Teachers Have Been Saying For Twenty Years But Are Now Being Verified By Research," we present Heather Hill, a professor of education at the Harvard Graduate School of Education. 

Hill has been publicizing some of her recent research, which somehow manages to combine the obvious with dubious conclusions. Here she was at EdWeek back in 2020:

Question: What activity is done by most teachers in the United States, but has almost no evidence of effectiveness in raising student test scores?

Answer: Analyzing student assessment data.

This practice arose from a simple logic: To improve student outcomes, teachers should study students’ prior test performance, learn what students struggle with, and then adjust the curriculum or offer students remediation where necessary. By addressing the weaknesses revealed by the test results, overall student achievement would improve.

Oh, look--the sun is rising in the East.

Well, the "simple logic" was never simple nor logical. Or rather, this is what teachers already do with their own testing. What was actually proposed was that students take a poorly designed test, followed by providing teachers with very little data, much of it bad, in order to raise test scores. Teachers knew pretty much immediately that A) this was not going to work and B) wasn't even that great of a goal.

Hill has some thoughts about why using test data hasn't improved anything. They are not great thoughts, and we get the hint in the very next sentence.

Yet understanding students’ weaknesses is only useful if it changes practice.

Hill is not the only researcher to note the "problem" and mis-diagnose the "solution." Hechinger recently talked to Hill and two other researchers who "explained" that "while data is helpful in pinpointing students’ weaknesses, mistakes and gaps, it doesn’t tell teachers what to do about them."

So here are the issues that these researchers have missed.

1) Raising test scores is a lousy educational goal. There is no research to suggest that raising a student's score will improve their life outcomes. Nor is there any research to suggest that the tests are actual measures of educational quality or actual student achievement. This is a good time to recommend, yet again, Daniel Koretz's The Testing Charade. Testing data continues to be exemplified by that story of the drunk searching for his lost car keys not where they were lost, but under a streetlamp because the light is better there. 

2) The tests yield little useful data. Testocrats love to talk about these tests as if they yield all sorts of rich data. They don't. Their validity--aka their ability to measure what they claim they can measure--is unproven. And multiple choice questions are great for machine correction, but not great for measuring any level of deep understanding.

More importantly from the classroom teacher standpoint, the tests are a black box. Teachers are forbidden to see the questions or the answers, and so the data is just a score. In my own classroom, with my own tests, I would operate much as Hill describes--give the test, then break down the wrong answers to see exactly what kinds of mistakes students are making. None of that is possible with the Big Standardized Test--from those I would get things like a single score on "Reading Nonfiction." Test manufacturers have whipped up all sorts of pretty graphs and colored charts, but the data is still meager and thin.

3) I can't just walk past one of the assumptions of this whole approach: that teachers either can't or won't do their jobs, and so some system of carrots and sticks must be devised to get them to do the work they signed up for. Hill paints a picture of teachers who just keep doing the same thing over and over, as if teachers are not motivated or capable of seeking out other techniques and approaches.

Hill is correct in noting that the infamous Data Meetings imposed on teachers by all sorts of data-loving administrators aren't helping. Again, not news. But when you've got bad, thin data that you're supposed to apply toward a pointless goal, what can you expect?

Data-loving testocrats have all along insisted that those darn teachers just don't want to use data properly. But teachers collect, crunch and act on data on a daily basis (though they don't always turn it into numbers and charts). What testocrats seem unwilling to admit, accept, or even see, is that the BS Tests offer little useful data for the process. 

Likewise, the whole "someone should show teachers the better way to teach these things they aren't teaching" always seems to break down when it's time for edu-amateurs to show teachers how to do their jobs better. Hill says that "teachers need to change their approach to address student misunderstandings," as if all teachers use one approach, though she names neither the approach they use nor the one she thinks they should use.

Nevertheless, it appears this earthbound equine will continue to be reflogged. Hill's appearance in Hechinger was prompted by a presentation at the "newly formed" Research Partnership for Professional Learning, yet another group dedicated to fixing teachers. Members include Teaching Lab and TNTP, and sponsors include The Walton Family Foundation and the Bill and Melinda Gates Foundation. So expect more of what we've had for decades--attempts to explain away the failure of beloved high-stakes, data-driven education, blaming everything except the fundamental flaws in the approach.

4 comments:

  1. Dang! Does this mean we all have to take down our Data Walls?

    Replies
    1. No doubt. I'm sure ed policy at the state and local level will follow the science.

  2. Of course, the big standardized tests don't help students' learning/raise achievement: the results come back months after students take the tests and the questions don't provide any insight into how students chose the answers they did or, in the case of math, how they went about solving the problems. They are useless for teaching purposes. The way I see it, after more than 50 years in the field, is that all this data is used to put down public education and teachers and to pit schools and teachers against each other in some sort of crazy competition that is meaningless.

  3. The endless quest for data through standardized tests is easy to explain. At the core, it has nothing to do with education. It has everything to do with the fact that data can be turned into charts and graphs, which politicians and 'testocrats' adore. So politicians can pretend to understand the problem and affix blame, all while remaining clueless.
