Tuesday, February 10, 2015

Sorting the Tests

Since the beginning of the current wave of test-driven accountability, reformsters have been excited about stack ranking-- the process of sorting items from the very best to the very worst (and then taking a chainsaw to the very worst).

This has been one of the major supporting points for continued large-scale standardized testing-- if we didn't have test results, how would we compare students to other students, teachers to other teachers, schools to other schools? The devotion to sorting has been foundational, rarely explained but generally presented as an article of faith, a self-evident value-- well, of course we want to compare and sort schools and teachers and students!

But you know what we still aren't sorting?

The Big Standardized Tests.

Since last summer, the rhetoric meant to pre-empt the assault on testing has focused on "unnecessary" or "redundant" or even "bad" tests, but we have done nothing to actually find these tests.

Where is our stack ranking for the tests?

We have two major BSTs-- the PARCC and the SBA. In order to better know how my child is doing (isn't that one of our repeated reasons for testing?), I'd like to know which one of these is the better test. There are other state-level BSTs that we're flinging at our students willy-nilly. Which one of these is the best? Which one is the worst?

I mean, we've worked tirelessly to sort and rank teachers in our efforts to root out the bad ones, because apparently "everybody" knows some teachers are bad. Well, apparently everybody knows some tests are bad, so why aren't we tracking them down, sorting them out, and publishing their low test ratings in the local paper?

We've argued relentlessly that I need to be able to compare my student's reading ability with the reading ability of Chris McNoname in Iowa, so why can't I compare the tests that each one is taking?

I realize that coming up with a metric would be really hard, but so what? We use VAM to sort out teachers, and it has been debunked by everyone except people who work for the USED. I think we've established that the sorting instrument doesn't have to be good or even valid-- it just has to generate some sort of rating.

So let's get on this. Let's come up with a stack-ranking method for sorting out the SBA and the PARCC and the Keystones and the Indiana Test of Essential Student Swellness and whatever else is out there. If we're going to rate every student and teacher and school, why would we not also rate the raters? And then once we've got the tests rated, we can throw out the bottom ten percent of them. We can offer a "merit bonus" to the company that made the best one (and peanuts to everyone else) because that will reward their excellence and encourage them to do a good job! And for the bottom twenty-five percent of the bad tests, we can call in turnaround experts to take over those companies.

In fact-- why not test choice? If my student wants to take the PARCC instead of the ITESS because the PARCC is rated higher, why shouldn't my student be able to do that? And if I don't like any of them, why shouldn't I be able to create a charter test of my own in order to look out for my child's best interests? We can give every student a little testing voucher and let the money follow them to whatever test they would prefer to take from whatever vendors pop up.

Let's get on this quickly, because I think I've just figured out how to make a few million dollars, and it's going to take at least a weekend to whip up my charter test company product. Let the sorting and comparing and ranking begin!

5 comments:

  1. Common Core has never been piloted, and neither have the tests that allegedly measure it. Therefore, there's no possible way to determine which test is more effective 'cause we don't have any idea if the CC is effective. Catch-22s all over the place.

  2. I'm sure Mike Petrilli would help you whip up a game plan and funding. It's "choice," right? ;-)

  3. Isn't it like sorting poop? The best of the worst? Let's open up those test screen pages and see what's in the SBAC and PARCC... aren't we, the taxpayers, the consumers? Tell me again why we can't talk about what is in them, verify them, and get scores in a timely fashion. The companies don't want US to cheat... but wait, what about that pineapple question?

  4. Let's use student test scores to rank the tests-- as long as we're using the data in ways it wasn't intended anyway. Companies with tests that have pass rates in the bottom 10% are no longer allowed to have any business with publicly funded schools.

  6. "Oooo, oooo, oooo, teacher," (wildly waving my hand in the air), " C'n I help write the test?" Seriously, now that I've stopped laughing, that's such a diabolically wicked idea. Swiftian.
