Wednesday, March 4, 2020

NWEA Offers More Testing Baloney

When a system doesn't work, you have a couple of choices-- you can address the problems that are causing failure, or you can insist that the original system is super-duper and start imposing new rules to work around the flaws. It's like the latch that doesn't work properly-- instead of fixing the latch, you just teach everybody to lift and push the door to the side to get it to open.

The problem is particularly acute when your entire business is based on a failed model.

Absolutely not looking at the latest MAP test
So here we are with another great piece of "research and thought leadership" from NWEA, the folks who bring you the delightful MAP test. You may be in a state where the MAP is attached to your Big Standardized Testing machinery, or you might be like my old district, where the MAP is used as a pre-test/practice as part of the test prep programming. Education thought leaders like these tests because they come as a fully packaged online latchkey operation. Log the kids on, let them take the "adaptive" test, watch the software spit out some numbers and charts which look really cool, even if they don't contain much actually useful information.

But I have to give credit to NWEA for one thing-- they seem to grasp the most fundamental problem with these tests-- students have to actually give a rat's rear end about the test:

The best assessments are only effective and reliable if students are engaged and trying their best. But we know that’s not always the case.

That's the first paragraph in "What happens when test takers disengage?" and it may be the last thing that the article gets right.

Erin Ryan is the author; she's a senior content writer at NWEA who was previously a writer for Priorities USA, Upworthy, and Hallmark Cards. After graduating from the University of Wisconsin with a degree in Journalism and Mass Communications, followed by an MS in educational leadership, she did put in one year as a teacher in Duval County Schools.

Ryan points out that sometimes students zip through a test, just marking answers without reading the questions. She says this is called "rapid guessing," though for many years my students called it "playing some ACDC." This leads to "unreliable results," though I think "meaningless bullshit" is perhaps more descriptive. She warns that these results don't give you the full picture of a student's capabilities and "maybe even land them in programs for interventions they do not need," and so you shouldn't use these kinds of tests to make those judgments, and should probably scrap them in favor of testing instruments that actually yield useful results.

Ha. Just kidding. We're going to consider everything except the possibility that the very design of MAP testing-- multiple choice on a computer screen about dull and random reading mini-excerpts-- makes it a test that actively disengages students and is therefore a poor choice for any school trying to collect useful data.

The techno-delivery of the test, Ryan argues, makes it easy to catch the disengaged students and do something about it. Except that in this respect, she's going to offer some terrible advice. She also makes the claim that research into test taking behavior, "working hand in hand with technology," makes it possible to keep students more focused on a test. Also nope.

NWEA has been claiming that it can read students' minds for years, using pause time on questions to gaze into the student's soul. For a while now, MAP administrators have received little notices on the home screen that tattle on "disengaged" students (just in case teachers couldn't notice a student ripping through twenty questions in two minutes, or failed to use their power of "looking" to see students who are bored and disengaged).
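Just for fun, here's roughly what that kind of flagging logic amounts to. This is purely my own sketch in Python-- NWEA doesn't publish its actual thresholds or code, so the three-second cutoff and all the names here are invented-- but the principle is no deeper than this:

    # Hypothetical reconstruction of response-time "disengagement" flagging.
    # NWEA's real rules and thresholds are not public; these numbers are invented.

    RAPID_GUESS_SECONDS = 3.0   # assumed cutoff: faster than this = guessing
    FLAG_RATIO = 0.10           # assumed: flag if 10%+ of answers are rapid

    def is_rapid_guess(seconds: float) -> bool:
        """Treat any answer given faster than the cutoff as a guess."""
        return seconds < RAPID_GUESS_SECONDS

    def should_flag_student(response_times: list[float]) -> bool:
        """Tattle to the proctor if too many answers came in implausibly fast."""
        if not response_times:
            return False
        rapid = sum(1 for t in response_times if is_rapid_guess(t))
        return rapid / len(response_times) >= FLAG_RATIO

    # A student playing ACDC: twenty-odd questions in about two minutes.
    print(should_flag_student([2.1, 1.8, 2.5] * 7))  # True

In other words, a stopwatch with a memo function.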

But NWEA has moved beyond that. They previously introduced Slow Down Sloth, a cartoon sloth that would pop up and encourage a student to slow down. It's a nice consolation to those of us who feel sad that young folks will never get to meet Microsoft's Clippy. Now NWEA has auto-pause, which penalizes a racing student by freezing their test. This strikes me as an intriguing way to train students in how to figure out just how quickly the software will let them zip through the test.
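Auto-pause, as best I can tell, is just that same stopwatch wired to a freeze button. Again, this is a guess at the mechanics, not NWEA's actual spec-- the streak rule and the number three are mine:

    # Hypothetical auto-pause: freeze the test after N rapid answers in a row.
    # The consecutive-streak rule and its limit are assumptions, not NWEA's spec.

    RAPID_GUESS_SECONDS = 3.0
    CONSECUTIVE_RAPID_LIMIT = 3

    class AutoPause:
        def __init__(self) -> None:
            self.rapid_streak = 0
            self.paused = False

        def record_response(self, seconds: float) -> None:
            """Count consecutive rapid answers; freeze when the streak gets long."""
            if seconds < RAPID_GUESS_SECONDS:
                self.rapid_streak += 1
            else:
                self.rapid_streak = 0
            if self.rapid_streak >= CONSECUTIVE_RAPID_LIMIT:
                self.paused = True  # the freeze a racing kid learns to dodge

    # Three sub-threshold answers in a row trips the freeze.
    p = AutoPause()
    for t in (1.2, 0.9, 1.5):
        p.record_response(t)
    print(p.paused)  # True

Which is exactly the problem: whatever the real cutoff is, any fixed threshold just teaches kids to answer at 3.1 seconds instead of 2.9.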

But the key is proctor intervention, and here's where we really run into trouble. Ryan drags NWEA researcher Steven Wise into this. Speaking about proctors:

“They think they are not allowed to intervene,” Steve says. “But that’s exactly what we want them to do. If a student is disengaged, you should do something about it.”

Here's the thing. They think they are not allowed to intervene because in many states, such intervention is absolutely against the rules. Pennsylvania teachers have to comply with a whole set of "test administration ethics" that are absolutely clear that a proctor cannot interact with students beyond reading the instructions script. Yes, if you're just using NWEA testing as a test prep tool, those rules don't technically apply, but why do test prep under different conditions than the "real" test? So this kind of advice...?

Maybe a student is struggling with a test because they’re not feeling well, are anxious, or are having trouble understanding the questions. Whatever the reason, when a proctor and student can talk when disengagement has occurred, instead of after, there’s an opportunity to save a testing event that might otherwise go to waste.

Nope. That sort of thing is absolutely verboten, in part because, in its own way, it can invalidate a test almost as badly as playing ACDC.

Bottom line? NWEA has a product problem; the MAP test is intrinsically disengaging, and it is often used in settings (such as my old school) where it has no connection to the actual course and has less-than-zero stakes for students. It's multiple choice, which makes it easy for software to score, but a lousy measure of any complexity or depth of student understanding. Multiple-choice items are also the least engaging type of question, requiring no student response beyond "just pick a letter." The end result is a test that provides very little information. In the years that I gave the test, I never once found a student result that surprised me by telling me something I didn't already know (my personal number crunching also told me that it was a lousy predictor of Big Standardized Test performance).

NWEA's response to all these problems is not to go back to the drawing board or question the foundational assumptions behind their product. Instead, it's to offer these little help articles and webinars to get customers to plug the holes in the product themselves. It's like an auto manufacturer saying, "We've screwed up the engineering of the airbags in these cars, and we'd like to give you some instructions about how to sit kind of side-saddle in the front seat so that the airbags kind of work."

Sending teachers instructions on how to tweak a faulty product so that it's marginally less faulty is not the solution here. NWEA needs to do better.

4 comments:

  1. I don't think that the test-masters have any idea about the degree to which students simply do not give a rat's butt about these no-stakes tests. There is not a student alive who did not come of school age in the test-and-punish era. We keep telling them how important these tests are, yet most of them don't even know their scores (nor do they care that they don't). Once a student hits MS, the spring testing season is nothing more than a break from the everyday.

  2. All BS Test cheerleaders should read The Lottery by Shirley Jackson.
    If you don't get the metaphor, here is something that may help:

    In 'The Lottery' Tessie Hutchinson represents __.
    a) teachers
    b) parents
    c) reformers
    d) students

    In 'The Lottery' the stones represent __.
    a) money
    b) No. 2 pencils
    c) tests
    d) opt outs

    In

  3. I administered the NWEA during my last few years of teaching before retiring from the RI School for the Deaf in 2011. Inane short paragraphs with inane multiple-choice questions--the results provided no useful information. One of my students was disengaged from learning the entire year. She supposedly made progress. Another of my students was one of the most gifted and accomplished in the school. Her scores went down. A totally random waste of time--but the bells and whistles! And the virtually immediate feedback!

    Prior to the school adopting this scam of an assessment, I had been given the responsibility of assessing students in reading comprehension to establish IEP goals. I spent hours with the students administering appropriate tests and writing up a detailed narrative of each student's strengths and weaknesses. But that was considered a waste of time. For a number of reasons, not the least of which was that our school was labeled a Persistently Lowest Achieving School by the RI Department of Ed, primarily based on scores on the state assessments (this was before PARCC), I retired. I heard that several years later the school abandoned the NWEA.

    I just remembered that we had a professional day given over to a hard sell from an NWEA rep. We were inundated with pages and pages of material and quite a song and dance. One thing (well, not the only one) that didn't make sense to me was how they arrived at the predicted growth. Our students at the RI School for the Deaf were uniquely unique--degree and age of onset of hearing loss, language of the home, immigrant status, family use or not of sign language, etc. I couldn't believe that any AI could figure out what predicted growth would be. Also, the presenter said that they expected younger students to make more rapid growth than older students, if I remember this correctly. I do remember that I asked a question about this--many of our students in middle school and even high school were still in the process of mastering English grammar and vocabulary, and reading was an additional challenge. Their initial progress would be slow until they caught on and could then develop more rapidly. Blank stare from her.

    Another odd selling point for the NWEA was that other schools for the deaf were using it. We asked to be able to compare our results to students at other schools for the deaf, but that would be too expensive or something. RI has a Deaf Child's Bill of Rights. Students are not supposed to be given a test that has not been normed on deaf and hard of hearing students. Obviously, this one had not been, and neither had any of the other mass-administered standardized tests, like the PARCC. If you remember the Stanford Achievement Test, that one had actually been normed on deaf and hard of hearing students from around the country. It wasn't a great assessment, but at least it could give teachers and parents a general idea of how the students were doing compared to their deaf and hard of hearing peers. But now, that is anathema--everyone has to be held to the same high standards and tested in the same way. Never mind that the results provide no useful information and only serve to further stigmatize students who have enough difficulties already. (You can see that this is a very sore point with me.)
