Monday, December 29, 2014

Teachers Declare EVAAS Useless Baloney

From Audrey Amrein-Beardsley at Vamboozled comes a report on a piece of research about SAS EVAAS, the granddaddy of VAM systems, beloved in several states including my own home state. Amrein-Beardsley features a guest post by Clarin Collins, author of the study and a former doctoral student of hers (if you don't follow Vamboozled, you should). I was interested enough to go read the actual paper, because in Pennsylvania we just lovvvvve PVAAS to pieces.

If you don't live in a VAAS state, well, you're missing out on some of the fun. We have a nifty website where we are supposed to find oodles of data about how we're doing, how our students are doing, and what is supposed to happen next in our classrooms. Periodically some of us are sent off for professional development to be shown which nifty charts are there and how we can crunch the numbers in order to achieve teacherly awesomeness. SAS (the owner-operators of this business) also include a database of terrible lesson materials, because that helps them sell the site. I have literally never met a single human being who used a lesson from the SAS site.

At any rate, when you hear reformsters talking about how data can be used to rate teacher effectiveness and help teachers design and tweak instruction, this site is what they think they are talking about. Funny thing, though-- prior to Collins's research, nobody went out to talk to teachers in the wild and ask if they were getting any real use out of VAAS, and so VAAS's reputation among educrats and reformsters has rested entirely on its well-polished marketing and not what it actually does in the field. So let's see what Collins found out, shall we?

The Subjects

Collins used an unnamed district in the Southwest that is heavily invested in VAAS, has a strong union presence, and employs about 11,000 teachers. By using a researchy randomizer and digging down to teachers who are actually directly evaluated by VAAS, the study ended up with 882 respondents. The respondents were mostly female, with a wide range of ages and ethnicities (the oldest was 78!). Collins speaks fluent researchese, and if you want to evaluate the solid basis of the research, everything you need is there in the paper. We're just going to skip ahead to the civilian-comprehensible parts.

Reliability

Collins first set out to see if, from the teacher perspective, VAAS results seemed reliable. The answer was... not so much. Teachers reported scores that fluctuated from year to year. One teacher drew the gifted-student short straw and so showed up on VAAS as a terrible teacher, but "the School Improvement Officer observed my teaching and reported that my teaching did not reflect the downward spiral in the scores." The repeated story through the various responses was that a teacher's VAAS rating was most directly related to the students in the classroom, except when scores fluctuated year to year for no apparent reason.

Validity

For a smaller percentage of teachers, the usual horror stories applied. About 10% reported receiving scores for subjects in which they were not the teacher of record. Almost 20% reported being VAASed on students they barely taught, like the one who arrived late in the year and left soon thereafter for alternative school: "I'm still considered the teacher of record even though he spent 5-6 months out of my classroom."

Over half of the teachers indicated that their VAASified scores did not match their principal's evaluation. Most commonly the principal rated them higher, though some teachers did report that VAAS gave them some help in bucking a principal with a personal beef against them. Meanwhile, a large chunk of the teachers reported receiving an award of some sort for teacherly swellness at the same time VAAS was calling them stinky.

Formative Use

We are told repeatedly that VAAS info is formatively useful-- that peeking in there should help inform our remediation and help us fine-tune our instruction. In fact, that is what several of our regional college teacher prep programs teach aspiring teachers.

Well, baloney. 59% of the respondents flat-out said they don't use VAAS for that, at all. One teacher noted that by the time the data is up, it's for students you don't teach anymore, and finding data for the students you do have requires a long student-by-student search (feel free to work on that in your copious free time).

As for the teachers who said they do use VAAS to inform instruction, further questioning revealed that what they meant was "but not really."

The most common response was from teachers who responded that they knew they were “supposed to” look at their SAS EVAAS® reports, so they would look at the reports to get an overview on how the students performed; however, these same teachers called the reports “vague” and “unclear” and they were “not quite sure how to interpret” and use the data to inform instruction.

Even teachers who made actual use of the reports (commonly to do ability grouping) couldn't really explain how they did that. This puts them on a par, apparently, with many principals who reportedly shared VAAS scores with teachers "in a manner that was 'vague,' 'not in depth,' and 'not discussed thoroughly.'"

Does it deliver on its promises?

SAS EVAAS makes plenty of promises about how it will revolutionize and awesometize your school district. Collins did a quick and simple check to see if teachers on the ground were seeing the marketing promises actually materialize. Here's the list of promises:

EVAAS helps create professional goals
EVAAS helps improve instruction
EVAAS will provide incentives for good practices
EVAAS ensures growth opportunities for very low achieving students
EVAAS ensures growth opportunities for students
EVAAS helps increase student learning
EVAAS helps you become a more effective teacher
Overall, the EVAAS is beneficial to my school
EVAAS reports are simple to use
Overall, the EVAAS is beneficial to me as a teacher
Overall, the EVAAS is beneficial to the district
EVAAS ensures growth opportunities for very high achieving students
EVAAS will identify excellence in teaching or leadership
EVAAS will validly identify and help to remove ineffective teachers
EVAAS will enhance the school environment
EVAAS will enhance working conditions

Collins just asked teachers whether they agreed or disagreed. The list here puts the items in ascending order of disagreement, so the very first "professional goals" item is the one teachers disagreed with least-- and still more than 50% of the respondents disagreed. From there it was just downhill-- at the bottom of the list are items with which almost 80% of teachers disagreed.

Unintended consequences

Did teachers report any effects of VAAS that were not advertised? Yes, they did.

There was a disincentive to teach certain students. ELL students in a transition year and gifted students with their ceiling effect were both unloved. Given the choice, some teachers reported they would choose not to teach those students.

Teacher mobility-- moving from one grade level to another-- was also a casualty of the VAAS model, particularly in those schools that use looping (staying with one group of students for two or more years).

Gaming the system and teaching to the test. Angling for the best students (or having the worst packed into your classroom by an unfriendly principal) seemed common. And, of course, as we all already know, the best way to get good test results is to drop all that other teaching and just teach to the test.

Numerous teachers reflected on their own questionable practices. As one English teacher said, “When I figured out how to teach to the test, the scores went up.” A fifth grade teacher added, “Anything based on a test can be ‘tricked.’ EVAAS leaves room for me to teach to the test and appear successful.”


Distrust, competition, and low morale also rose in these schools, where VAAS is linked to a "merit" system. Why share a good teaching technique if it's only going to hurt your own ranking? It's bad news for you if the teachers who feed students into your classroom do well-- their failure is the foundation of your own success. All of this is predictably corrosive, and Collins's research supports that.

The incentive program is not an incentive. For something to be an incentive, you need to know what you have to do to get the incentive. All we know is that as a teacher you have to improve your scores more than the other teachers. You can make improvements each year, but if other teachers improve the same amount, you have made no gains according to the system. It is a constantly moving target. You don't know what you need to do to get the "prize" until after the "contest" is over.
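
To make the arithmetic of that moving target concrete, here is a minimal sketch (hypothetical scores, plain Python, and emphatically not SAS's actual statistical model) of what a purely relative measure does when everybody improves by the same amount:

    # Toy illustration only: made-up numbers, not the EVAAS formula.
    # If "growth" is judged relative to the group, identical improvement
    # by everyone leaves every teacher exactly where she started.

    def relative_standing(scores):
        """Each teacher's score minus the group average."""
        avg = sum(scores) / len(scores)
        return [round(s - avg, 2) for s in scores]

    this_year = [72, 75, 80, 68]            # four hypothetical teachers
    next_year = [s + 5 for s in this_year]  # everyone improves by 5 points

    print(relative_standing(this_year))     # [-1.75, 1.25, 6.25, -5.75]
    print(relative_standing(next_year))     # [-1.75, 1.25, 6.25, -5.75]

Everyone got better; nobody "gained." That's the contest teachers are told they can win.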

Conclusion

SAS EVAAS® and other VAMs, by themselves, are sophisticated statistical models that purportedly provide diagnostic information about student academic growth, and represent teachers’ value-add. In other words, SAS EVAAS® and VAMs are tools. It is what teachers, schools, districts, and states do with this information that matters most. However, for the teachers in this study, even for those participating in training sessions on how to use the data, the SAS EVAAS® data alone were unclear and virtually unusable. For SSD, not only are teachers not using the “product” that costs the district half a million dollars per year, but teachers are aware that SAS EVAAS® inputs can be manipulated based on the student makeup of their classroom, and some teachers even confess to teaching to the test and cheating in attempt to increase their SAS EVAAS® scores.

Collins hasn't found anything that reasonable teachers haven't talked about and predicted for these models, but now we all have a real research paper we can link to for people who have to have those sorts of things for proof. The view from ground zero is clear-- the system is unreliable, invalid, unable to produce the results it promises, and all too capable of producing toxic effects.

It's true that this paper deals a great deal in sheer accumulation of anecdote. Still, I'm struck by just how brutal all the findings are for VAM, because with this type of survey instrument the teacher tendency to be a good little soldier and give the answers you're supposed to give kicks in (look back at the formative question): a certain percentage of teachers are inclined to just say, "Why, yes! The emperor's new clothes are beautiful and grand," and then go back to the lounge and make comments about the emperor's shocking nakedness.

A teacher of my acquaintance took an online course that included some portion about the awesome usefulness of SAS-PVAAS; the teacher was reluctant to say openly how useless the site had been. When the awful statement was finally out there, many other teachers broke down and said, yeah, me too. Nobody tried to defend it. Too many times teachers stand by quietly while the house burns down because they don't want to be impolite or rock the boat.

So when I see research like this that brings forth a whole bunch of boat-rockers, my immediate suspicion is that this is only the tip of the iceberg. I've just hit the highlights here; I recommend you go read the whole thing and get the full picture. But once again, the challenge is to get people in power to actually listen to teachers. Maybe that will happen if it comes in the form of actual research. 
