Tuesday, October 14, 2014

We're Here To Help

Andy Jacob, the "comms guy for @TNTP," disagreed with my characterization of TNTP's classic, "The Widget Effect." A few days ago, I summed up the message with two sentences:

We don't pay teachers differently based on how good they are. We should do that.

Jacob allowed as how he wasn't sure I'd really read the report, and when I assured him that I had, he suggested a correction to the drift of my gist.


This "helpfulness" idea is one that turns up often in reformster teacher evaluation plans. "We're just dredging up all this data about your performance," they say, "so that we can help you become a better teacher." Sometimes they would also like to help school administrations make better staffing choices.

Now we could talk about all the subtle clues that "helping" might not be the goal here, such as the repeated complaints that if 70% of students failed the Big Test then it can't possibly be true that 90% of their teachers don't suck. But that's not where I'm headed today.

No, sometimes, when somebody makes a claim, I find it useful to perform a little thought experiment and imagine what the world would look like if what they claimed were, in fact, true. What would a teacher evaluation system actually set up to help teachers look like?

Wide slices.

The observation portion would not be a single short snapshot. Since this is an imaginary system to help teachers, I'm going to go ahead and imagine a system that would be a huge pain for administrators. I'll imaginarily help them some other day.

So, more than one visit. And more than formal observations. Plenty of drop-ins, drop-bys, drive-bys, and maybe even some lurking outside my room. Point is, I don't need help figuring out what needed to be tweaked in a single lesson on the third Tuesday of the second month of school. I'm more interested in finding out if there are systemic issues in my classroom, and those issues will only become obvious over time.

Student products.

Take a look at what kind of product I'm getting out of my students. Look at their written work-- does it look like they're achieving in that area? Look at the tests I've given them-- how are they doing on those? And take a longitudinal look-- are they making gains over the course of the year? Hell, go ahead and talk to them-- see if they think they're learning anything.

Yes, you can throw in some standardized test results if you like, but most of those crappy bubble tests designed in faraway places by people who don't know me, my students, or my community's expectations for education-- those tests have little of use to tell me. Basis for comparing me with some teacher in Palookahville, North Arksylvania? What good does it do me to know whether he prepared his students for a one-time pointless exercise better than I did? It's a one-time pointless exercise.

Handle the results professionally.

If, in the course of observing me, you saw me post grades with names attached on the board, announcing "Look at how many people are better students than Ashley," you would rightfully chastise me (if you would not, I do not want to work for you).

If this evaluation system is all about helping me, then it should be part of a conversation between you and me and nobody else. I do not help Ashley learn about gerund phrases by telling the whole class how much she sucks. I help Ashley by talking to Ashley. You help me by talking to me. It may be more efficient to hand me off to a designated peer coach, but if I'm going to get better, it will be helpful not to be distracted by fear and humiliation.

Support me.

Help me focus on all the ways I can succeed, not all the ways I'm going to suffer if I fail. When you learn to drive a car, you learn to keep your eyes on the road, to focus on the destination you desire. If you get afraid of the tree, and stare at the tree, you will drive directly into the tree.

So let's figure out a plan for my success, not for my failure.

Don't raise the stakes.

The stakes are already huge. The stakes are my ability to think of myself as a successful teacher, as a person who is succeeding in the career I have chosen for my adult life. I have spent the last umpteen years working toward doing this for a lifetime, toward waking up in the morning and saying, "I am a teacher, and today I am going to make a difference in some child's life."

If I fail at this job, it will not be like the summer I decided I would never be a good telemarketer. I will be devastated.

So threats to cut my pay, hold up my raise, mark me with a scarlet "ineffective"-- these just add insult to injury. These allow me to snatch defeat from the jaws of victory, to say, "Yeah, I got better, but not better enough I guess."

I am already hugely motivated. You can't motivate me more-- you can only hurt me. And if I'm not already hugely motivated, you should probably counsel me out. But in the meantime, trying to mess with my pay and my ratings won't make me any more motivated.

If I'm good, recognize me.

Give me a little plaque. Give me some office space. Thank me. Let me have a little piece of extra responsibility that I can handle on my own in my own way. Recognize me in the way that grown professional humans recognize each other. A performance bonus is-- well, it's not that I can't use the money, but it's kind of like taking a woman out for a date and at the end saying, "You were great, honey," and handing her five twenties. It's not exactly flattering.

Stay with me.

Continue the process. Work with me. Check up on me. Help me move forward. Don't just drop an improvement plan on my desk and disappear like smoke. Work with me as if you actually wanted to keep me around.

These are all things I would expect to see in a teacher evaluation system designed to help me. Help, not threats and punishment. Emphasis on collecting information that tells you how I'm actually doing at teaching (not how I'm doing at being in a room with students who generate good test scores). And no-- I don't mean "multiple measures," because many of the indicators and pictures that show what kind of teacher I am are not measures at all. Saying "multiple measures" for evaluating teachers is like saying we'll use lots of different rulers and tape measures to measure excitement. A whole variety of the wrong tools = the wrong tools.

And support to help me get to where we both want me to be. Those are the things I would expect to see. Do you see those things in reformster teacher eval plans? No, me neither. Not even with multiple measures.


Monday, October 13, 2014

Fordham and CCSS and Reading and Writing

Robert Pondiscio opens a recent Core defense over at the Fordham blog with a nice display of verbal fireworks. Maybe that's why I always end up responding to the Fordham boys; they may be dead wrong on many educational issues, but at least they can write. And as it turns out, writing is on the menu of Things We Think Are Swell About the Core in "What's Right About the Core."

Pondiscio believes there are a few (three, actually) big ideas worth "preserving and promoting." Let's see if we're really ready to give these puppies a warm, loving home.

Reading To Learn

Pondiscio is a member of the Rich Content Club, and he rails against reading that is context-free, focused on decoding, and not connected to the process of using and acquiring broader knowledge. Confused yet? Pondiscio seems to be, because his rant leads us to this sentence:

By contrast, Common Core’s recognition that content matters — the more you know, the more you can read with understanding—is an important recognition of how reading comprehension actually works.

The Rich Content Club is bound and determined to believe that Common Core is on their side, when it is clearly, explicitly, not. David Coleman has repeatedly made it clear-- Common Core reading is Close Reading, and Close Reading is done "between the four corners of the text." It is also supposed to be done only with short texts (kind of like the ones you'd find on a standardized test).

So the Rich Content Club's belief in full, deep texts that allow students to draw on and add to a broad background of knowledge-- well, David Coleman disagrees with them way more than I do. Content-rich reading to learn is not a thing to preserve and promote about the Core any more than beautiful flowers in rolling green fields are the best part of winter in Minnesota.

Curriculum Matters

Pondiscio toots the Ed Hirsch horn here, and again I ask-- what on earth does Hirsch's knowledge-rich cultural literacy have to do with Common Core (which only addresses two content areas in the first place)?

Pondiscio is explicitly not a fan of the fuzzy-headed progressive child-centered hippy-dippy baloney, and I am not inclined to argue with him a great deal about that. But many members of the Rich Content Club seem to think that since CCSS promises to be Not That Fuzzy Thing, it must ipso facto be the kind of rigorous, tough-minded content-rich knowledge-pumping engine of cultural literacy that they dream of. It isn't. It just isn't. There isn't a word in the standards that would make me believe for a moment that it is what they hope for, and the CCSS-linked high stakes tests clearly and explicitly push in another direction entirely.

It's like listening to one of my sixteen-year-old students explain that her cheating, lying boyfriend is really a wonderful guy, but people just don't know him like she does. Yes, we do, honey, and he doesn't love you like you love him.

Show What You Know

Pondiscio is certain that writing is in "appalling shape" in schools. He's upset that fuzzy child-centered stuff is still leaving a mark, somewhere, I guess. That particular wave never rolled in very high or heavy in these parts, but my aunt ran an open school in Connecticut in the sixties, so I've seen it go both ways.

He shares Coleman's disdain for the personal essay. I'd offer one observation for him, as a teacher of writing-- personal essays allow students to write about subjects on which they are experts. A personal essay lets students focus on craft and technique without having to also worry about researching and supporting content. I would never do a whole year of them, but I'd never do a whole year without them, either. They serve a purpose.

He has this complaint. "If kids enjoy writing, the theory goes, they’ll write more." Well, yes. Duh. I don't know what he's offended by-- students enjoying schoolwork? Back in my day, we hated school, uphill, both ways, in the snow. Made real men out of us (even the girls). Seriously, Pondiscio is starting to go full schoolmarm here.

Pondiscio takes a shot at developing "voice" over structure or grammar (although I suspect he means usage, not grammar-- common mistake). It's an odd criticism from a man who opened this particular essay with about three paragraphs worth of raw, unadulterated voiciness. In fact, there isn't a successful writer I can think of who doesn't have a strong, distinctive voice.

He pulls things back a bit and admits there's a place for personal expression, but the pendulum has swung too far, and it's about time it swung back, young man.

This is all a conversation worth having. I just want to offer one observation-- there is not the slightest need for Common Core in order to have this discussion. On writing, CCSS offers standards that are so weirdly bad that they will never affect anything anyway ("Never mind. We'll just see what the test wants and cover that separately," say thousands of writing teachers). So let's have the writing pedagogy discussion (for English teachers, it is the conversation that never, ever ends). But don't tell me that it has anything to do with Common Core.

That's Three

So Pondiscio has pulled up three highly debatable topics, but what is most debatable about them is whether or not they have anything at all to do with the Common Core. But that's turning out to be yet another reason that CCSS should just go away-- they are becoming an odd distraction in the midst of discussions of more worthwhile topics. They have nothing useful to add to the conversation, and if we're not careful we get sucked into arguing about what they say about instruction instead of, as we should be, arguing about what good instruction looks like.


Pondiscio ends with an imploration not to throw out the baby with the bathwater, but I think we're really talking about not throwing the baby with that old bicycle out behind the barn. They don't have anything to do with each other. Though Andy Smarick also ended his piece in praise of testing, published the same day as Pondiscio's column, with the same baby-bathwater image. Kind of makes me wonder if the Fordham Foundation is expecting.

Super Quote Re: Public Vs. Private

Bruce Dixon at Black Agenda Report wrote a must-read article way back in February of 2013 about privatization under the current administration. Diane Ravitch quoted it earlier today, but I need to set it down, too, because this quote deserves to be handily located in everyone's mental file of Responses To The Same Old Reformster Arguments. So the next time somebody tries to tell you that the new wave of charter school chains are public schools, just tell them this:

On every level, the advocates of educational privatization strive to avoid using the p-word. They deliberately mislabel charter schools, just as unaccountable as every other private business in the land as “public charter schools,” because after all, they use public money. So do Boeing, Lockheed, General Dynamics, Bank of America and Goldman Sachs, but nobody calls these “public aerospace companies,” “public military contractors,” or “public banks.”

You get to call yourself a public institution when you are answerable to the public (say, by having your governing board members stand for election). You get to call yourself a public institution when any taxpayer who's paying for your shop to stay open can have full and transparent access to your financial information. Some charters, particularly the traditional ones, do this, and they deserve the "public" label.

But if your attitude is "Once that money is in our hands, it's our money and we don't have to explain anything to anybody," you are not a public institution. When it comes to "public," charter chains keep using that word, but I do not think it means what they think it means.

Public Service Incentives (Short Form)

I don't usually do this, but I just hammered out a large post. I'm now going to attempt a short version. If you find it intriguing, follow the link to the long version.

Some reformsters keep trying to shore up teacher evaluation systems by referring to the private sector where, we are told, folks are rewarded for doing a good job and suffer for doing poorly. Why, they ask, should teaching not come with a performance based incentive system?

I believe that's a false comparison. A really false comparison.

Teachers are not private sector employees. They are public servants, and public service does not play well with performance incentives.

Nobody proposes that we make performance incentives for police officers that reward them for higher numbers of arrests.

Nobody proposes that we create fire fighter performance incentives based on the total dollar amount of property they save in a month.

Nobody proposes that mail delivery persons be rewarded for delivering to a certain number of addresses per hour (if you can't see why not, think about Montana instead of New York City).

Nobody proposes that we give national guardsmen bonuses based on the income of the people whose property they protect.

Nobody proposes these things because public service means serving the whole public, all of them, equally, without prejudice, without ignoring a part of the public because it might hurt your numbers or screw up your monthly performance bonus.

I stretch it out and fill in the gaps in the longer post, but that's the nut of it.

How We Pay Public Servants-- and Why

We have always paid public servants a flat fee, untethered to any sort of "performance measures." That's because we want public service to be completely disconnected from any private interests. (And if you just thought, "Damn, this is a long post," you can get the basic point here and decide if you want to travel down the whole web of alleys with me.)

Fighting Fire with Money

Imagine if, for instance, we paid fire fighters on a sliding scale, based on how many fires of which type they put out, and at what speed. This would be disastrous for many reasons.

Fire fighters would refuse to work in cities where there were few fires to fight, because they couldn't make a living. In cities where there were commonly multiple fires, fire fighters would look at each fire call through a lens of "What's in it for me?"

For instance, in a system where fire fighters were paid based on the value of the flame-besieged property, fire fighters might view some small building fires as Not Worth the Trouble. Why bother traveling to the other side of the tracks? It's only a hundred-dollar blaze, anyway. Let's wait till something breaks out up in the million-dollar neighborhood.

In the worst-case scenario, one of our fire fighters depending on performance-based pay to feed his family may be tempted to grab some matches and go fire up some business.

Perverse Incentivization

Occasionally we've seen these kinds of perverse incentives in action, and we don't much like it. The areas of the country where you take extra speed limit care at the end of the month because the local police have a quota to meet. The neighborhood where cops have to roust a certain number of suspects a week to keep their job ratings. Nobody thinks these are examples of excellence in public service.

In fact, we have a history of playing with private police forces and private fire companies. We don't much care for how that works out, because it creates a system that provides excellent service-- but only for the customers who are paying for it.

The idea of public service is to create a class of people who are above self-interest and who do not respond to a single boss. We are outraged when abuse of police power happens precisely because we expect the police to act as if they work for everyone, and to put their dedication to that service above any single interests, including their own.

That's the definition of public service-- service roles that are stripped of any possibility of incentives other than the mandate to serve the public good. That's what we mean by "professional"-- a person who puts all personal self-interest aside and focuses on Getting the Job Done. Trying to motivate a public servant with self-interest inevitably tends to pollute the professional setting with the very self-interest that we're trying to get out of there.

Incentives and Suck-ups

Here's the thing about performance incentives. They always come from actual individual humans. In business that's okay, because the humans are already the bosses.

In public service, we often talk about performance incentives as if they fall from the sky, descending fully-formed from some on-high objective source. They do not. They, too, are developed by actual individual humans. And those humans will invariably encode their own values and priorities into the incentives.

"I like red houses. I think they are more valuable," say our fire company evaluators. "That's why I live in one. And that's why a good fire company always gives priority to saving red houses first."

Performance incentives for public service always-- always-- involve substituting the values of the few for the values of everybody. Fire fighters are supposed to save everybody's homes. Police are supposed to protect every citizen. The US Postal Service is supposed to deliver to every home. They are there to serve the public, and that means everybody. Public servants are supposed to support the values of all citizens. Any performance based evaluation reward system will prioritize some citizens' values over those of other citizens.

A Public Service Performance Based Incentive System In Action

You know which public servants have a fully-realized performance-based pay system in place?

Congress.

The CEO of International Whoomdinglers says, "That Senator Bogswaller has done an excellent job of looking out for the things we believe are important. He ought to keep his job. Send him a big fat check."

The head of the Society for Preservation of Free-Range Spongemonkeys says, "We appreciate the hard work that Representative Whangdoodle has done looking out for spongemonkeys. He deserves a raise. Send him a big fat check."

You (and the members of the Supreme Court who are paying attention) might call this corruption, but it's just a Performance Based Incentive System, and the high regard in which Congress is held tells you how well a PBIS mixes with public service.

But But But

But a Performance Based Incentive System put in place by the government would not be run like the hodge-podge of private interests you describe incentivizing the US Congress, you say.

And I say, baloney. We already have a Performance Based Incentive System that says you're a better school district if you sell more of the College Board's AP product line. The PBIS testing system being used to incentivize students and teachers and schools-- that system is entirely a product of private corporate interests.

The only difference between a private incentive system, like the one that runs Congress, and a public one, like Race to the Top, is whether the people with money and power have to manipulate a government middle man or can go straight to the source.

Under the Umbrella

I teach mostly juniors, sometimes seniors. There are a few things I tell them every year.

One is to make the most out of senior year, because it is the last time they will be surrounded by people who are paid to put the students' interests first. It's the last time they'll be in an institution that is organized around their concerns, their interests, their needs. After that, they're in the open market. They will always be dealing with people who are trying to sell them something.

The PSAT will collect a ton of information about them so it can turn around and sell that data to colleges. Colleges will try to sell them, particularly if they are highly desirable customers. Employers will try to get the use of their talents without having to pay much for it, and politicians will piss on them and tell them it's raining so that the pols can keep their jobs.

But here, under the umbrella of the public school, my students have nothing that I need to survive or make a living. I have no reason to do this job except for the reason I took this job in the first place-- to serve the best interests of my students.

It Doesn't Make Any Difference

It makes business-oriented reformy types crazy that the way I do my job doesn't make any difference to my pay. I understand the terror for them there, but that Not Making A Difference is actually the point of how we pay public servants.

It doesn't matter whether it's a big fire or a small fire, a rich person's house or a poor person's house-- the fire department still does its job. It doesn't matter whether I have a classroom full of bright students or slow students, rich students or poor students, ambitious students or lazy students-- I will still show up and do my job the best I know how. I should never, ever, ever have to look at a class roster or a set of test results or a practice quiz and think, "Dammit, these kids are going to keep me from making my house payment next month."

Why I Won't Suck

Reformsters are sure that human beings must be motivated by threats and rewards, and that the lack of threats and rewards means that I can too easily choose to do a crappy job, because it won't make any difference. They are wrong. Here's why.

1) I knew the gig when I started. I knew I would not get rich, not be powerful, not have a chance to rise to some position of prominence. There was no reason to enter teaching in the first place except a desire to do right by the students.

2) Teaching is too hard to do half-assed. Do a consistently lousy job, and the students will eat you alive and dragging yourself out of bed every day will be too damn much. There isn't enough money to keep people flailing badly in a classroom for a lifetime. Just ask all the TFA dropouts who said, "Damn! This is hella hard!" and left the classroom.


And Most Importantly

Threats and rewards do not make people better public servants (nor have I ever seen a lick of research that suggests otherwise, but feel free to review this oft-linked video re: motivation). Threats and rewards interfere with people's ability to get their job done. Threats and rewards motivate people to game the system.

And any time you have a complex system being measured with simple instruments, you have a system that is ripe for gaming. In fact, if your measures are bad enough (looking at you, high stakes tests and VAM), your system can only be successfully operated by gaming it.

So, No Accountability At All?

Heck, no. You need to keep an eye out for the grossly incompetent, though they will often self-identify (I've made a huge mistake) and take the next stage out of Dodge. Beyond that, you just go watch and pay attention. If you're my administration, you're welcome in my classroom at any time for as long as you'd like to stay. No, don't bring that stupid checklist. Just watch and listen and use your professional judgment. If you think I need to fix something, let's you and I talk about it. How will bringing in extra layers of bureaucracy and government make that system work any more smoothly?

Man, This Is Running Long

Agreed. Sometimes these posts get away from me.

Bottom line-- the comparison to private enterprise performance based incentive systems is bogus. Those systems may be appropriate in corporate environments where we want to enforce a bias in favor of certain actors and outcomes-- where some people are in fact more important than others.

But in public service, performance based incentive systems are contra-indicated. They by nature enforce a particular bias and cannot help but tilt the system in favor of some customers over others.

The system we have does, in fact, make sense. We stand our public servants beside a door and we say, "I'm going to pay you to stand here and wait as long as it takes and help whoever comes through that door. It doesn't matter who comes through that door-- nothing is going to affect your pay, so that's a settled and done deal-- just concentrate on watching that door and helping whoever walks through it."

Are You Still Here?

God bless you. This chunk of ideas has only just grabbed ahold of me, and I still have much to sort out, which I will undoubtedly do in future posts. Feel free to chime in in the comments.

The Teacher Jobs Shortfall

Last week the Economic Policy Institute popped out an article attached to this graphic:

Their headline was the dramatic shortfall of jobs in public ed, particularly when compared to their projection of the number of jobs needed to keep up with enrollment.

The graph raises a number of questions for me.

Does this reflect a shift to private employers? How many public school jobs are being shifted to private and charter schools? We know about the dramatic shifts of this nature occurring in places like New Orleans and Cleveland, where public schools have been shut down so that they can be replaced with charters. But is that process occurring in smaller ways throughout the country-- Bogwump High School cuts three teachers, but Bogwump Corporate Charter opens with a six-teacher staff? And that raises its own second question-- is the EPI accepting the fiction that charter schools are public schools, or is it correctly counting charter teaching jobs as non-public-school jobs?

Public education added 400,000 jobs between 2003 and 2008?!?! Okay, not exactly a question, but-- damn. Did everyone go on a staff spending spree with their stimulus money? Did regulations and court decisions create a ballooning demand for special needs teachers? I mean, I was going to look for data to indicate whether that's abnormal growth or not, but I don't really need to. If that were normal growth, that would mean twenty years ago there were no teaching jobs at all, and I'm pretty sure that's wrong.

But that raises the real question-- are we looking at a shortfall, or a market correction after an employment bubble?

What is that projection based on? EPI says we should have added 123,000 jobs over the 2008 numbers (or 377,000 over where we are now) in order to keep pace with student growth. I'm curious about exactly what formula was used to determine that number.

Where is the understaffing happening? If this chart and the EPI interpretation of it are correct, we have schools that are grossly understaffed, overworked, and financially strapped. Everything I know tells me that is true-- but not everywhere. My own school, for instance, has shed many staff positions in the last ten years, but we have also dropped student population considerably. The same story is true in most districts in my region.

So has somebody done a more specific regional study? Do we know exactly where local public schools are failing to keep staffing rates up to speed with student population numbers? This is definitely more of a problem in some places than in others-- do we know where those places are? And do we have some data beyond the anecdotal?

The Great Recession was hard on teaching as a profession, and it raised the curtain on reformster policies that were also hard on the teaching profession, particularly in public schools. Given the number of people who have been chased out of the profession, this chart is but half a picture, and too broad a one to be really useful. The profession is hemorrhaging teachers, particularly teachers of color and young teachers, and where it is not hemorrhaging them, it is chasing them away vigorously by making the profession as unattractive as possible (looking at you, North Carolina).

Are we shedding jobs to keep up with shedding teachers? Are we chasing teachers away by shedding jobs? And most importantly-- how do these dynamics play out differently from place to place?

Ultimately this chart raises many questions for me. If you have the answers, let me know. If I find any answers, I'll pass them along.

Sunday, October 12, 2014

What We Haven't Learned from the Widget Effect

Do you remember that awesome post I wrote that totally changed the face of American education?? You don't?? Well, let me just keep mentioning that awesome post (and how it changed the face of American education) for the next five years and maybe my massive importance will start to sink in.

That's about where we are with TNTP and "The Widget Effect," a "report" I'm not going to link to for the same reason I don't mention TNTP's leader by name or provide links to pro-anorexia sites-- some things are just already taking up too much of the internet.

The Widget Effect is celebrating the fifth anniversary of its own importance. If you're unfamiliar with the "report," let me summarize it for you:

We don't pay teachers differently based on how good they are. We should do that.

That's it. Pump it up with extra verbiage and slap on some high-falutin' graphics, and you've got a "report" that other "report" writers love when they need to add some gravitas to the footnote section of their "report." As you may have heard, there's particular interest in the "We should do that" portion; TNTP is a huge fan of teacher evaluating.

TNTP has presented several anniversary evaluation commentary-paloozas, including this one that sandwiches a thoughtful Andy Smarick piece in between two large slabs of reformy baloney. But that's not where we're headed today. Today we're going to look at "4 Things We've Learned Since the Widget Effect." Let's do a little check for understanding and see if our five years of study have paid off.

Implementation Matters More Than Design

Correct! Reformsters have learned (and are still learning) that if you promise people a warm, cuddly pet and then drop an angry badger into their home, they lose interest in your promises very quickly. Further, you do not provide useful damage control by repeating, "But it's really intended to be warm and cuddly" while the badger has the children cornered and terrified on top of the credenza. Teacher evaluation has had teachers on top of the credenza for about five years, so happy anniversary, honey badger!

TNTP offers a solution best summarized as "Do it better." Sigh. In more words, the recommendation is that if you train your key people and give them time to do a better job, the badgers will be warmer and cuddlier. TNTP describes these key people with words like "Chief Academic Officers" and "middle managers." The odd terminology leads us back to a central question-- does TNTP think the badgers are warm and cuddly, or does it just want to convince us so we'll let the badgers trash the house? I won't rule out the former, but I lean toward the latter.

Multiple Measures-- Including Data about Student Learning Growth-- Are the Way To Go

The old observation technique was a bust, TNTP says. They support this by saying that it's just common sense. So there ya go.

While the issue of evaluation remains hotly debated, multiple measures might be the one place where something resembling a consensus has emerged. That’s a positive thing we should celebrate.

Really? Which consensus would that be? There's a fairly large consensus that "including data about student learning growth" (aka VAM) is problematic because every instrument we have that claims to do it is no more reliable than having the badgers read tea leaves through a crystal ball. I'm guessing that's not the consensus being referenced.

So incorrect on the main answer. Their recommendation, however, is to have multiple observations by multiple observers. In buildings with enough administrative staff to implement it, that idea is... not stupid.

You Can't Fix Observations If Observers Don't Rate Accurately

Observations are also one of the best examples of the gap between design and implementation. If you’re concerned about the potential variability of value-added scores, you should be truly frightened by the statistical Wild West that is classroom observations. 

They're onto something here. Here's the thing about administrators-- if they are even remotely competent, they know how good their teachers are. They'll use the fancy piece of paper if you make them, but if the observation instrument tells them one thing and their brain, sense, and professional judgment tell them another, guess who wins. If you ask, "What are you going to believe-- the observation form or your own eyes?" they will go with their own senses.

Now, if your principal is a boob, or hates you for some reason, this effect is Very Bad News. Maybe you call that the statistical Wild West, but that's still better than VAM, which is a statistical black hole caught in a box with Schroedinger's cat strapped into the hold of the Andrea Doria sailing through the Sargasso Sea as it falls into the Negative Zone just as the Genesis Bomb goes off.

TNTP's solution-- easier, shorter paperwork. Because reducing a complicated human observation of complex human interactions to a short, simple checklist totally works. I suggest that TNTP staffers field-test the principle by piloting a spousal observation form for evaluating their wives and husbands.

Double fail on this item.

Done Right, Teacher Evaluations Really Can Help Teachers and Students

We're going to go to the research connected to the IMPACT evaluation system in DC. And damn-- these people can't really be that confused or dopey, can they? I want to believe that they are willfully manipulative and misleading, because that would at least mean they're smart enough to understand what they're saying, and as a teacher, it makes me sad to imagine a lump of dumb this large in the world.

Okay, here's the deal. They measure a teacher's awesomeness. They give the teacher feedback on the measurement. They measure again, and the teacher proves to be more awesome. Let me see if I can illustrate why this proves almost nothing.

Chris: If you pass my test for being my awesomest friend, I will give you a dollar. Now, hold up some fingers.

Pat: Okay. How'd I do?

Chris: Bummer. If you had held up four fingers instead of three, I would have known you were my awesomest friend.

[Fifteen minutes later]

Chris: Okay, let's take the awesome friend test again. Hold up some fingers.

Pat: Cool.

Chris: You did it. Four fingers!! Here's a dollar!

[Later over supper at Chris's house]

Chris: Mom, Pat and I became much better friends this afternoon!

The IMPACT system and the attendant research are not useless. They prove that teachers can be trained to respond to certain stimuli as easily as lab rats. They do not, however, prove jack or squat about how the system "improves" teaching-- only that it improves teacher response to the system.

TNTP recommends staying the course. I recommend that TNTP release a dozen honey badgers into their offices and hold some special training meetings on top of the credenza. If the credenza is all covered up with the Widget Effect's birthday cake, just feed the cake to the badgers. Tell them they're celebrating one of the most influential reports of the last five years.