The holidays are over, life is back to normal(ish), and your classroom has hit that post-holiday stride. It is time to finally make your voice heard on the subject of teacher preparation programs.
As you've likely heard, the USED would like to start evaluating all colleges, but they would particularly like to evaluate teacher preparation programs. And they have some exceptionally dreadful ideas about how to do it.
Under proposed § 612.4(b)(1), beginning in April, 2019 and annually thereafter, each State would be required to report how it has made meaningful differentiations of teacher preparation program performance using at least four performance levels: “low-performing,” “at-risk,” “effective,” and “exceptional” that are based on the indicators in proposed § 612.5 including, in significant part, employment outcomes for high-need schools and student learning outcomes.
And just to be clear, here's a quick summary from § 612.5:
Under proposed § 612.5, in determining the performance of each teacher preparation program, each State (except for insular areas identified in proposed § 612.5(c)) would need to use student learning outcomes, employment outcomes, survey outcomes, and the program characteristics described above as its indicators of academic content knowledge and teaching skills of the program's new teachers or recent graduates. In addition, the State could use other indicators of its choosing, provided the State uses a consistent approach for all of its teacher preparation programs and these other indicators are predictive of a teacher's effect on student performance.
Yes, we are proposing to evaluate teacher prep programs based on the VAM scores of their graduates. Despite the fact that compelling evidence and arguments keep piling up to suggest that VAM is not a valid measure of teacher effectiveness, we're going to take it a step further and create a great chain of fuzzy thinking to assert that when Little Pat gets a bad grade on the PARCC, that is ultimately the fault of the college that granted Little Pat's teacher a degree.
Yes, it's bizarre and stupid. But that has been noted at length throughout the blogosphere already. Right now is not the time to complain about it on your Facebook page.
Now is the time to speak up to the USED.
The comment period for this document ends on February 2. All you have to do is go to the site, click on the link for submitting a formal comment, and do so. This is a rare instance in which speaking up to the people in power is as easy as using the same device you're using to read these words.
Will they pay any attention? Who knows. I'm not inclined to think so, but how can I sit silently when I've been given such a simple opportunity for speaking up? Maybe the damn thing will be adopted anyway, but when that day comes, I don't want to be sitting here saying that I never spoke up except to huff and puff on my blog.
I just gave you a two-paragraph link so you can't miss it. If you're not sure what to say, here are some points to bring up:
The National Association of Secondary School Principals has stated its intention to adopt a document stating clearly that they believe that VAM has no use as an evaluation tool for teachers.
The American Statistical Association has stated clearly that test-based measures are a poor tool for measuring teacher effectiveness.
A peer-reviewed study published by the American Education Research Association and funded by the Gates Foundation determined that “Value-Added Performance Measures Do Not Reflect the Content or Quality of Teachers’ Instruction.”
You can scan the posts of the blog Vamboozled, the best one-stop shop for VAM debunking on the internet, for other material. Or you can simply ask how a college education department can possibly be held accountable for the test scores of K-12 students.
But write something. It's not very often that we get to speak our minds to the Department of Education, and we can't accuse them of ignoring us if we never speak in the first place.