This week Education Next ran an article entitled "The First Hard Evidence on Virtual Education." It turns out that the only word in that title which comes close to being accurate is "first" (more about that shortly). What actually runs in the article is a remarkable stretch by anybody's standards.
The study is a 'working paper" by Guido Schwert of the University of Konstanz (it's German, and legit) and Matt Chingos of Brooking (motto "Just Because We're Economists, That Doesn't Mean We Can't Act Like Education Experts"). It looks at students in the Florida Virtual School, the largest cyber-school system in Florida (how it got to be that way, and whether or not it's good, is a question for another day because it has nothing to do with the matter at hand). What we're really interested in here is how far we can lower the bar for what deserves to be reported.
The researchers report two findings. The first is that when students can take online AP courses that aren't offered at their brick-and-mortar schools, some of them will do so. I know. Quelle surprise! But wait-- we can lower the bar further!
Second finding? The researchers checked out English and Algebra I test scores for the cyber-schoolers and determined that their tenth grade test results for those subjects were about the same as brick-and-mortar students. Author Martin West adds "or perhaps a bit better" but come on-- if you could say "better" you would have. This is just damning with faint praise-by-weasel-words.
West also characterizes this finding "as the first credible evidence on the effects of online courses on student achievement in K-12 schools" and you know what? It's not. First, you're talking about testing a thin slice of tenth graders. Second, and more hugely, the study did not look at student achievement. It looked at student standardized test scores in two subjects.
I know I've said this before. I'm going to keep saying this just as often as reformsters keep trying to peddle the false assertion used to launch a thousand reformy dinghies.
"Standardized test scores" are not the same thing as "student achievement."
"Standardized test scores" are not the same thing as "student achievement."
When you write "the mugwump program clearly increases student achievement" when you mean "the mugwump program raised some test scores in year X," you are deliberately obscuring the truth. When you write "teachers should be judged by their ability to improve student achievement" when you mean "teachers should be judged by students' standardized test scores," you are saying something that is at best disingenuous, and perhaps a bit of a flat out lie.
But wait-- there's less. In fact, there's so much less that even West has to admit it, though he shares that only with diligent readers who stick around to the next-to-last paragraph.
The study is based on data from 2008-2009. Yes, I typed that correctly. West acknowledges that there may be a bit of an "early adopter syndrome" in play here, and that things might have changed a tad over the past five years, so that the conditions under which this perhaps-a-bit-useless data was generated are completely unlike those currently in play. (Quick-- what operating system were you using in 2008? And what did your smartphone look like?)
Could we possibly reveal this research to be less useful? Why, yes-- yes, we could. In the last sentence of that penultimate graf, West admits "And, of course, the study is also not a randomized experiment, the gold standard in education research." By "gold standard," of course, we mean "valid in any meaningful way."
So there you have it. Education Next has rocked the world with an account of research on six-year-old data that, if it proves anything at all, proves that you can do passable test prep on a computer. And that is how we lower the bar all the way to the floor.
Thursday, August 21, 2014
More Bad Polling News For CCSS
While we're making note of how Common Core is tanking in the Education Next and PDK/Gallup polls, let's pull out one other poll from earlier in the summer. This one also used the word "plummets," which has become a serious contender for leading the Common Core Headline Word Bank.
Conducted and released in June of 2014, the Rasmussen Reports national phone survey checked the support for the Core among a very specific population-- those with children in elementary or secondary school.
Once again, we can see the result of a year's worth of direct exposure. In November of 2013, the Core was supported by an unimpressive 52% and specifically opposed by 32%. By the following June, the numbers had shifted. Among parents of school-age children, support dropped to 34%, while actual opposition to the Core (which the survey referred to as the Common Core national standards) had grown to 47%.
The message is the same as revealed in the other polls currently making PR use of the word "plummet"-- direct experience of the Common Core and the various barnicular educational attachments that come with it does not make people love it better.
This poll is not news, but back in June, we couldn't see so clearly that it was the harbinger of a trend. This is the opposite of a grass roots movement, the reverse of going viral. This is like the movie that opens strong on Thursday and plays to empty theaters on Friday. Common Core's one big remaining hope was that people might experience it and say, at the very least, "Well, this wasn't so bad. I don't know why people were fussing." Instead, the reaction is more along the lines of "Damn, that really does suck."