Courtesy of the Journal of Policy Analysis and Management (and not any kind of education related journal) comes a new piece of headline-generating baloney research built to make charter schools look good.
"Charter High Schools' Effects on Long-Term Attainment and Earnings" comes to us from researchers at Georgia State University, Vanderbilt University, and Mathematica Policy Research (always a reliable source of Gates-funded/friendly research), with funding from the Joyce Foundation, and appears to be a revisit of some earlier Mathematica research-shaped product. Their conclusion, coming soon to a headline near you, is that charters lead to more college attainment and more money.
How, you may wonder, can anybody actually research such a thing? After all, the big problem of any research on human stuff is finding a control. We can say that Chris went to a charter school, then went to college, then got a great job as VP of Widgetary Development at World Wide Widgets. But none of that tells us what would have happened to Chris if Chris had attended a public school instead. And unless we can find an exact doppelganger of Chris to follow along an alternate path, or a means of slipping into an alternate universe, we have no way of knowing.
The researchers acknowledge this by opening their methodology section with the phrase "Determining the impact of charter high schools is not easy..." which is true. It is also the last thing they will say in plain English throughout the entire methodological description. Seriously-- I just spent my lunch hour with Les Perelman's BABEL nonsense generator, and this seems like it might have come from that same source.
Okay, maybe it's not that bad. They do write this:
The fact that the charter students and their parents actively seek an
alternative to traditional public schools suggests the students may be
more motivated or their parents may be more involved in their child's
education than are the families of traditional public school attendees.
They probably should have quit right there and called it a day. But they didn't. They tried to correct for selection bias, and here's some of what they had to say about that. Here they are rejecting one method:
Two recent studies (Furgeson et al., 2012; Tuttle et al., 2013)
have demonstrated that longitudinal analyses of test score impacts that
control for pretreatment test scores can closely replicate randomized
experimental impact estimates for the same students. But this approach
cannot be used to measure long-term outcomes such as graduation, college
enrollment, college persistence, and employment, because those outcomes
do not occur before a student's enrollment in a charter school.
They talk about how to generate a strong comparison group, which involves looking at charter eighth graders, because reasons. This creates some external validity problems, but in their opinion, the sacrifice is worth it for increased internal validity. They seem to think this is good because it catches students before the transition to high school. Does this not make sense yet? Well, this should settle it for you:
To further deal with potential endogeneity, we also use a matching approach popularized by Rubin (1977) and Rosenbaum and Rubin (1983).
While matching procedures can take many forms, we use a one-to-one
nearest-neighbor Mahalanobis matching approach (also referred to as a
covariate match) in which we match on observable characteristics to
create a control group. We then examine difference in student outcomes between those in treatment relative to this counterfactual control group.
Also, there is math.
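For the non-statisticians, here's roughly what that matching jargon means in practice. This is a toy sketch of my own (in Python, with made-up covariates like an eighth-grade test score), not the researchers' actual code or data: for each charter ("treated") student, you find the most similar public school student, where "similar" is measured by Mahalanobis distance, which accounts for how the covariates vary and correlate.

```python
import numpy as np
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)

# Hypothetical covariates (say, 8th-grade test score and a family
# background index) -- purely illustrative, not the study's data.
treated = rng.normal(loc=[0.5, 0.3], size=(5, 2))        # charter students
control_pool = rng.normal(loc=[0.0, 0.0], size=(50, 2))  # public school pool

# Mahalanobis distance scales by the pooled inverse covariance matrix,
# so no single covariate dominates just because of its units.
pooled = np.vstack([treated, control_pool])
VI = np.linalg.inv(np.cov(pooled, rowvar=False))
dist = cdist(treated, control_pool, metric="mahalanobis", VI=VI)

# One-to-one nearest-neighbor match without replacement: each charter
# student gets the closest not-yet-used public school student.
matches = []
used = set()
for i in range(len(treated)):
    j = next(j for j in np.argsort(dist[i]) if j not in used)
    used.add(j)
    matches.append(j)

# The "impact" estimate would then be the average outcome difference
# between each treated student and their matched control.
```

The catch, of course, is that you can only match on *observable* characteristics-- the motivated-parent factor the researchers themselves flagged never shows up in the covariates, so it never gets matched away.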
The actual data used came from Florida, which covers both high school graduation and, for anyone who has unemployment insurance records, data about employment earnings. The research centered on four cohorts in eighth grade between 1998 and 2002. So this research should be very meaningful, because, really, not much has changed in education in Florida in the last 15-18 years, right?
I tried to answer that, but much of Florida's charter info only goes back ten-ish years. Florida's modern charter law was passed in 2002 (Jeb! Bush was governor from 1999-2007), replacing the first version from 1996. In the 1998-1999 school year, there were a total of 67 charter schools in Florida, and only 20 of those had an eighth grade. Total charter students-- 9,135. By 2001-2002, charters had ballooned to 176, with over 40K students. But still-- I'm wondering just how large a sample the researchers were able to pull out of that.
On top of that, charters were relatively small potatoes, which means that charter students would have been a not-at-all-average group. I'm not a statistician or scholarly researcher (nor do I play one on TV), but I can't escape the notion that the same kind of parental support and push and resources that would get a student into a charter school (particularly back then) would be the same kind of parental support and push and resources that would get a student through high school and into college.
In other words, I am once again inclined to conclude that a lot of very fancy researchers and scholars do a lousy job of distinguishing between correlation and causation.
I will gladly accept input from anyone who is actually a trained statistical design scholar, but until someone I can trust tells me otherwise, I'm going to conclude that this is high-priced baloney served on a silver platter.