DeAngelis we've met before. He's a Fellow at the Cato Institute, a policy adviser for the Heartland Institute, and a Distinguished Working-on-his-PhD Fellow at the University of Arkansas, all of this built on a foundation of a BBA (2012) and MA (2015) in economics from the University of Texas at San Antonio (because nobody understands education like economists). And while plugging away on that master's, he worked first as the Risk Management Operations Coordinator and then the Fraud Coordinator for Kohl's. Patrick Wolf has several degrees in political science and has worked as an academic for most of his career.
And then they gave me directions to the unicorn farm.
This particular paper comes out of something called the School Choice Demonstration Project, which studies the effects of school choice.
A Good Investment: The Updated Productivity of Public Charter Schools in Eight U.S. Cities pretends to measure school productivity, focusing on eight cities-- Houston, San Antonio, New York City, Washington DC, Atlanta, Indianapolis, Boston, and Denver. In fact, the paper uses the corporate term ROI-- return on investment.
We could dig into the details here-- look at the methodology, break down the eight cities, examine the grade levels represented, consider their use of Investopedia for a definition of ROI. But that's not really necessary, because they use two methods for computing ROI-- one is rather ridiculous, and the other is exceptionally ridiculous.
Method One: Ridiculous
The one thing you can say for this method of computing ROI is that it's simple. Here's the formula, plucked directly from their paper so that you won't think I'm making up crazy shit:
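In plain terms, it boils down to dividing NAEP points by per-pupil funding in thousands of dollars. Here's a back-of-the-envelope sketch of that arithmetic, with every number invented for illustration:

```python
# Back-of-the-envelope version of the paper's "cost-effectiveness" ratio,
# as I read it: NAEP points per $1,000 of per-pupil funding.
# Every number below is made up for illustration.

naep_score = 250            # hypothetical average NAEP score
per_pupil_funding = 14_000  # hypothetical per-pupil revenue, in dollars

points_per_thousand = naep_score / (per_pupil_funding / 1_000)
print(f"{points_per_thousand:.1f} NAEP points per $1,000")  # -> 17.9
```

That's it. That's the whole measure of "productivity."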
The achievement scores here are the results from the NAEP reading and math, and I suppose we could say that's better than the PARCC or state-bought Big Standardized Test, but it really doesn't matter because the whole idea is nuts.
It assumes that the only return we should look for on an investment in schools is an NAEP score. Is that a good assumption? When someone says, "I want my education tax dollars to be well spent," do we understand them to mean that they want to see high standardized test scores-- and nothing else? Not even a measure of students improving on that test. The paper literally breaks this down into NAEP points per $1,000. Is that the whole point of a school?
We can further see the ridiculousness of this by taking the next step-- if I want to make my school more cost-effective, as defined by this paper, what could I do? Well, I could cut every expense that isn't directly involved in preparing students to take the standardized math and reading test-- programs, staff, teachers, the works. And I would make sure that my school was filled with students who are good test-takers, with a minimum of ELL students and students with special needs.
Is it any wonder that this paper finds charter schools more "cost-effective" than public schools-- the "more bang for the buck" that the Post praised?
Method Two: More Ridiculous
Since ROI really should focus on the amount of money you get out compared to what you put in, the authors decided to take this exercise one step further.
To monetize this measure, we convert the average learning gains produced by each public school sector to the economic return of lifetime earnings.
The income return to investment is the net present value of additional lifetime earnings accrued through higher cognitive ability as measured by test scores.
Does the standardized math and reading test measure cognitive ability? And if you get your score to go up, does that mean your cognitive ability goes up, too? And most of all, what magical piece of unicorn-fueled research tells us that higher test scores lead to more income over a lifetime? Well, if you've been at this for a while, you know one of two names is about to appear, and sure enough...
Stanford University economist Eric Hanushek has estimated that a one standard deviation increase in cognitive ability leads to a 13 percent increase in lifetime earnings.
"Estimated" would be the key word here, because this whole mini-field of research has yet to produce convincing evidence since the OG of predictive standardized test economics, Raj Chetty, first started this gravy-soaked baloney train. The tortured methods used here to show how much money students will benefit from the test scores is inspired baloney. I show it here for your edification:
Only 70 percent of gains in learning persist each year. If we multiply these two estimates together, we find the learning gains relative to the average worker in the state. By comparing the learning gains relative to the average worker in the state, we estimate the returns to the schooling investment in terms of yearly income while accounting for contextual features of the local markets. We use 2017 data from the United States Bureau of Labor Statistics to find state-level average annual earnings and assume that current students will work for 46 years between the ages of 25 and 70. When calculating the net present value of lifetime earnings, we assume a one percent yearly growth in average salaries and a three percent annual discount rate.
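Just to show how much machinery gets stacked on top of one contested estimate, here's my rough reconstruction of that recipe-- every input below is a stand-in I invented for illustration, not the paper's actual data:

```python
# Rough reconstruction of the paper's monetization recipe, as I read it.
# Every input below is a stand-in for illustration, not the paper's data.

avg_annual_earnings = 50_000  # stand-in for 2017 BLS state-level average earnings
gain_sd = 0.10                # hypothetical test-score gain, in standard deviations
persistence = 0.70            # "only 70 percent of gains in learning persist"
boost_per_sd = 0.13           # Hanushek: one SD of "cognitive ability" ~= 13% more earnings
salary_growth = 0.01          # the paper's assumed 1% yearly growth in average salaries
discount_rate = 0.03          # the paper's assumed 3% annual discount rate
work_years = 46               # the paper's 46 working years, ages 25 to 70

# "Multiply these two estimates together": fade the gain, then turn
# what's left into a percentage bump in yearly earnings.
earnings_bump = gain_sd * persistence * boost_per_sd

# Discount 46 years of slightly-bumped salaries back to present value.
npv = sum(
    avg_annual_earnings * (1 + salary_growth) ** t * earnings_bump
    / (1 + discount_rate) ** t
    for t in range(work_years)
)
print(f"NPV of 'additional lifetime earnings': ${npv:,.0f}")  # roughly $13,900 with these toy inputs
```

Swap in different toy inputs and the "return" swings by thousands of dollars, which tells you how much weight this investment math can actually bear.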
They use learning effect figures from a study conducted by the pro-charter CREDO that I cannot access online. They assume that a student will spend thirteen years in a charter, though many charters do not offer all thirteen years. And they assume they have a legit formula for computing dollars of future earnings based on standardized test scores.
What else could be wrong? Plenty, including an argument from an unexpected quarter.
Atlanta charters score high in this study because Atlanta has a big old cyber school, and if a cyber school is funded at a sensible level rather than at the full level of a bricks-and-mortar school (as is the case in Atlanta), that makes it look super-efficient. Except, of course, that study after study shows that virtual schools do a terrible job of actually educating students. But hey-- they're efficient.
But the efficiency study, particularly the second portion, suffers from one other major issue. There is likely to be a correlation between high test scores and later success in life, because both of those correlate heavily with the socio-economic status of the student's family. The real question is-- if we get a student to raise her standardized test score, will that improve her future? Are test results a good proxy for her future, and is improvement in those scores an indicator that her future has been improved? In other words, maybe we can get a student to raise her score-- but so what? Here are one person's thoughts:
If increasing test scores is a good indicator of improving later life outcomes, we should see roughly the same direction and magnitude in changes of scores and later outcomes in most rigorously identified studies. We do not. I’m not saying we never see a connection between changing test scores and changing later life outcomes; I’m just saying that we do not regularly see that relationship. For an indicator to be reliable, it should yield accurate predictions nearly all, or at least most, of the time.
That's Jay Greene, head honcho of the University of Arkansas Department of Education Reform. A couple of years ago he started casting some serious doubt on the idea that the BS Test is a good tool for accountability, mostly because there's no evidence that improved scores have any connection to improved life outcomes.
Or, in the terms of this new study, there's no reason to believe that what they are calling "returns" on investment are returns at all. Not only are test scores and barely-supportable score-based fairy tales about the future the wrong returns to focus on in education, they aren't even real returns at all. DeAngelis and Wolf haven't just focused on the wrong thing-- they've focused on a nothing. This isn't just zooming in on toenails-- it's zooming in on unicorn toenails.
Sigh. And yet Betsy DeVos and other reformsters are going to push this because the short headline form-- charters give more bang for your buck than public schools-- helps promote charters.
Thanks for another good blog post, Peter! Another thing I wondered about: isn’t NAEP a randomized assessment? Does NAEP break charter school data out from the rest? Maybe it does, but it doesn’t seem likely to me.