It's that time of year again, I guess, for the National Alliance for Public [sic] Charter Schools to issue its annual report on the Health of the Charter Public [sic] School Movement. They're scoring and ranking states, because remember-- when you go to the doctor, it doesn't matter how healthy you are, only whether you are more or less healthy than the other patients in the office.
There is some interesting-ish information to be gleaned from the report, but the report is 180 pages long, so if you want the full effect, you are going to have to read it all yourself. This time I'm not doing it for you.
What Are We Doing, and How Are We Doing It?
To make the report's List O'Charter Swellness, states had to meet a few criteria: At least 2% of public school students had to be charter-enrolled in 2014-2015, the state had to have participated in the CREDO study, and it had to have a system for "categorizing" schools.
Eighteen states made that cut and were then judged by four criteria. Two "quality measures" come from the CREDO 2013 report, which in turn uses data from 2011. So be warned-- a whole lot of the basis for this report's findings is not actually current stuff. In fact, unless I'm missing something, it's exactly the same data used in last year's Charter Health Report. So to get past that, NAPCS has added two new data criteria-- has the number of top-rated charters increased, and has the number of bottom-rated charters decreased?
They also wanted to come up with a way to include innovation, because innovation is a primo quality of charters, and yet oddly enough, they have found it's impossible to measure innovation in a standardized way. This is not the last time I'll feel as if the NAPCS had a chance to Learn An Important Lesson but just breezed on by. Yes, measuring innovative, creative, divergent thinking in a standardized way is basically impossible-- but we will still base most of our findings on student test scores from Big Standardized Tests. Sigh.
And, oh, look-- they're going to do it again--
Last, we acknowledge that our definition of a healthy movement is limited by what data we can collect across states. Several other elements of a healthy movement are not included here because we cannot measure them. But that doesn’t mean they are not important. For example, quality beyond test scores can be determined several ways, some of which are more qualitative in nature. A healthy movement needs to have charter schools that are not only succeeding on state tests but also knocking it out of the park on these other determinants of quality.
And so they go on to acknowledge that the whole basis of their rating system and all of NCLB/RTTT reforminess is a foundation built of rotting timbers and shifting sand, and they announce we should stop pretending that these junk ratings mean anything, and then they go on to fill the rest of the report with adult coloring book pages! Ha!! Just kidding. They say, "Well, yes, this system ignores most of the important parts of being a school, but we'll go ahead and use it anyway." That sound you just heard was my palm hitting my forehead.
Here's the weighting system they will use to rate things:
Items 1-8 are for Growth, 10-13 are for Quality, and poor little 9, with its measly weight of two, must represent for Innovation. So innovation is, apparently, not all that important after all. And we'll later learn that "innovations" include arts schools and Montessori/Waldorf schools and vocational schools and STEM schools and, for the love of God, No Excuses schools. So "innovation" can include ideas that have been around for decades, or which have quickly established themselves as bad ideas. And despite their concerns, they did manage to fit all the innovations into some standardized categories. Yay, innovation.
I'm also going to point out, as always, that the measuring of learning in days is bogus and a little bit silly. "Mrs. Bogwaller, we're happy to tell you that Chris is a full five days ahead, though we suspect all of those days are Fridays, so it may not be that great news." Is that learning-per-average day? Do we think some days are more learny than others, or is this a constant? Does a child learn the same amount on a birthday as on, say, a Sunday? How did anybody ever break learning down into days? Did somebody study a few thousand children and test them at the end of every single day to get an average learn-per-day figure? And exactly how did that researcher measure quantities of learning? Do you measure out learning by the gram, or by the liter, or by the meter, or do we measure out their lives in coffee spoons (and could we then--please-- name the learn-per-day units "prufrocks"?) Can we talk about single days of learning, or must they travel in a pack? And if we can measure that a student is a single day of learning ahead, how much further can we break that down? Hours? Minutes? Seconds?
Sorry. But the whole days of learning thing is just so silly, and proof once again that when education commentators want to be able to measure something, and can't, they will come up with all manner of solemn baloney to fake it.
So How Did the States Do?
The weights add up to thirty-three with four possible points for each, for a grand total of 132 possible points. Each state gets a score and a rank, and congratulations, Washington DC-- you are first with 106 points, leaving Indiana a distant second with a mere 88. From there we plummet down to last-place Oregon, with a skimpy 45 points. There are some interesting details here. Massachusetts, which is still enduring a wrestling match between charter-loving leaders and the entire actual public school system-- Mass comes in ahead of charter-lovin' Louisiana and Reformster Jeb! Paradise Florida. Ohio, which is either a dreamy charter wild west or a nightmarish charter trainwreck depending on who's assessing-- Ohio is way down at #13.
The report also spends some time holding the Health Checkup rankings against the State Charter Public [sic] School Law Ranking, but there's nothing earth-shattering there and watching a group looking for a correlation between their made-up ranking system and their other made-up ranking system turns out to be as much fun as watching the ink on a charter contract dry. So let's move on.
Pennsylvania-- An Example of a Health Report
The report uses the vast bulk of its pages to take a state-by-state look at charter health, and that includes the states beyond the 18 that made the cut (though not all fifty-- if you don't even have a charter law in place yet, you're not in this report. Sorry, Kentucky. Also, if your charter law was thrown out by your Supreme Court. Sorry, Washington.)
I'm going to walk through the report on Pennsylvania, because that's where I am. This will give you an idea of some of the pitfalls in the report. You can decide on your own whether you want to sneak a peek underneath your state's charter hospital gown.
I'll tell you up front that Pennsylvania's data will reflect that we are a haven for crappy cyber-charters. I'm betting that is why, for instance, the percentage of charters on the state's naughty list went from 60% to 66%. Why the state doesn't just shut down these cyber-cesspools of educational malpractice is a mystery for another day.
There are other bullet points about the Keystone state, but the report writers also have some nicely designed charts for your perusal, covering the same data in a more graphically delightful manner and following the layout of the chart above. They really have done a good job of formatting things so that it's easy to follow the same ideas all the way through.
PA has 7% of our students in charters, and charters make up 6% of our schools.
The comparison breakdown of race and ethnicity is, well, kind of useless. They compare the charters against the state, but in any state where the population varies as much as ours, that's meaningless. Pennsylvania is very rural except for the parts that are very urban, and very non-white except for the parts that are very white. In other words, while the state student population may be 73% white, 13% Black, and 9% Hispanic as a whole, I'd be surprised if you could find any community in the state that matches that demographic breakdown. So comparing charter demographics to that means nothing-- the only comparison that really matters is whether or not charters are educating the same population as the local school, and one of the secrets to charter success continues to be making sure that they do NOT try to educate the same population as local schools.
Fun factoid-- PA charters are far more centered in cities and suburbs than public schools. 25% of our schools are rural (a nearby district educates about 400 students in one K-12 building serving half of the entire county-- that rural) and no charters other than the cybers have figured out how to make bank serving that population. And because local districts are still the main authorizers, volunteers willing to slit their own financial throats are few and far between.
The rate of charters opening has been slowing down. The rate of closing is a little more stable, but overall rising. About 50% of PA charters have an "innovative" special focus.
Remember the report about how cyber-charters move students backwards? That figures in this report and undoubtedly really hurt the charter numbers for "number of days of learning," which are hugely negative. And while the numbers for charters in the top and bottom categories of PA's school rating system are also dismal, I will give them a pass because our school evaluation system is a hot, ugly mess.
Oh, and we finish with some unscored data, including the data that 27% of our charter students are cyber-students, which is lower than I would have guessed. Now I should probably go back and remove all the places where I blamed crappy charter results on the cybers-- apparently plenty of those Philly charters are able to stink up the place all on their own.
I give NAPCS credit for reporting data without trying to hide it, spin it, or obscure it behind too many piles of smoke and mirrors, just as I give some folks in the charter movement credit for having figured out that if they don't clean up their own bad actors, the whole industry is going to look worse and worse (I think "worse and worse" is inevitable for the current incarnation of the charter industry, but that's another conversation). This report has some interesting-ish data collected in one handy spot, and it's worth a few minutes to check out the picture of your state and see just how diseased you are.