There are several outlets that have made names for themselves by playing the ranking game. These rankings may, I suppose, be used by folks trying to make a decision, but some days I think their primary uses are A) PR for the winners and B) shortcuts for writers who want to toss a quick "Pootwaddle High School is highly ranked" into an article.
US News is the granddaddy of school ranking publications, with their annual listing of the tippy toppest high schools in the US. It's a list I recommend we all ignore, based on the methodology by which it's built. They depend heavily on the widely debunked VAM sauce, and-- well, it's not just that the system can be gamed; in recent years US News gave the #4 spot to a NYC school that doesn't actually exist.
Another prodigious Ranker of Stuff is WalletHub, a site that focuses mainly on credit, but somehow got itself into the listicle biz. That includes ranking the 50 states (plus DC) for education. It's a list I've seen referenced several times in the past few days, but I think it can be safely ignored.
The problems, again, are with methodology. Ranking all the states according to educational quality is a pretty complicated business, the kind that almost nobody actually wants to bother with, so WalletHub has whipped up a list of proxies for quality, and they aren't great ones.
Out of 100 points, 20 are assigned for Safety, and that score is based on reported rates of various naughty behaviors like bullying and "participating in violence." There is also this sort-of-hilarious category: High School Students With Access to Illegal Drugs. Vague enough to be 100% everywhere, and based on self-reported numbers from students, this seems likely to be data noise.

Looking at these categories ("Incarceration rate") also highlights the problem of lumping state systems together. There are around 500 school districts in PA. Some schools are located in big cities; some are located in corn fields. Lumping them together creates the same problems as averaging. As one wag noted on Twitter this week, Betsy DeVos and I own an average of five yachts each (she has ten; I have zero), but we could also say that among education workers of our generation (Betsy and I are about the same age), we've collectively accumulated ten yachts.
When we get to the actual quality measures, which make up the other 80 of the 100 points, we find more problems.
WalletHub counts the number of schools in the US News top 700. Will that include the imaginary schools, or just the real ones? WalletHub also counts Blue Ribbon schools. It counts the high school graduation rate for low-income students, but as DC has dramatically shown us, you can graduate a whole bunch of students without even making them show up. The dropout rate counts. And of course, they count math and reading scores on the Big Standardized Tests, because everyone loves BS Test scores as proxies for academic success, even if there's no reason to believe they actually measure it. Never mind-- they're nice neat numbers, so we'll make do.
Then we throw in AP exam results (how many scored 3 or higher) and median scores for the SAT and ACT, plus the share of graduates who took those tests, as well as "division of results by percentile." So the pricey BS Tests count for a lot.
Finally, there's the pupil-teacher ratio and the share of licensed public K-12 teachers.
WalletHub also reports some of the top and bottom states for some of the specific measures, so there's that. But mostly, your state is a great state for education if your students are good test takers.
So remember-- when you see a WalletHub ranking of states' education quality, just keep walking on by. We'll talk about their ranking of states' teacher-friendliness another time.