Saturday, July 26, 2014

College Students vs. Faux Journalism

HuffPo recently ran what could be called a graphics-rich story (basically two big graphics plus captioning) that is sure to have some folks sounding the alarm bell, and there's no question that the data are striking.

The basic takeaway is this-- in no state in the US does a majority of students finish college in four years. Virginia is up top with 46%, with Nevada and DC bringing up the rear at 8.75% and 3% respectively. There's also a graphic for six-year graduation rates, but that picture isn't very pretty, either.

Of course, what's missing from the story is some perspective. So with some very quick and unsubtle help from my research assistant, Dr. von Google, I checked to see if this looked any worse than the US track record. I did nothing more strenuous than what any person with a computer, a desk, and a half hour to kill (or spend researching a story) could do.

Here's a 2010 piece by our old friend Kevin Carey at the awesomely named blog The Quick and the Ed. The title gives away the game-- "U.S. College Graduation Rate Stays Pretty Much Exactly the Same."

Carey makes two points. One is that looking at the percent of adults with college degrees doesn't show much movement in the US over time-- about 30% "four year" degrees plus another 10% associate degrees. That fits with this 2002 chart from the OECD showing that US degree attainment is about the same for the younger generation and the older one.
Carey's second point is that getting graduation rates in X years is hard because colleges generally know who finished at their own school, but not whether their drop outs successfully finished college elsewhere. Carey then goes on to explain the rather convoluted means by which federal statisticians come up with such a figure, depending on something called BPS.

The source for the infographics was a site called Find the Best, a fun little operation that crunches numbers for everything under the sun. But my search of the site turned up neither this particular project nor the methodology for it-- all we know is that they used IPEDS data. So, grains of salt at the ready.

I found other interesting charts and data sets, like this one looking at college completion rates for African American students/athletes.

That came from the same article as this chart, with the article acknowledging that students who dropped out and finished elsewhere counted against the institution at which they started.

And an abstract of this paper from just a couple of years back, which I am sure to revisit, that suggests a couple of things:

    * From 1979 to 1997 there is a growing gap between rich and poor students in terms of college entrance, persistence, and graduation
    * There is also a growing gender gap-- women are outpacing men
    * However, the inequity gap grew much more sharply among women than men

Or there's this graphic, from CAP of all places.

In fact, simple, straightforward data about college completion rates are not all that prevalent, suggesting that this is yet another conversation we're having without the benefit of lots of facts.

But more than that, I want to point out that once again, we're looking at lazy reporting. It has been literally forty-five minutes since I sat down and started working on this story. How hard would it be for someone who is doing journalism as their actual Real Job to spend some time adding some context, nuance, and data to a story instead of just saying, "Wow-- cool graphics. And I can write the whole story in one sentence." Yes, I realize criticizing HuffPo's journalistic standards is a little like criticizing Arctic beaches, but being HuffPo is not an excuse to be lazy.

This is a complicated issue, from the assumptions we start with (exactly why is it critical that a college degree be completed in four years?) to the data we look at (how can we really know how many people started and finished when they move around so much?). It deserves more than a quick couple of infographics that by themselves don't tell us much of anything.
