Friday, June 5, 2020

Do Exit Exams Reduce Crime?

Of all the claims made about high school exit exams, this has to be one of the most unlikely, but here comes Matthew Larsen in the ever reformy Education Next to argue that exit exams--tests that are required to get a diploma--reduce crime.

Larsen is an assistant professor at Lafayette College in Pennsylvania. He's in the economics department, because of course he is. He set out to look at whether exit exams or increased course requirements had an effect on crime statistics. "Conventional belief holds that more and better-quality education reduces crime," he reasons. "Could exit exams improve teaching and learning in high schools such that criminal activity drops?"

The answer he came up with is that exit exams reduce the arrest rate, mostly for property crimes, but that increased course requirements do not. How did he come up with such sexy findings?

He collected FBI arrest data for 15-24 year olds from 1980 to 2010. He assumed that everyone was committing crimes in the same state in which they attended high school, and that they graduated when they were 18. There was also some estimating going on, like estimating the general age and gender distribution of the jurisdictions of the various police departments.

After that, it's pretty basic. If the state implemented exit exams in 2000, Larsen compares the arrest rates for the people who graduated before 2000 with those of the people who graduated afterwards. Larsen claims that by including both cohorts in the same year (e.g., the arrest records for 2003 would include cohorts that graduated both before and after the test was implemented) he eliminates other factors, like police department staffing. Except, of course, those overlap years would be a relatively small set within the 1980-2010 span. Nor does it correct for a variety of other factors--graduates from, say, 1998, lived in a different world than those graduating in, say, 2002. Larsen claims to adjust, somehow, for factors like average teacher-pupil ratio, average teacher salary, and per-pupil expenditures, and I suppose that economists may have magical tools that can do this, but that still leaves us with the problem that 1) he does not know how the arrest rates break down by high school and 2) no students attended schools with average conditions. The average height of people in my house is about 4.5 feet, but if you buy us clothes to fit that height, they won't fit anyone who lives here.
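
If you want to see how thin this kind of comparison is, here's a minimal sketch of the cohort comparison Larsen describes. Every number and name below is made up for illustration--this is not Larsen's data or his actual model, just the basic shape of the exercise:

```python
# Sketch of a same-year cohort comparison (hypothetical numbers).
# Idea: within one arrest year (say, 2003), compare cohorts that
# graduated before the exit exam took effect with cohorts that
# graduated after, so that year-specific factors (e.g. police
# staffing) are shared by both groups.
arrests_2003 = {
    # graduation_year: (arrests, cohort_population) -- all invented
    1998: (1200, 50000),  # pre-exam cohort
    1999: (1150, 50000),  # pre-exam cohort
    2001: (1000, 50000),  # post-exam cohort
    2002: (950, 50000),   # post-exam cohort
}

def rate(cohorts):
    """Pooled arrest rate across a list of (arrests, population) pairs."""
    arrests = sum(a for a, _ in cohorts)
    pop = sum(p for _, p in cohorts)
    return arrests / pop

pre = rate([v for y, v in arrests_2003.items() if y < 2000])
post = rate([v for y, v in arrests_2003.items() if y >= 2000])
print(f"pre-exam arrest rate:  {pre:.4f}")   # 0.0235
print(f"post-exam arrest rate: {post:.4f}")  # 0.0195
```

Notice what the comparison can't see: nothing in it distinguishes "the exam changed these kids" from "anything else that differed between the 1998 and 2002 cohorts changed these kids."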

But Larsen makes his computations and somehow feels confident enough to write this:

I assume that, after making these adjustments, the only major difference between students from different graduation cohorts is that one group faced tougher graduation requirements.

Which is kind of nuts.

Once he gets a'crunchin', Larsen finds that there's no real effect for course requirements, and that the exit exam effect is greatest for poor white kids.

Larsen doesn't offer many compelling explanations for any of this. Maybe, he muses, the exit exams cause "more advantaged" students to get smarter and more knowledgeable and therefore commit fewer offenses, but make the poors drop out and turn to a life of crime (no thoughts about any of the biases or arrest patterns for advantaged vs. disadvantaged students). Maybe the pressure of exit exams makes schools do a better job. Or maybe the exams "boost the perceived value of a high-school diploma." Or students improve their attendance patterns because they want to get ready for those exams.

Curiously, though several states have dropped the exit exam requirement, Larsen apparently did not do any research to see if arrest rates went up afterwards.

My explanation for Larsen's results is that his data is filled with so much noise that any conclusions fall somewhere between "improbable" and "silly." Even if he has somehow scraped real data from the jaws of a gazillion different factors that could explain the results, we're still staring straight into the face of our old friend, spurious correlations.

Spurious correlations is a glorious website (and also a book) that helps illustrate why mistaking correlation for causation causes nothing but trouble. Some of these almost, kind of, if you squint, make sense.

But others are clearly ridiculous. And yet there's the chart, with numbers and chartiness so it must be, you know, science. Oh, Nic Cage-- when will the madness stop?
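
This is easy to demonstrate for yourself. Two series that have nothing to do with each other will correlate almost perfectly if they both just happen to trend over time. The numbers below are invented purely for illustration (the real charts are on the Spurious Correlations site):

```python
# Two made-up series that both trend upward over a decade.
import statistics

nic_cage_films = [2, 2, 3, 3, 4, 4, 5, 5, 6, 6]                     # invented
pool_drownings = [98, 100, 105, 104, 110, 112, 118, 117, 124, 126]  # invented

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

r = pearson(nic_cage_films, pool_drownings)
print(f"r = {r:.3f}")  # very close to 1, and yet no causation anywhere
```

A correlation near 1.0 from two lists of numbers that share nothing but an upward drift. Numbers and chartiness, but not, you know, science.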

It's a weak argument, weakly supported. The only upside here is that I haven't seen the news of exit exams' crimefighting magic trumpeted from the websites of the usual suspects. Let's hope it fades quietly away, like a mediocre Nic Cage movie.

3 comments:

  1. Exit exams decreasing crime rates? Poppycock! However, they do make teeth (and shirts) whiter, windows cleaner, and people who claim that they reduce crime look stupider.

  2. Too laughable for even the usual suspects to pick this up.

    I actually don't think it's noise. I think it's crack. Not that he's on. But there is an epidemic in the midst of one of his data sets. Which means that each of the following would provide correlation: Having viewed the movie Titanic. Remembering 9/11. Watching Bill Clinton's impeachment hearings. Watching Spongebob. Childhood use of Facebook.

    I think having a spurious correlation website is like cheating - much more fun to find on your own. Or for those of us whose reading levels are more appropriate for graphic displays of people talking, Randall Munroe to the rescue: https://xkcd.com/925/

  3. Stat geek here. Be careful of criticizing "noise in the data," since random noise in the data would not be a problem. In fact, if the data is highly variable due to random sources of variability and he still found a statistically significant relationship between exams and crime, that would actually make it a stronger relationship, since it would have to be strong enough to overcome all the noise. The problem isn't random noise, it's non-random noise: actual causal factors that are covarying with the presumed cause he wants to make a case for. If you know of likely covariates, you can statistically suck those out of the data, which, based on what you say, sounds like he did. But you can't suck out the ones you don't know about or didn't measure somehow. And there are an infinite possible number of those, especially in this case.
