Friday, September 21, 2018

Field Guide To Bad Education Research

Folks in education are often criticized for not using enough research-based stuff. But here's the thing about education research-- there's so much of it, and so much of it is bad. Very bad. Terrible in the extreme. That's understandable-- experimenting on live young humans is not a popular idea, so unless you're a really rich person with the financial ability to bribe entire school districts, you'll have to find some clever ways to work your research.

The badness of education research comes in a variety of flavors, but if you're going to play in the education sandbox, it's useful to know what kinds of turds are buried there.

The Narrow Sampling

This is the research that provides sometimes shocking results-- "Humans Learn Better While Drinking Beer." But when you look more closely, you discover the sample lacks a little breadth-- say, fifteen male college juniors in an Advanced Psychology course at the University of Berlin. These may be experimental subjects of convenience; the above researcher may have been a U of B grad student who worked as a TA for the Advanced Psychology course.

Generally these narrow projects yield results that are not terribly useful, but if you're out shopping for research to back whatever you're selling, these can often provide the "research base" that you wouldn't otherwise find.

The Meta Study

Meta research involves taking a whole bunch of other studies and studying the studies in your study. The idea is to find patterns or conclusions that emerge from a broad field of related research. Meta research is not automatically bad research. But if the meta researcher has gone shopping for studies that lean in his preferred direction, then the pattern that emerges is-- ta-da-- the conclusion he went fishing for.

This is a hard thing to check. If you know the literature really well, you might look for which studies are not included. But otherwise just keep a wary eyeball out.

The Not Actually A Study

These are cranked out pretty regularly by various thinky tanks and other advocacy groups. They come in nice, slickly packaged graphics, and they are not actual research at all. They're position papers or policy PR or just a really nicely illustrated blog post. There are many sleight-of-hand tricks they use to create the illusion of research-- here are just two.

Trick One: "Because there are at least ten badgers in our attic, many of the neighbors took to drinking pure grain alcohol." There will be a reference for this sentence, and it will provide a source for the number of badgers in the attic. Nothing else, including the implied cause and effect, will be supported with evidence.

Trick Two: "Excessive use of alcohol can lead to debilitating liver disease. The solution is to sell all alcoholic beverages in plastic containers." References will shore up the problem portion of the proposal, establishing clearly that the problem is real. Then the writers' preferred solution will be offered, with no evidence to support the notion that it's a real solution.

The Not Actually A Study is also given away by the list of works cited, which tends to consist of other non-studies from other advocacy groups (or, in the case of ballsy writers, a bunch of other non-studies from the same group). No real academic peer-reviewed research will be included, except a couple of pieces that shore up unimportant details in the "study."

The Thousand Butterfly Study

Like studies of other human-related Stuff (think health-related studies), education studies can involve a constellation of causes. When researchers study data from the real world, they may be studying students over a period of time in which the teaching staff changed, new standards were implemented, administration changed, new standardized tests were deployed, new textbooks were purchased, the cafeteria changed milk suppliers, a factory closed in town, a new video game craze took off, major national events affected people, and any number of imponderables occurred in student homes. The researcher will now try to make a case for which one of those butterflies flapped the wings that changed the weather.

Some butterfly researchers will try to create a compelling reason to believe they've picked the correct butterfly, or, what is more likely, they will try to make a case that the butterfly in which they have a vested interest is the one with the power wings. This case can't help but be shaky; this is a good time to follow the money as well as the researcher's line of reasoning.

The worst of these will simply pretend that the other butterflies don't exist. The classic example would be everyone who says that the country has gone to hell since they took prayer out of school; crime rates and drug use and teen pregnancy, the argument goes, have all skyrocketed as a result of the Supremes' decision-- as if nothing else of importance happened in 1962 and 1963.

The Bad Proxy Study

Education research is tied to all sorts of things that are really hard, even impossible, to actually measure. And so researchers are constantly trying to create proxies. We can't measure self-esteem, so let's count how many times the student smiles at the mirror.

Currently the King of All Bad Proxies is the use of standardized test scores as a proxy for student achievement or teacher effectiveness. It's a terrible proxy, but what makes matters worse is the number of researchers, and journalists covering research, who use "student achievement" and "test scores" interchangeably as if they are synonyms. They aren't, but "study shows hummus may lead to higher test scores" is less sexy than "hummus makes students smarter."

Always pay attention to what is being used as a proxy, and how it's being collected, measured, and evaluated.

The Correlation Study

God help us, even fancy-pants Ivy League researchers can't avoid this one. Correlation is not causation. The fact that high test scores and wealth later in life go together doesn't mean that test scores cause wealth (wealth later in life and high test scores are both a function of growing up wealthy). The best thing we can say about bad correlations is that they have given rise to the website and book Spurious Correlations.
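If you want to see the trap in miniature, here's a toy simulation (mine, not anybody's actual study, and the numbers are made up) in Python: give each imaginary kid a "family wealth" number, let that one number drive both their test scores and their adult wealth, and watch the scores and the adult wealth correlate nicely even though neither one causes the other.

```python
import random

random.seed(0)

# Hypothetical confounder setup: family wealth drives BOTH test scores
# and adult wealth. Test scores never touch adult wealth directly.
n = 10_000
family_wealth = [random.gauss(0, 1) for _ in range(n)]
test_scores = [w + random.gauss(0, 1) for w in family_wealth]
adult_wealth = [w + random.gauss(0, 1) for w in family_wealth]

def pearson(xs, ys):
    """Plain Pearson correlation coefficient, no libraries needed."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Prints roughly 0.5 -- a healthy-looking correlation produced
# entirely by the common cause, with zero direct causation.
print(pearson(test_scores, adult_wealth))
```

A researcher staring at only those two columns of data could write a very confident paper about test scores driving future earnings, and be entirely wrong.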

Just keep saying it over and over-- correlation is not causation.

The Badly Reasoned Study and The Convenient Omission Study

For the sake of completeness, these old classics need to be included. Sometimes the researchers just follow some lousy reasoning to reach their conclusions. Sometimes they leave out data or research that would interfere with the conclusion they are trying to reach. Why would they possibly do that? Time to follow the money again; the unfortunate truth of education research is that an awful lot of it is done because someone with an ax to grind or a product to sell is paying for it.

The Badly Reported Study

Sometimes researchers are responsible and nuanced and careful not to overstate their case. And then some reporter comes along and throws all that out the window in search of a grabby headline. It's not always the researcher's fault that they appear to be presenting dubious science. When in doubt, read the article carefully and try to get back to the actual research. It may not be as awful as it seems.

Keep your wits about you and pay close attention. Just because it looks like a legitimate study and is packaged as a legitimate study and seems to come from fancy scientists or a snazzy university-- well, none of that guarantees that it's not crap. When it comes to education research, the emptor needs to caveat real hard.

