Friday, July 28, 2017

Research Shmesearch

In what will come as practically no surprise at all to people who work in schools, a recent survey suggests that peer-reviewed research does not have much effect on what ed tech products a school district purchases.


The survey (ironically not itself a piece of peer-reviewed research) comes with the unsexy title Role of Federal Funding and Research Findings on Adoption and Implementation of Technology-Based Products and Tools. It's also entirely fitting that the group that produced this, as reported by EdWeek Market Brief, "emerged from a symposium staged earlier this year by Jefferson Education Accelerator, a commercial project that pairs education companies with school districts and independent researchers; and Digital Promise, a nonprofit that tries to promote the effective use of research and technology in schools." So, a "study" by people who have a stake in the result. In fact, maybe not so much "study" as "market research."

At any rate, the survey covered 515 respondents in 17 states. 27% were teachers, with the rest a mix of administrators and district tech folks. The internet-based survey went out as a link on social media, so not exactly a random sample here.

The study launched on the notion, "Hey, the government is spending a bunch of money funding studies so it can collect evidence-based stuff on its What Works Clearinghouse website. Do you suppose that anybody in school districts cares about either the results or the standards used?"

The answer, apparently, is "no."

While 41% would give "strong consideration" to a program with peer-reviewed research, only 11% would rule the product out if there were no such research. Respondents were less impressed by "gold standard" research. Hardly anyone in the sample was impressed by non-peer-reviewed research. (Nothing in the study shows whether respondents can tell the difference.)

Bonus points to the study for asking if respondents cared if the research were performed on students comparable to their own. It's a good question to ask-- too much "education" research has been performed on subjects of convenience, giving us findings about learning among small samples of college sophomores.

For perspective, we can note that several items were far bigger dealbreakers than peer-reviewed research. A whopping 29% said they would not buy a program unless the data output was accessible. 19% would reject a program if it were not customizable, or if the data were not interoperable with district programs. 16% would reject a program if privacy options couldn't be customized or if the program was not useful for students with disabilities. 13% would rule out a program unless implementation support was available. So all those things-- more important than peer-reviewed research.

Research Lead Dr. Michael Kennedy (University of Virginia) provided some additional interpretation to Ed Week:

“There’s a disconnect between what researchers think is high-quality research and what school districts think,” he said in an interview. Despite school officials’ interest in weighing evidence, for many, their attitude is, “when push comes to shove, I’m buying what I’m going to buy,” said Kennedy.

Emphasis mine, because duh.  The report itself also includes some quotes from respondents indicating that a federal stamp of approval isn't that big a deal.

If the product was developed using federal grant dollars, great, but the more important factor is the extent to which it suits our needs.

Features and functionality are what I look for. Endorsement from the feds is nice icing on the cake – But cake still tastes pretty good with or without that icing.

In other words, district officials trust their own eyes first. (Also, mmmm, cake.) Kennedy also points out that the create-research-review cycle can take so long that the product is obsolete by the time it's recommended. He also allowed that research can be so narrow that it only "proves" a product works in very specific situations, and if those situations aren't the ones your district is dealing with, what good is the research? Not that vendors don't frequently pitch ed tech with some variation of, "Well, if you just change your circumstances and environment and procedures and goals, this product will be just perfect for you."

Which suggests at least one more reason that districts don't pay a lot of attention to research-- it is most commonly encountered as part of a sales pitch. The report discusses these ed tech products in almost neutral tones, as if districts are just deciding which flower to pick from their garden. But in fact what we're talking about is a host of vendors trying to sell a product, and in that context we all know that whatever research is included is there to serve the sales pitch. Is there research suggesting that the Edtech Widgetmaster 5000 has no real effect on student achievement? It's a sure bet that the Edtech Widgetmaster sales rep will not be bringing up those studies.

In short, we all know that everything presented to us is presented to help make a sale. Of course all the research makes the product look good-- because it's chosen by the company selling the product. If a used car salesman tells you the engine in the car you're looking at is just great, are you going to take his word for it, or are you going to look under the hood yourself?

But we're talking about federally-backed research! Surely we can trust the feds to be impartial. Man, I could only just barely type that whole sentence. As the last decade-and-change has shown us, the feds are just as invested in selling their own products and views as any corporation (in fact, they're often busy selling a corporation's product for them).

In the end, a wise school district does not let "But the research!!" drown out the still small voice of "caveat emptor." The report recommends that school districts lean more heavily on research, and even that policy makers consider twisting districts' arms in this regard. Since policy makers have consistently ignored the research about vouchers and cyber-schools and Big Standardized Tests, I wouldn't hold my breath waiting for them to jump on this boat (unless their favorite corporation wanted them to force districts to buy the corporate product).

The people who have the best idea of what needs have to be met in a school district are the people who work in that school district. Research is nice, but if you're doing the buying for your district, using your own eyeballs and brain parts and advice from your people is still the best way to approach edtech vendors. When it comes to our own classrooms, we are the experts, and the research that matters most is our own.

2 comments:

  1. "If a used car salesman tells you the engine in the car you're looking at are just great, are you going to take his word for it, or are you going to look under the hood yourself?"

    Well, it wouldn't matter if I looked under the hood. Unless the engine was literally falling out, I wouldn't know what I was looking at. But then, that's why I wouldn't buy a used car on my own without an expert along to check the engine for me.

    But the problem with so much edtech is that the people who get to do the buying are people who have no business looking under the hood of a car, and for some reason they rarely bother to bring an expert along with them.

  2. I think this is a fascinating piece, and so well done. In our district, the "advice from your own people" doesn't seem to reach down to the people in the classroom. Research is talked about fairly frequently by the admins, presenting all sorts of glowing data, but I'm not sure people really understand it, as it is frequently highly flawed or being misinterpreted.
