Friday, November 27, 2015

Accelerated Reader Research Part 2

A little while ago I took a look at this silly piece of faux research from the Accelerated Reader people. But there was one puzzle I couldn't quite solve.

The study was reported as concluding that just a few minutes more reading time would produce fabulous results, but I wondered exactly how the researchers knew how much time the readers had spent on their independent reading.

Much ado is made in the report about the amount of time a student spends on independent reading, but I cannot find anything to indicate how they are arriving at these numbers. How exactly do they know that Chris read fifteen minutes every day but Pat read thirty? There are only a few possible answers, and they all raise huge questions.

In Jill Barshay's Hechinger piece, the phrase "an average of 19 minutes a day on the software" crops up. But surely the independent reading time isn't based on time on the computer-- not when so much independent reading occurs elsewhere.

The student's reading minutes could be self-reported, or parent-reported. But how can we possibly trust those numbers? How many parents or children would accurately report, "Chris hasn't read a single minute all week"?

Or those numbers could be based on independent reading time as scheduled by the teacher in the classroom, in which case we're really talking about how a student reads (or doesn't) in a very specific environment that is neither chosen nor controlled by the student. Can we really assume that Chris reading in his comfy chair at home is the same as Chris reading in an uncomfortable school chair next to the window?

Nor is there any way that any of these techniques would consider the quality of reading-- intensely engaged with the text versus staring in the general direction of the page versus skimming quickly for basic facts likely to be on a multiple choice quiz about the text.

The only other possibility I can think of is some sort of implanted electrodes that monitor Chris's brain-level reading activity, and I'm pretty sure we're not there yet. Which means that anybody who wants to tell me that Chris spent nineteen minutes reading (not twenty, and not eighteen) is being ridiculous.


When I was working on the piece, I tweeted at the AR folks to see if they could enlighten me. I didn't get an immediate response, which is not significant, because it's twitter, not registered mail. But I did hear back from them a bit later, and they directed me to this explanation from one of their publications about AR (it's on page 36).

The Diagnostic Report also shows a calculation called engaged time. This represents the number of minutes per day a student was actively engaged in reading. To calculate this number, we look at the student’s GE score on STAR Reading and how many points the student has earned by taking AR quizzes. We compare that to the number of points we can expect the student to earn per minute of reading practice. Then we convert the student’s earned points to minutes. 

For example, let’s say Joe Brown has a GE score of 6.5. Our research tells us that a student of his ability can earn 14 points by reading 30 minutes a day for six weeks. Joe has earned only seven points. Thus we estimate Joe’s engaged time to be only 15 minutes a day. 

If a student’s engaged time is significantly lower than the amount of time you schedule for reading practice, investigate why. It could be that classroom routines are inefficient or books may be hard to access. Since low engaged time is tied to a low number of points earned, see the previous page for additional causes and remedies.

So, not any of the things I guessed. Something even worse.

They take the child's score on their proprietary reading skill test, they look at how many points the child has earned on AR quizzes, and they consult their own best guess at how long a student with that score would take to earn that many points-- and that's how much time the child must have spent reading!
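To make the math concrete, here's a quick sketch of what that arithmetic amounts to. This is my own reconstruction in Python, not anything Renaissance Learning publishes; the 14-point expectation comes straight from their Joe Brown example, and the linear scaling of points to minutes is the assumption baked into their method.

```python
# A minimal sketch, assuming quiz points scale linearly with reading minutes,
# of the proportional arithmetic AR describes. My reconstruction, not
# Renaissance Learning's actual code; their real expected-points lookup by
# GE score is not published.

def estimated_engaged_minutes(earned_points: float,
                              expected_points: float,
                              scheduled_minutes_per_day: float = 30.0) -> float:
    """Back out 'engaged time' from earned vs. expected quiz points."""
    return scheduled_minutes_per_day * (earned_points / expected_points)

# The report's example: a GE 6.5 student is expected to earn 14 points by
# reading 30 minutes a day for six weeks; Joe earned only 7 points.
print(estimated_engaged_minutes(earned_points=7, expected_points=14))  # 15.0
```

Notice that the whole estimate hinges on that expected-points figure-- which is exactly the unexplained "research" I ask about below.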

What if something doesn't match up? What if the AR reverse-engineered time calculation says that Chris must have taken thirty minutes of reading to get that score, but you gave Chris an hour to read? Well then-- the problem is in your classroom. Chris is lollygagging or piddly-widdling. Or the books are on too high a shelf and it took Chris a half hour to get one. Whatever. The problem is not that AR's calculations are wrong.

And of course this doesn't so much answer the question as push it up the line. Exactly what research tells you that a student with STAR rating X must spend fifteen minutes reading to earn Y points on the AR quiz?

My confidence in the Accelerated Reader program is not growing, and my confidence in their research skills, procedures, and conclusions is rapidly shrinking.

2 comments:

  1. Peter, I agree with your criticism: the "reading minutes" within AR are not true minutes; they're a guesstimate. As an elementary school teacher, I've found AR to be a valuable tool, especially in helping motivate kids to read more and in differentiating between "piddly-widdling" and reading. While those 'minutes' aren't real minutes, and the grade-level calculations fall apart rapidly, there is a huge difference between no detectable reading and frequent reading. Kids have a great amount of choice within AR. When you refer to AR's list of titles, something much shorter than the 129,000 books they have quizzes for comes to mind. And yup, comprehension questions are at the bottom of Bloom's taxonomy. Many essays have been written poking fun at the work of students who don't understand what they read. Marie Clay called reading (aloud) without comprehension "barking at print." I value AR because it allows me to challenge students, even very low readers, to read books they choose, and to learn the difference between understanding what they read and merely pronouncing words (out loud or silently). Thanks for this particular challenge, and a great blog in general.

    1. Thanks for reading. I suspect that AR is a useful tool in the hands of a good teacher who controls its use and is not controlled by the program.
