Thursday, May 21, 2015

Is NAEP Really a Benchmark?

The recent Achieve study (the one with the Honesty Gap) is just the latest example of someone using the NAEP (National Assessment of Educational Progress) as a benchmark test, as the gold standard of Sorting Students Out.

But not everybody agrees that the NAEP (aka "the nation's report card") is a good measure of, well, anything. Google "NAEP fundamentally flawed" (people seem to love that Fundamentally Flawed verbiage when discussing NAEP) and you'll find lots to chew on.

Most debate centers on the test's achievement levels. Folks don't care for how they're set; many critics find them irrationally high. In 1999, the National Academy of Sciences released Grading the Nation's Report Card. I can't direct you to a free copy of that to read, but I can summarize secondhand the basic arguments brought against the NAEP.

1) The results don't match those of AP testing; the NAEP finds fewer top students than the AP test does.

2) The NAEP gives more weight to open-ended questions.

3) The cut score lines are drawn in vague and hard-to-justify ways. NAS pins this down to "You can't tell whether a kid just over the line will or won't answer a particular question correctly." (See the sketch after this list.)
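
To make that last complaint concrete, here's a minimal sketch, assuming a simple one-parameter (Rasch) item response model-- NAEP's actual scaling is more elaborate, but the logic is the same. A kid just above a cut line and a kid just below it are nearly indistinguishable on any single question. The cut score, item difficulty, and ability gaps below are hypothetical numbers, purely for illustration.

```python
import math

def p_correct(ability: float, difficulty: float) -> float:
    """Rasch model: chance that a student at this ability level
    answers an item of this difficulty correctly."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# Hypothetical values for illustration only.
cut = 0.0     # the "Proficient" line on the ability scale
item = 0.0    # an item pitched right at that line

print(f"kid just below the line: {p_correct(cut - 0.05, item):.3f}")  # ~0.488
print(f"kid just above the line: {p_correct(cut + 0.05, item):.3f}")  # ~0.512
# Either kid is basically a coin flip on this question, which is the
# NAS complaint: the cut line doesn't predict item-level success.
```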

These arguments are not perfectly convincing. The Center for Public Education found, for instance, that NAEP's people had a pretty clear idea of how they were setting achievement levels.

A more damning report came from NCES back in 2007, looking at students and test results from the nineties. That time span allowed researchers to do what folks looking at PARCC or SBAC still have not done-- follow up on the students' later successes. Here's a look at what the class of 1992 had done by the time eight years had passed.



NAEP Score     No Degree   Certificate   Assoc. Degree   Bachelor's Degree or Higher
Below Basic       61.6         9.9           10.5                  18.0
Basic             37.7         3.8            9.0                  49.5
Proficient        18.1         0.4            2.5                  79.0
Advanced           7.5         0.2            1.3                  91.1

(All figures are percentages of students at each NAEP level.)


Note that nearly half (49.5%) of students judged Basic went on to earn a bachelor's degree or higher. It's almost as if they were, in fact, college and career ready. And in fact that is a frequent complaint about NAEP level-setting-- that NAEP's "Basic" is everybody else's idea of "Proficient." Which would certainly explain the finding that state tests find far more proficient students than the NAEP does.
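
If you want to check the arithmetic yourself, here's a quick sketch, with the numbers transcribed from the Basic row of the table above:

```python
# Percentages from the "Basic" row of the table above.
basic_row = {
    "no degree": 37.7,
    "certificate": 3.8,
    "associate degree": 9.0,
    "bachelor's or higher": 49.5,
}

any_credential = sum(pct for outcome, pct in basic_row.items()
                     if outcome != "no degree")
print(f"Basic students earning any credential: {any_credential:.1f}%")  # 62.3%

bachelors = basic_row["bachelor's or higher"]
print(f"Basic students earning a bachelor's or higher: {bachelors:.1f}%")  # 49.5%
```

More than sixty percent of "Basic" kids walked away with some postsecondary credential, and nearly half with a four-year degree.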

By 2009, deep into the reformy swamp, the government asked for another audit of NAEP, and got a report from the Buros Institute at the University of Nebraska. That report had some issues with NAEP as well:

1) No real validity framework, meaning no clear basis for determining what the test actually measures or what the data from it can legitimately be used for.

2) No other tests, including various state tests, find the same results, which suggests that either NAEP has a singular unmatched vision or it's out of whack.

3) There's no demonstration of alignment between NAEP and state standards and tests, which means using the test for exercises such as, say, Achieve's Honesty Gap study has no basis.

4) All this means that many "stakeholders" don't really know what they're looking at or talking about when it comes to NAEP scores.

My conclusion? The NAEP, like all other standardized tests, best functions as a measure of how well students do at the task of taking this particular standardized test. As soon as you start trying to figure out anything else based on the test results, you're in trouble. That includes writing fancy reports in which you suggest that states have an honesty gap.

1 comment:

  1. Eye-opening. Interesting and important stuff, especially that no other test shows the same results. The lack of replication means it's not scientific, and therefore not valid-- certainly not objective. I agree with your conclusion. I also conclude that the results of any standardized test seem to be arbitrary and therefore pretty much meaningless.