
Thursday, May 21, 2015

Is NAEP Really a Benchmark?

The recent Achieve study (the one with the Honesty Gap) is just the most recent example of someone using the NAEP (National Assessment of Educational Progress) as a benchmark test, as the gold standard of Sorting Students Out.

But not everybody agrees that the NAEP (aka "the nation's report card") is a good measure of, well, anything. Google "NAEP fundamentally flawed" (people seem to love that Fundamentally Flawed verbiage when discussing NAEP) and you'll find lots to chew on.

Most debate centers on the test's achievement levels. Folks don't care for how they're set; many critics find them irrationally high. In 1999, the National Academy of Sciences released Grading the Nation's Report Card. I can't direct you to a free copy of that to read, but I can summarize second-hand the basic arguments brought against the NAEP.

1) The results don't match the results of AP testing, finding fewer top students than the AP test does.

2) The NAEP gives more weight to open-ended questions.

3) The cut score lines are drawn in vague and hard-to-justify ways. The NAS puts it this way: "You can't tell whether a kid just over the line will or won't answer a particular question correctly."

These arguments are not perfectly convincing. The Center for Public Education found, for instance, that NAEP's people had a pretty clear idea of how they were setting achievement levels.

A more damning report came from NCES way back in 2007, in turn looking back at students and test results in the nineties. That time span allowed researchers to do what folks looking at PARCC or SBAC still have not done-- follow up on later successes from the students. Here's a look at what the class of 1992 had done by the time eight years had passed.



NAEP Score    | No Degree | Certificate | Assoc. Degree | Bachelor’s degree or higher
Below Basic   | 61.6      | 9.9         | 10.5          | 18.0
Basic         | 37.7      | 3.8         | 9.0           | 49.5
Proficient    | 18.1      | 0.4         | 2.5           | 79.0
Advanced      | 7.5       | 0.2         | 1.3           | 91.1


Note that 50% of students judged Basic went to college and earned a degree. It's almost as if they were, in fact, college and career ready. And in fact that is a frequent complaint about NAEP level setting-- that their "Basic" is everybody else's idea of "Proficient." Which would certainly explain the finding that state tests find far more proficient students than the NAEP does.

By 2009, deep into the reformy swamp, the government asked for another audit of NAEP, and got this report from the Buros Institute at the University of Nebraska. The report had some issues with NAEP as well:

1) No real validity framework, meaning no real framework for determining what the test actually measures or what the data from the test can actually be used for.

2) No other tests, including various state tests, find the same results. This suggests that either NAEP has a singular unmatched vision, or it's out of whack.

3) There's no demonstration of alignment between NAEP and state standards and tests, which means using the test for matters such as, say, Achieve's Honesty Gap study, has no basis.

4) All this means that many "stakeholders" don't really know what they're looking at or talking about when it comes to NAEP scores.

My conclusion? The NAEP, like all other standardized tests, best functions as a measure of how well students do at the task of taking this particular standardized test. As soon as you start trying to figure out anything else based on the test results, you're in trouble. That includes writing fancy reports in which you suggest that states have an honesty gap.

Friday, May 15, 2015

Honesty: The Hot New Gap (With Anti-CCSS Bonus)

A new report from Achieve.org doesn't provide a lot of information, but it has opened up a great talking point Gap-- ladies and gentlemen, may we introduce the Honesty Gap!

The report, "Proficient vs. Prepared: Disparities between State Tests and the 2013 National Assessment of Educational Progress (NAEP)" -- well, actually, that title pretty well covers it. Achieve compared Big Standardized Test results to NAEP results.

Achieve, you may recall, was one of the groups instrumental in creating Common Core and foisting it on American schools. So we can't be surprised when their stance is somewhat less than objective.

Today’s economy demands that all young people develop high-level literacy, quantitative reasoning, problem solving, communication, and collaboration skills, all grounded in a rigorous and content-rich K-12 curriculum. Acquiring these skills ensures that high school graduates are academically prepared to pursue the future of their choosing.

Two sentences. The first one sounds lovely, if rather limited, and is an opinion that I'm sure many folks share (at least in part). The second is another iteration of the unproven belief that such a list of qualities will lead to academic preparation. But then, in the next passage, in bold typeface, Achieve makes a huge, huge leap.

Many state tests, however, continue to mislead the public about whether students are proficient. Parents, students, and teachers deserve transparency and accuracy in public reporting.

This statement assumes and implies that "proficient" is a measure of students' development of the skills listed above. It is not. It is a score from one badly designed, non-validated Big Standardized Test that does not have a hope of measuring any of those high-function skills (not to mention "collaboration," which is of course expressly forbidden).

I do like the call for transparency. Does this mean that Achieve is going to call for an end to the Giant Cone of Secrecy around the test, and that states should no longer be required to serve as enforcement arms for protecting the proprietary rights of test manufacturers over the educational interests of students? No, I didn't think so.

BS Tests are measuring tools that have never been checked. It's like somebody holds up a length of string and says, "Yeah, that is what I imagine a yard should be, more or less" without ever grabbing a yardstick. Now, Achieve is shocked-- shocked!!-- to discover that the various states' pieces of string aren't exactly a yard long.

But their framing of it is, well, exquisite. States that have BS Test scores that come (somehow) in line with their NAEP scores are called the Top Truth-Tellers. The big gap states are not called Top Dirty Rotten Liars, but hey, if the shoe fits. This raises a few questions, such as how one compares the state-level BS Tests with the NAEP (maybe, it seems, just by counting the number who pass or fail).

More importantly, it raises this question: if the NAEP is the gold standard for measuring all that cool stuff about student achievement, why don't we just use it and scrap all the state-level BS Tests?

Reformsters are skipping right past that to The Honesty Gap. It's a more formal version of the old assertion that schools and teachers are just lying to their students and ed reform has to include telling parents and students that they and their schools and their teachers all suck.

Not surprisingly, the Honesty Gap has shown up in pieces by Mike Petrilli at Fordham and at the Reformster Website To Which I Will Not Link. And those pieces are not a surprise because the Honesty Gap has recently launched its very own website!! Woo hoo!! That website was launched by The Collaborative for Student Success, an advocacy group with most excellently reformy partners,
including the Fordham Foundation, the US Chamber, and even-- oh, look! Also Achievethecore.org. All of which explains why Honesty Gap uses much of the same rhetoric to highlight the data from the Achieve.org report.

[Update: Oh, wow. The full-scale product rollout includes a new hashtag #HonestyGap on twitter, where you can find all your favorite reformy hucksters tweeting about how parents deserve the truth!]

Man-- it's like the group is so loaded with money that every time they want to launch a new talking point, they give it its own glitzy website. Meanwhile, I am typing about it while eating my convenience store fiesta chicken wrap at lunch. It's an amazing world.

So what's the end game of this particular self-supporting PR blitz? Maybe the secret is here in the third of the Achieve report's "findings"--

A number of states have been working to address proficiency gaps; this year, even more will do so by administering the college- and career-ready-aligned Smarter Balanced and PARCC assessments.

The dream of a national assessment, a BS Test that waves its flag from shore to shore-- that dream still lives! See, states? You insisted on launching your own tests and dropping out of PARCC/SBAC, and that's just because you're lying liars who lie the giant big lies. Come back home to the warm bosom of a giant, national-scale test!


Here's one funny thing about the Achieve report. There's a term that turns up on the Honesty Gap website, but in twelve pages of the original Achieve report about being prepared and proficient etc etc, these two words do not appear once-- Common Core.

It's funny. Even a year ago, I hated the Core pretty passionately. But I start to feel sorry for it-- given the need to choose between Core and charters, Core and political advantage, or Core and testing, people keep picking the Core last. Poor orphaned useless piece of junk.


Sunday, November 23, 2014

Core Ready Schools, Aspen and Achieve


So you want to know how your district is doing on the implementation of the Common Core? Well, the folks at Achieve and the Aspen Institute have a tool for you. It's Core Ready Schools, 
a handy tool for evaluating your school's progress in implementation that only misses one huge, gigantic, Uranus-sized indicator. But let me work up to it.

There is a whole 90-minute rollout presentation on video right here and I know I usually watch these things for you, but I couldn't quite get through all of it. But let me tell you about what I did get through, and if you actually want to watch the whole thing, drop me a note in the comments and let me know how it was. Because who knows-- it might not have been quite as mind-numbing as I began to fear it was.

The video opens with a nice lady from Aspen who covers a bunch of specs and screenshots about the-- well, she keeps calling it an app, but it appears to be a website. Also big thanks to the Bill and Melinda Gates foundation, and their Program Officer, which I infer is a person from the foundation who comes and works with you on your program so that you don't have to do that nasty application process, and for some reason I'm thinking of the Roman system of local governors, but maybe we should leave this for another day. What's this thing actually for? Well, it's not an accountability tool (I know because she said so). Let's bring up Mike Cohen from Achieve to talk.

Mike from Achieve talks about Achieve's Core cred and says, "I feel like the Home Depot of the Common Core." Nobody laughed, and he took that to mean that nobody got the joke at all. "It's a tough crowd this morning."

Anyway, Achieve was concerned about a lack of data and tools to monitor implementation. They needed a way to get data on how implementation was going at the state level. Their first tries they gave up as too hard. But then somehow everybody realized that Aspen had already kind of done the work, with their handy transition guide for school leaders, and so the Core Ready Schools app-site-tool covers similar ground.

Core Ready Schools is aimed at things you would want to monitor, and that chiefs at CCSSO would commit to using. Something lightweight, but with depth; common across states, but flexible enough for individual states. Here are the seven factors Mike says (the site calls them "levers") the tool is designed to consider.

1) Is leadership focusing on CCSS as part of school improvement?
2) Is instruction being aligned with it?
3) Is ongoing professional development supporting CCSS?
4) Do you have an aligned assessment system?
5) Do you have aligned instructional resources and curriculum in school?
6) Do you have mechanisms for engaging families and communities? (Because you're going to have to get them to buy into this, so by "engage" we seem to mean "talk to" and not "listen to.")
7) And are there sufficient resources and staffing (technology)?

The tool is supposed to allow for different states' emphases and ways to collect data. Mike tells a story about how one school chief was just going to ask superintendents how things were going and not dig any deeper. "Don't you think there will be inflation?" "Yes, but then they'll have a harder time explaining results on assessments." So, give a principal enough rope? With this not-an-accountability tool?

Mike also says, "They desperately need it to know what's going on-- there's no debate about that." I would be happy to debate him. Also, though this started as a Common Core thing, they've been flexing it to handle states' other CACR standards. Because we'd hate to get left behind when the Core is dumped.

Do you know what we haven't talked about?

I said there was a glaring omission. So far it appears that when we're assessing the success of our Common Core implementation, we are not going to ask if the students in the schools seem to be getting a better education. That seems to be primarily because we assume that if Common Core is well-implemented, it will automatically lead to better test scores (what? is there some other way to measure how well children are being educated?). But no-- at no point in this entire process do we actually look at the effect of Core implementation on student learning.

Who is this for again?

This tool fits the whole reformster style because it assumes that superintendents simply can't know what is going on in their own districts, presumably because of some combination of stupidity and lying subordinates. Also, of course, information is far more informationny when it's in number form.

The big selling point here is that this tool will be useful inside of districts, helping leaders tell how well the implementation is going on inside the district. This skips over the question of whether we should implement CCSS in the first place, plus it skips over another question-- when school leaders are implementing a program because it has been mandated and they had no say in it, how much time do they spend worrying about implementing it well? Or, on the deeper philosophical level, how much commitment to doing a bad thing well is a good amount of commitment to doing a bad thing well? Or, if you prefer classic filmic references, exactly whom should we be rooting for in Bridge on the River Kwai?

Never mind that for a moment, because I'd like to offer for your consideration the user agreement from the Core Ready Schools website:

By clicking the button below, you agree to have your anonymized survey results recorded by Anabliss Design + Brand Strategy and shared with the Aspen Institute. The Aspen Institute reserves the right to utilize the data in research, analysis, and reporting on the implementation of the CCSS and other education-related trends; however, the Aspen Institute agrees that any data disclosed will be anonymized data that is not tied to specific users and is not released in any manner that could identify an individual, school, or school district.

So, NOT just collecting data for your district. You're also collecting data for Aspen and their friends. You are volunteering for a walk-on role in the next production of "How the Implementation of Common Core Is Going in Our Schools." Yes, it's one more great chance to do free grunt work for our Data Overlords.

More fun with websites

One cool thing about Core Ready Schools? Anybody can log in and create an account. You could, for instance, sign in and start the account for your school or district; I did that, and I'm sure my superintendent will be calling to thank me if she ever hears about it. I suppose you could log in and start an account for any school-- even fake ones-- although if a lot of people did that, it might make Aspen's aggregated numbers less accurate, and that would be a shame, I imagine.

You're allowed to take the survey ten times over five years. I found the questions simple and the interface easy to navigate. There are just a handful of self-assessment questions for each "lever," and most of them are unexceptional. The program occasionally reveals its blind spots. One of the questions about instruction asks about how well teachers understand and use the Core, and it does not allow for the possibility that teachers are familiar with the Core but don't use it because they don't want to. Everything in the survey assumes that we all want to welcome the Core into our home and make it happy here and that its success will naturally flow into educational awesomeness and joy for all. There is no "You're not my real mom" option.

The whole effect is very Borgian, and it reveals an extremely specific view of exactly how a school should be assimilated into the Core universe. This is no surprise. Since the Core is a one-size-fits-all prescription for students, why would it not come with a one-size-fits-all tool for the school districts implementing it? Yes, Aspen is promising customizable versions of this tool (for a price), but this is customizable in the same manner as a fast food burger-- you can change the balance of the elements a bit, but you'll be choosing how to tweak the ingredients that the restaurant has chosen for you. So, not very customizable at all (kind of like that "personalized" education we keep hearing about).

So if your school district decides to sign on to this handy tool, God bless you and have fun. Thank you for making a contribution to the giant holding cells of our Data Overlords. And remember-- student learning is irrelevant to the process.