Sunday, January 29, 2017

OR: Protecting the Tests

Oregon has a law-- House Bill 2713-- that directs their Secretary of State to conduct an audit of "use of statewide summative assessment in public schools in this state." It's an audacious, wacky move-- don't just implement the Big Standardized Test, but actually check back and do some studies to see if it's a big waste of money or not.

The audit was actually released back in September of 2016, to what appears to be not very much fanfare or attention. I ran across it only because of an op-ed published earlier this week. So this is definitely Not Breaking News. But I'm always intrigued when a state actually bothers to see if their reformy measures are doing any good or not, and Oregon has just started out with the Smarter Balanced Assessment folks, so I've decided to take a look at the report.

Here are some of the findings:

The new tests are more expensive. In 2013-2014, Oregon shelled out $5.2 million to run the statewide Big Standardized Test. In 2014-2015, that leapt up to $10.2 million. $8.2 million was for the test and the scoring thereof. $1.8 million was for "membership fees." Who knew that belonging to the Smarter Balanced Assessment Consortium was like belonging to a really fancy country club? Also, the assertion is out there that this is a lowball-- it does not account for the in-house costs for Department of Ed supervision and administration of the test.

The audit declares that "statewide results are a measure of school performance." They say that "organizations" that use test results to "facilitate learning and improvement" can "deliver better outcomes." This is all part of using "measurement information" as part of a "broader performance management framework," and a lot of other baloney that comes straight from corporate management consultant boilerplate.

But the audit noted that some people have concerns about the testing, like "how certain student populations experience the test." Or all the time lost to testing. Many of the folks surveyed had some thoughts about how to improve the whole business. Yeah, I'll bet they did.

Having said that, the audit goes on to say that "the test benefits and purposes are not always clear." People who thought they knew what the test is for gave conflicting answers. Parents want to know what the test is for. Teachers want to know why they're mandated to give a test that has no use for the classroom.

Can you guess the audit's response to that widespread understanding that the BS Test is a purposeless waste? Of course you can-- "The department could clarify its message about the purpose of the test and take a more active communications role." We saw the same thing back when Common Core was still fighting for its zombie half-life. When your product is a dud and everyone is telling you it's a dud and it's proving it's a dud by failing to do any of the things you said it was going to do, why, then you have a PR problem, and you just need to sell your dud of a product harder.

Oh, and then there's this finding:

Smarter Balanced results are not consistently used in ways that provide clear benefits to everyone. 

Yes, we remember this from Common Core as well-- the product is great but you're implementing it all wrong.

Survey respondents identified current and potential limitations to using data, such as untimely results, uncertainty about how to use results, different skill levels in interpreting data, and a lack of complementary resources. 

The committee forgot to include "the data does not actually represent any useful insights into student knowledge or instruction."

The report slips in one suggestion-- perhaps we need more assessment. "Comprehensive assessment systems" would provide more data, and therefore be more wonderful. We could throw in common state-level formative and interim assessments on top of the summative ones, and just standardized test the little buggers all the time. In fact, the SBA folks offer just such a larger testing package, just in case you're worried they're not getting enough Oregon tax dollars yet.

The audit also notes that a lot of folks think the test receives "too much emphasis." That may be because some folks feel there "are not clear benefits to the students and educators most affected by the test..."

Our next subheading signals that the audit committee will now drive directly toward the weeds.

The test demands more time and depth of knowledge

And once we've set our weedward course, the audit can start saying foolish things like "Because it assesses critical thinking and problem‐solving skills required by the Common Core State Standards..." which is a clause without a single True Thing in it. The test does not assess critical thinking and problem-solving skills, and neither does any other standardized test out there. That's okay, because the Common Core does not require any of those higher-order thinky skills, anyway.

The test does require a bunch of time, though. For ELA testing, most grades require more than three hours, while the math test takes up five or six. Would we make it longer?

Understandably, with so much time invested in the test, many are interested in receiving individual students’ results. In order to offer those results in detail, the test must ask more questions of each student, making it longer. A shorter test, focused solely on the health of the system, would provide less precise individual results.

Got that. The SBA test cannot tell you anything valuable about your individual student, meaning that all the bunk about using the test to determine whether your child is on track for college, or teachers using it as a diagnostic test to see what Chris needs to be taught-- that was all, in fact, bunk. The test we've got actually focuses "on the health of the system" which means God knows what.

Anyway, parents, stop asking for meaningful results for your specific kid. It's not going to happen. Quick-- somebody get a PR guy in here.

There were plenty of challenges with test administration. Turns out that when every student has to take the BS Test on a computer, but you only have so many computers in the building that work, it all turns into a huge mess that even better PR wouldn't solve. One quoted administrator noted that the school computer lab was tied up with testing from March through June. Notes the audit wryly, "We heard that having at least one computer for every student can be helpful."

Also test preparation and administration may have reduced available instruction time. May have. Or instructional time may have been reduced by localized time dilation fields. Or test preparation and administration actually increased instruction time by unleashing the power of black holes. Or maybe SBA tests come with a free time turner. May have??

[Image: Hi there! I'm your new testing administrator.]

Schools do not always understand test administration guidance or have access to information about best practices.

Well, actually, as I read through the explanation of this section, it doesn't seem like an "understanding" problem so much as a "thinking some of these directives are stupid" problem. Nobody may enter a testing room. Teacher interaction with children testing is limited to "do your best."

Also, the audit has forgotten what it said a page or two back, because this:

The department sets requirements for secure and valid testing to ensure that each student has a fair opportunity to demonstrate his or her abilities... 

We established earlier that these tests will not measure individual student ability. The second half of the sentence reminds us that the school will be judged (and rated and punished) based on these results, and the state wants to be sure they catch all the schools that need to be punished. So let those kids suffer.

The audit also notes that some districts are better prepared than others to take the test. But wait-- isn't the test supposed to measure the school's educational achievement? Why should "prepared for the test" even be a factor-- shouldn't the mere fact of being well-educated be enough to prepare students for a well-designed test?

The audit gives a whole big subheading to "Some student populations may experience more negative impacts than others."

For instance, Title I schools (aka poor schools) report they lose a lot of instruction time while trying to do test prep. Also, in high schools, students have the option of doing portfolios to meet Oregon's Essential Skills requirement, which means those students need to take the BS Test like a fish needs a high-powered Harley-Davidson.

The audit acknowledges that some students will not be accurately measured by the test, including English Language Learners and students with special needs. You can sit a blind student down at a computer with no modifications to click on answers she can't see, or you can force a student who barely speaks English to take a test in English, but if you think their results tell you anything real, you're delusional. This applies to the student who is functioning below grade level but who must be tested at grade level. This also goes for all the students who are disinclined to bother to try at all on the BS Test.

In other words, a lot of your test result data is junk.

The audit nods to the idea that BS Tests can't fulfill their beloved PR talking point purpose of identifying underserved schools and communities if the data doesn't actually mean anything.

The report comes with a page of recommendations at the end. There are twelve.

1) More PR
2) Check which PR is working and do more of that.
3) Also, more PR directed at parents
4) More guidance for schools on how to use the damn things and PR to make them happier
5) Badger company to get results back faster
6) Look for other data to combine test data with
7) Add more standardized tests more often
8) Badger company to fix all the technical issues
9) Have the state give better advice, because that will totally help with scheduling and facilities
10) Actually put in place feedback system so people can let us know about problems
11) Share happy stories about any place this stuff is actually doing some good (aka more PR)
12) Look for "opportunities to reduce individual impacts," or something.

Finally they get to methodology. Many reports like to do this, which is a pain if you're not experienced in this kind of fluffernuttery because you read through forty pages only to discover at the end that the "report" was generated by a chimpanzee using a Ouija board.

This report is a little bit better than the chimp method. Mostly they surveyed people. They surveyed lots of groups, including ethnic community groups (e.g. Asian Pacific American Network of Oregon), issue-specific groups (e.g. Disability Rights Oregon), the Oregon Education Association, some parent groups, and some reformster outfits that there's no good reason to include (e.g. Stand for Children). Mostly, they involved groups who do not have a track record of really challenging the test, or who are not normally players in the education biz. They interviewed people at the Oregon Department of Education, a department committed to the testing program, so I'm sure they were all about an objective look. They talked to the bosses at Smarter Balanced. They visited six whole public schools. Six! Way to get out there and see how things look on the ground, folks.

The survey was distributed to regular teachers and parents through a government mailing list and the Oregon PTA newsletter and Facebook page. Which means that non-parental taxpayers were completely skipped here. 799 parents, a few hundred administrators, and some teachers responded, and the audit concedes there's a response bias built in.

But mostly the report seems to have been built to rearrange a few deck chairs without ever questioning the course of the SS Standardized Test. It's one more example of how to "examine" the testing program and conclude that everything is actually just fine, no large changes needed, maybe increase the PR budget, and do even more testing. The audit does mention that Oregon parents have the choice to opt out of the test; they might want to remember that in a few months.

1 comment:

"The test demands more time and depth of knowledge." In NJ, my 8th graders will be taking an End of Course math test the last week of March. We are in school until June 23rd. Wish I could demand more time - like the 3 months until school ends - to increase my students' depth of knowledge. What a waste of millions of dollars.
