Friday, March 13, 2015

In Pennsylvania, we rate schools with the School Performance Profile (SPP). Now a new research report reveals that the SPP is pretty much just a means of converting test scores into a school rating. This has huge implications for all teachers in PA because our teacher evaluations include the SPP for the school at which we teach.
Research for Action, a Philly-based education research group, just released its new brief, "Pennsylvania's School Performance Profile: Not the Sum of Its Parts." The short version of its findings is pretty stark and not very encouraging--
90% of the SPP is directly based on test results.
90%.
SPP is our answer to the USED waiver requirement for a test-based school-level student achievement report. It replaces the old Adequate Yearly Progress of NCLB days by supposedly considering student growth instead of simple raw scores. It rates schools on a scale of 0-100, with 70 or above considered "passing." In addition to being used to rate schools and teachers, SPPs get trotted out any time someone wants to make a political argument about failing schools.
RFA was particularly interested in looking at the degree to which SPP actually reflects poverty level, and their introduction includes this sentence:
Studies both in the United States and internationally have established a consistent, negative link between poverty and student outcomes on standardized tests, and found that this relationship has become stronger in recent years.
Emphasis mine. But let's move on.
SPP is put together from a variety of calculations performed on test scores. Five of its six elements-- which account for 90% of the score-- "rely entirely on test scores."
Our analysis finds that this reliance on test scores, despite the partial use of growth measures, results in a school rating system that favors more advantaged schools.
Emphasis theirs.
The brief opens with a consideration of the correlation of SPP to poverty. I suggest you go look at the graph for yourself, but I will tell you that you don't need any statistics background at all to see the clear correlation between poverty and a lower SPP. And as we break down the elements of the SPP, it's easy to see why the correlation is there.
Indicators of Academic Achievement (40%)
Forty percent of the school's SPP comes from a proficiency rating (aka just plain straight-up test results) drawn from the tested subjects, third grade reading, and the SAT/ACT College Ready Benchmark. Whether we're talking third grade reading or high school Keystone exams, "performance declines as poverty increases."*
Out of 2,200 schools sampled, 187 had proficiency ratings higher than 90, and only seven of those had more than 50% economically disadvantaged enrollment. Five of those were Philly magnet schools.
Indicators of Academic Growth aka PVAAS (40%)
PVAAS is our version of a VAM rating, in which we compare actual student performance to the performance of imaginary students in an alternate neutral universe run through a magical formula that corrects for everything in the world except teacher influence. It is junk science.
RFA found that while the correlation with poverty was still there, when it came to the PSSAs (our elementary test) it was not quite as strong as the proficiency correlation. For the Keystones and the writing and science tests, however, the correlation with poverty is, well, robust. Strong. Undeniable. Among other things, this means that you can blunt the impact of Keystone test results by getting some PSSA test-takers under the same roof. Time to start that 5-9 middle school!!
Closing the Achievement Gap (10%)
This particular measure has a built-in penalty for low-achieving schools (aka high poverty schools-- see above). Basically, you've got six years to close half the proficiency gap between where you are and 100%. If you have 50% proficiency, you've got six years to hit 75%. If you have 60%, you have six years to hit 80%. The lower you are, the more students you must drag over the test score finish line.
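To see the penalty in actual numbers, here's a minimal sketch of that target arithmetic (my own illustration of the rule as described above, not anything from the brief):

```python
# A sketch of the gap-closing arithmetic: a school gets six years to
# close half the distance between its current proficiency rate and 100%.
def six_year_target(current_proficiency: float) -> float:
    """Proficiency rate a school must reach within six years."""
    return current_proficiency + (100 - current_proficiency) / 2

for current in (50, 60, 80):
    target = six_year_target(current)
    print(f"{current}% proficient now -> {target}% required "
          f"(a {target - current}-point climb)")
```

Run it and the penalty is plain: the 50% school has to gain 25 points, while the 80% school only needs 10.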
That last 10%, incidentally, covers items like graduation rate and attendance rate. Pennsylvania also gives you points for the number of students you can convince to buy the products and services of the College Board, including AP stuff and the PSAT. So kudos to the College Board people on superior product placement. Remember kids-- give your money to the College Board. It's the law!
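Add up the pieces and the 90% figure falls right out. Here's a back-of-the-envelope sketch of the weighting, simplified from the brief's breakdown (the state's actual formula has extra wrinkles like bonus points):

```python
# Rough SPP component weights as described above. Everything except
# the final "other" bucket is derived from standardized test scores.
components = {
    "academic achievement (proficiency)":      (0.40, True),
    "academic growth (PVAAS)":                 (0.40, True),
    "closing the achievement gap":             (0.10, True),
    "other (graduation, attendance, AP/PSAT)": (0.10, False),
}

test_share = sum(weight for weight, test_based in components.values()
                 if test_based)
print(f"Share of the SPP built on test scores: {test_share:.0%}")  # 90%
```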
Bottom line-- we have schools in PA being judged directly on test performance, and we have data once again clearly showing that the state could save a ton of money by simply issuing school ratings based on the income level of students.
For those who want to complain, "How dare you say those poor kids can't achieve," I'll add this. We aren't measuring whether poor kids can achieve, learn, accomplish great things, or grow up to be exemplary adults-- there is no disputing that they can do all those things. But we aren't measuring that. We are measuring how well they do on a crappy standardized test, and the fact that poverty correlates with results on that crappy test should be a screaming red siren that the crappy test is not measuring what people claim it measures.
*Correction: I had originally included a mistyping here that reversed the meaning of the study.
Tuesday, November 18, 2014
PA Cyber Charters Failing
Let me be clear up front-- I reject the use of standardized tests to measure the education of students, the effectiveness of teachers, and the quality of schools. The best predictor of student results on standardized tests remains their socio-economic class.
Oh, no, wait. In Pennsylvania, there is apparently one other factor that can predict how poorly a student will do on the Big Test.
Whether or not he attends a cyber charter.
Here's a chart that tells the story:
SPP stands for School Performance Profile, the data that the state keeps to judge the schools within the Commonwealth. The "quality threshold" is 70 or higher.
Research for Action did some number crunching for a report about charter performance in PA.
The validity of SPP as a measure of school quality is suspect due to its heavy reliance on test scores that are highly correlated with socioeconomic characteristics of students that are beyond the control of schools. Indeed, RFA’s analysis of the 2012-13 SPP scores revealed that SPP scores were heavily correlated with the percentage of economically disadvantaged students in a school building.
And yet, cyber charters still couldn't outperform the brick-and-mortar schools-- even those serving large low-income populations. And cyber charters' low-income population percentages are lower than even the brick-and-mortar charter numbers.
The RFA report notes that while cyber charters enroll low percentages of English Language Learners, their special education populations have grown significantly. Perhaps that is related to the fact that cyber charters get about $8K for regular students and $23K for special education students. Do you suppose that has affected their marketing and recruiting at all?
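Just for a rough sense of the incentive, using the approximate figures above (ballpark numbers, not official state rates):

```python
# Ballpark per-pupil payments quoted above -- approximations, not
# official Pennsylvania rates.
regular_rate = 8_000
special_ed_rate = 23_000

print(f"Premium per special ed student: ${special_ed_rate - regular_rate:,}")
print(f"That's roughly {special_ed_rate / regular_rate:.1f}x the regular rate.")
```

That works out to a $15,000 premium per special education enrollment, nearly three times the regular rate.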
Data is no longer available on turnover rates at cybers in PA, but the RFA report concludes that given cyber charters' consistently terrible performance by the state's standards, the state would be crazy to authorize any further cyber charter operations. As I said, I reject the SPP as a true measure of education quality, but if that's the rule we have to play by, then cyber charters have earned a severe benching.
The report is a good one, with more charts for those who like such things. Follow the link and read the whole thing.