It will be a quickie this week-- I have both of my children home and a grandson's birthday party to attend!
Eva Moskowitz Cannot Help Herself
Daniel Katz provides one of the best overviews of Moskowitz's ongoing meltdown. A study in how privilege, money and power can make you blind to how your behavior is playing in the real world.
How Twisted Early Childhood Education Has Become
Early childhood ed has arguably been more badly damaged by reformsters than any other segment of the education biz. Sometimes it helps to have someone take a step back, show how far off track we have gotten, and help you realize you're not crazy for thinking we're getting early childhood ed completely wrong at this point.
Competency Based Ed: The Culmination of the Common Core Agenda
A good collection of the many pieces and points of view springing up as CBE becomes the newest topic of the education debates.
Five Perspectives on Student Fragility
At Psychology Today, Peter Gray has been running a series about the increasingly fragile nature of our students, including theories about the source. This latest installment is interesting because it includes the many, many reactions from various stakeholders in that discussion.
Are You Being Served?
Nobody combines humor and actual journalism better than Jennifer Berkshire at Edushyster. Here's a look at the facts of which students Boston charter schools are actually serving.
Saturday, November 28, 2015
Remember Outcome Based Education?
Because of massive technological, economic and social changes, we are challenged to boost standards of student performance substantially, especially among those who in the past were least successful. The educational sector apparently will not have more money, so we cannot expect salaries to be more attractive or other resources more plentiful. The alternative, say thoughtful observers, is to restructure.
That quote comes from Ron Brandt, the Executive Editor for the Association for Supervision and Curriculum Development. In 1994. It comes from Brandt's introduction to the book Outcome Based Education: Critical Issues and Answers, by William Spady. More about him in a moment.
If you are a teacher of a Certain Age, you remember Outcome Based Education. OBE started popping up in the US in the early 90s. One of its features was a certain vagueness (Brandt wrote in an ASCD overview that "OBE is more of a philosophy than a uniform set of practices"). But now that Competency Based Education is auditioning for Educational Thing Du Jour, pulling out the OBE notebook seems apropos.
OBE attracted my attention when it first appeared because it sounded suspiciously like Management By Objectives, a management technique developed by Peter Drucker. Watching old insights from MBO appear in OBE first led to my theory that when management consultants have finally saturated the business market, they go through their materials, cross out the biz buzz words, and pencil in education jargon and voila!-- they are back in business.
But if OBE was transmogrified from the business world, so what? Was it any good?
The central philosophical shift was to move from time-based schooling to objective-based. In other words, the traditional constant in school is time, and the variable is learning. We only have 180 days-- how much can we get done in those days? OBE said, "Let's list what learning objectives we want the students to achieve, and time will be the variable."
The self-proclaimed father of OBE is the above-mentioned Bill Spady, a sociologist who started pioneering OBE in the mid-eighties. He became the director of the International Center on Outcome-Based Restructuring, and continues to work in education today. If you really want to know all about Spady, a John Anthony Hader wrote his dissertation about Spady and his work.
Spady was notoriously unwilling to give exact instructions for setting up OBE, insisting that objectives had to be locally developed. But he did lay down some guiding principles, some of which are listed here by his colleague Brandt.
* Clarity of focus. Your outcome has to be focused and specific.
* Design down, deliver up. Work backwards from your objective to design programs, but work toward the objective from wherever the students are.
* High expectations. Specifically (if we heard this once, we heard it a million times) believe that all students can learn all.
* Expanded opportunities. Provide students many chances and many ways to show they have achieved the objective.
Additionally, OBE acquired various corollaries, implications, and add-ons. If we were going to insist that all students can learn all, then we had better settle on objectives that all students can learn (let the dumbing down begin). For some reason, cooperative learning became closely tied to OBE in many regions. And the prospect of wreaking havoc with the school year-- headaches! If Chris can meet all objectives by Christmas, can Chris then go home? Or does Chris just start the next "grade"? And what if Chris is still not getting it in July-- does Chris's school year continue until the last objective is met? Logistically, how does that even work? And how do you write a teacher contract that says, "Depending on how well you do, you are hired for something between 100 and 300 days"? Or do you just pay teachers for piecework ($100 for every student objective met)?
Objectives themselves were problematic. This was the dawn of TSWBAT (the student will be able to...) which meant that every single objective had to be paired with some observable student behavior. This has eternally been an educational challenge (did Chris learn to understand the Iliad, or did Chris figure out how to act like Chris understands). But OBE threw its weight on the side of observable behavior, encouraging teachers to require student performance rather than teacher inquiry to assess.
OBE caught on big time, until-- and I say this with both pride and shame-- Pennsylvania broke it.
Pennsylvania was poised to weave OBE into the warp and woof of state education regulation. Many of us went to professional development sessions to prepare us for the Big Shift. But instead, this time, shift never happened.
Some of it was not Pennsylvania's fault. The OBE fans had missed one of the implications of their own work, which was that the objectives would need to be clearly measurable. Instead, various versions of OBE were peppered with what we now call non-cognitive objectives. And not just non-cognitive, but politically charged as well. Here are some contributions to the genre:
All students understand and appreciate their worth as unique and capable individuals, and exhibit self-esteem.
All students apply the fundamentals of consumer behavior to managing available resources to provide for personal and family needs.
All students make environmentally sound decisions in their personal and civic lives.
OBE programs had a variety of objectives like these, and conservatives freaked. Rush Limbaugh, Bill Bennett, Pat Robertson and most especially Phyllis Schlafly were sure that OBE was here to socially engineer your child into some bleeding heart gay-loving liberal twinkie.
OBE was also vulnerable because there wasn't a lick of evidence or research to indicate that it actually worked. And because it was focused on locally-selected objectives that could be met in a variety of ways, there wasn't even any way to tell if it was working at all.
Opponents were also taken aback by the electronic portfolio. OBE demanded a portfolio system in which the many and varied objective-meeting projects of students could be gathered, but then some computer-enamored mook decided that an electronic portfolio, that could be stored in perpetuity and could follow the students anywhere-- that would be cool! Is any of this starting to sound vaguely familiar?
And because Spady and his brethren refused to give specific instructions, OBE looked like a thousand different things, some of which seemed directly contradictory. In Pennsylvania, the initial version of OBE state education regs included roughly 550 objectives. According to Hader's oral history, Spady told them they were about 540 off; the education department rapidly backpedaled while begging Spady to come write the objectives for them. Then Peg Luksik activated her formidable army of conservatives to attack OBE, and the whole business started to collapse. Pennsylvania broke OBE, and it never quite recovered.
When I started to hear about Performance/Competency Based Education, I initially thought that it would be the reheated leftovers of OBE. I cringed, because I remember the training and the insistence that all students can learn everything and the crazy barrage of ever-shifting state directives. Pennsylvania's OBE initiative came at the end of my first decade in the classroom, and it marked the point at which I suddenly realized that the policy leaders and educational bureaucrats on the state level might not know what the hell they were talking about. But I also remembered OBE's complete and utter collapse and thought, "Well, this will die quickly."
But CBE turns out to be a different sort of OBE, an OBE with its holes plugged by sweet, sweet technology and its foundation shored up with Common Core college and career ready standards. Where OBE was all loosey goosey with whatever standards and objectives the locals wanted, CBE will hand you a list of standards/objectives already in place-- and some vendors will throw in the assessments and performance tasks, the software to measure them, the recording of the results, and the use of those results to decide which pre-packaged lesson your student should do next.
Technology also aids in the variable-time logistics problem. Now, instead of puzzling over whether behind-on-objective Chris must stay in school through July, we can just get Chris to use internet connections to make the school day fourteen hours long. Of course, we still have the puzzle of what to do if Chris completes an entire grade level's worth of objectives over the weekend.
Technology also ups the ante on that electronic portfolio, the data backpack that will follow your student throughout life. Of course, in some schools that currently means that a teacher's primary function is endless data entry. But since the performance tasks are on the computer, the teacher will be spending far less time teaching anyway.
Most of all, technology underlines the classic problem with OBE-- the notion that education is just learning to perform a series of designated tasks, like a team on the Amazing Race. Education is just working your way down a checklist, and once everything on the list is checked off-- congratulations! You're an educated person! That's all there is to it! Of course, that also takes us back to the problem that killed OBE the last time-- exactly who gets to decide which tasks go on that checklist?
As I've said, I have my doubts about CBE's chance to take over the education world. Its resemblance to OBE doesn't improve my estimation of its odds.
Friday, November 27, 2015
Can Competency Based Education Be Stopped?
Over at StopCommonCoreNYS, you can find the most up-to-date cataloging of the analysis of, reaction to, and outcry over Competency Based Education.
Critics are correct in saying that CBE has been coming down the pike for a while. Pearson released an 88-page opus about the Assessment Renaissance almost a year ago (you can read much about it starting here). Critics noted way back in March of 2014 (okay, I'm the one who noted it) that Common Core standards could be better understood as data tags. And Knewton, Pearson's data-collecting wing, was explaining how it would all work back in 2012.
Every single thing a student does would be recorded, cataloged, tagged, bagged, and tossed into the bowels of the data mine, where computers will crunch data and spit out a "personalized" version of their pre-built educational program.
Right now seems like the opportune moment for selling this program, because it can be marketed as an alternative to the Big Standardized Tests which have been crushed near to death under the wheel of public opinion. "We'll stop giving your children these stupid tests," the reformsters declare. "Just let us monitor every single thing they do every day of the year."
It's not that I don't think CBE is a terrible idea-- I do. And it's not that I don't have a healthy respect for and fear of this next wave of reformy nonsense. But I can't shake the feeling that while reformsters think they have come up with the next generation iPhone, they're actually trying to sell us a quadrophonic laser disc player.
From a sales perspective, CBE has several huge problems.
Been There, Done That
Teaching machines first cropped up in the twenties, running multiple choice questions and BF Skinner-flavored drill. Ever since, the teaching machine concept has kept popping up with regularity, using whatever technology was at hand to enact the notion that students can be programmed to be educated just like a rat can be programmed to run a maze.
Remember when teaching machines caught on and swept the nation because they provided educational results that parents and students loved? Yeah, neither does anybody else, because it never happened. The teaching machine concept has been tried, each time accompanied with a chorus of technocrats saying, "Well, the last time we couldn't collect and act on enough data, but now we've solved that problem."
Well, that was never the problem. The problem is that students aren't lab rats and education isn't about learning to run a maze. The most recent iteration of this sad, cramped view of humans and education was the Rocketship Academy chain, a school system built on strapping students to screens that would collect data and deliver personalized learning. They were going to change the whole educational world. And then they didn't.
Point is, we've been trying variations on this scheme for almost 100 years, and it has never caught on. It has never won broad support. It has never been a hit.
Uncle Sam's Big Fat Brotherly Hands
Remember how inBloom had to throw up its hands in defeat because the parents of New York State would not stand for the extensive, unsecured and uncontrolled data mining of their children? inBloom tried to swear that the kind of data mining and privacy violation and unmonitored data sharing that parents feared just wouldn't happen on their watch. But the CBE sales pitch doesn't just refuse to protect students against extensively collected and widely shared data mining-- CBE claims the data grubbing is not only not a danger, but is actually a valued feature of the program.
The people who thought inBloom was a violation of privacy and the people that thought Common Core was a gross federal overreach-- those people haven't suddenly disappeared. Not only that, but when those earlier assaults on education happened people were uneducated and unorganized-- they didn't yet fully grasp what was actually happening and they didn't have any organizations or other aggrieved folks to reach out to. Now all the networks and homework are already done and in place.
I don't envision folks watching CBE's big data-grabbing minions coming to town and greeting them as liberators. CBE is more of what many many many people already oppose.
No Successes To Speak Of
This has always been a problem for reformsters. "Give me that straw," they say, "and I will spin it into gold." They've had a chance to prove themselves with every combination of programs they could ask for, and they have no successes to point to. Remember all those cool things Common Core would accomplish? Or the magic of national standardized testing? The only people who have done a respectable job of touting success are the charteristas-- and that's not because they've actually been successful, but because they've mustered enough anecdotes and data points to cobble together effective marketing. It's lies, but it's effective.
Everything else? Bupkus. This will be no different. CBE will be piloted somewhere, and it will fail. It will fail because its foundation combines ignorance of what education is, how education works, and how human beings work.
Anchored to What?
A CBE system needs to be linked to some sort of national standards, but only those who have been very well paid or have a deep commitment to them are still even speaking the name of Common Core. To bag and tag a nation's worth of data, you must have common tags. But we've already allowed states to drift off into their own definitions of success, their own tests, their own benchmarks. Saying, "Hey, let's all get on the same page" is not quite as compelling as it once was, because we've tried it and it sucked. As the probable successor to ESEA says, centralized standardization of education is not a winning stance these days. So to what will the CBE be anchored?
Expensive As Hell
Remember how expensive it was to buy all new books and get enough computers so that every kid could take a BS Test? You can bet that taxpayers do. Those would be the same taxpayers who saw programs and teachers cut from their schools even as there was money, somehow, for expensive but unnecessary new texts and computers (which in some cases could be used only for testing).
When policy makers announce, "Yeah, here's all the stuff you need to buy in order to get with the CBE program," taxpayers are going to have words to say, and they won't be happy, sweet words.
If every single worksheet, test, daily assessment, check for understanding, etc. is going to go through the computer, that means tons of data entry OR tons of materials on the computers, through the network, etc etc etc. The kind of IT system required by a CBE system would be daunting to many network IT guys in the private sector (all of whom are getting paid way more than a school district's IT department). It will be time-consuming, buggy, and consequently costly.
Who wants to be the superintendent who has to say, "We're cutting more music and language programs because we need the money to make sure that every piece of work your child does is recorded in a central database"? Not I.
Program Fatigue
For the first time, the general taxpaying public may really get what teachers are feeling when they roll their eyes and say, "A NEW program? Even though we haven't really finished setting up the old one?!"
Bottom Line
I think that CBE is bad education and it needs to be opposed at every turn. But I also think that reformsters are severely miscalculating just how hard a sell it's going to be. We can help make it difficult by educating the public.
There will be problems. In particular, CBE will be a windfall for the charter industry if they play their cards right. The new administration will play a role in marketing this, and I see no reason to imagine that any of the candidates won't help if they win. (Well, Sanders might stand up to the corporate grabbiness of it, and Trump will just blow up all the schools.)
But there will be huge challenges for the folks who want to sell us this Grade C War Surplus Baloney. It's more of a product that nobody wanted in the first place. We just have to keep reminding them why they didn't like it.
Is the Teacher Shortage Real?
We talk a lot about the current teacher shortage. I've posted about it numerous times. But the question remains-- is there really a teacher shortage?
A study released this month by the National Center for Education Statistics suggests that everything we think we know about the Great Teacher Shortage is wrong. Or at least, it was wrong as of four years ago. The study is pretty straightforward, and it's worth making a note of.
The writers are Nat Malkus of the American Institutes for Research with Kathleen Mulvaney Hoyer and Dinah Sparks of Activate Research, Inc. AIR is also in the test manufacturing biz (SBA is their baby) and Activate is a "woman-owned small business" in the metro DC area focusing on research and policy. They created the report under the aegis of NCES, an arm of USED's Institute of Education Sciences, so while none of these are without blemish, this is not another Gates-funded fake research project.
The report looks at four samples of data from the 1999-2000, 2003-2004, 2007-2008 and 2011-2012 school years, and it looks for answers to fairly straightforward questions:
1) What percentage of schools reported teaching vacancies or hard-to-fill spots?
2) What percentage of schools found these positions related to particular subject areas?
3) Did persistent hard-to-fill spots correlate to any school characteristics?
The report is easy to read through and contains lots of charts, but the answers reached by the researchers are not necessarily what we might expect. Let me just hit the highlights.
The percentage of schools reporting vacancies and hard-to-fill spots in 2011-2012 was down from 1999-2000. In fact it's the lowest of the four years.
Those percentages were also uniformly down for all subject areas, including math and special ed.
High-minority schools still experience more staff challenges than low-minority schools, but 2011-2012 was still dramatically lower.
Title I schools have a harder time than non-Title I schools, but 2011-2012 was still better than all other years (I'm just going to write "pickle" every time this is the case, to save myself some typing.)
Large schools have it harder than small schools, but pickle.
When comparing city, suburban, town and rural schools, the most staff-challenged schools have shifted over the four years. Cities used to lead the challenge, with rural schools having the lowest percentage of staff-challenged schools. In 2011-2012, suburban schools reported the fewest problems. Cities still have the most, but in all four categories, pickle. Big pickle.
I don't know what explains the pickle, and to their credit, the report's writers take a stance of, "We're just here to show you the numbers, not to make wild-ass guesses about why the numbers are what they are." The appendices give some number breakdowns and report on methodology, and while I am no trained stats cruncher, I don't see anything that sets off whopping alarms.
So am I thinking that I'll just stand down because the teacher shortage turns out to be all in my head? No. No, I'm not.
First of all, I have a certain amount of trust in my head, so I don't just throw away my head's ideas willy-nilly. I am, however, open to the notion that the teacher shortage is partly an artifact of the media's tendency to focus on a story thread and magnify it (e.g. the great shark summer of 2001).
In Pennsylvania, I know exactly why the numbers would reflect a not-shortage of teachers-- we've been shedding jobs left and right, dropping 2000-5000 teacher jobs (depending on who's counting) every year for several years. This is doing a great job of setting the stage for a teacher shortage, as college students repeatedly declare a major in Anything But Teaching. The ABT major is actually leading some college ed departments to shrink or collapse. The choking off of the teacher pipeline sets the stage for a combination of overcrowded classrooms and an actual teacher shortage.
My reading of teacher shortage bulletins is that teacher shortages are highly localized, and while the study's sampling of around 8,000 districts would ordinarily be plenty, I have to believe that the specific samples could make a huge difference.
But mostly what these results say to me is, "Holy smokes! We have plunged into a bad place very quickly over the last four years!"
Take for instance Scott Walker's Wisconsin. Here's a piece that lists the growing effects of Walker's gutting of the state's education system-- from November of 2011. In other words, the most recent data sampling in the study was being gathered just as Wisconsin schools were starting to feel the crunch. Quick quiz: have things gotten better or worse in Wisconsin since 2011?
Or North Carolina, another state that moved rapidly from a progressive education-supporting agenda to a state intent on driving teachers out.
Over the past four years, things have gotten far worse pretty quickly in schools across the country, from Race to the Top to Common Core testing. And in 2011, schools were seeing the last of the federal stimulus money that allowed them to keep hiring. When the stimulus money ran out, many districts started cutting staff to match.
Take a look at this snip from the fed's chart on teacher employment. The first column is total teachers, column two is public, and column three is private (numbers are in thousands of teachers). 2011 is the last year for which we have hard numbers. Note that teacher employment peaked in 2008, and we've been declining since. Nothing like cutting 100,000 jobs to help reduce the number of vacancies you're trying to fill. Put another way, the study shows that 1999 was the worst in terms of unfilled jobs, but as we added more teachers, the vacancy percentages dropped. But then the last drop coincides with a drop in number of jobs to be filled. There are two ways to solve an unfilled vacancy problem, and we have now tried both. Which approach do you think is more likely to fix things in the long run?
I'm saving a link to this study, because I believe it sets the stage for what's to come. I expect that when the next data set is added from further inside the reformy abyss, we'll see charts with upward hooks. I believe that the story will be, "Well, things were getting better, but then ed reform switched into overdrive, and it all went to hell pretty quickly." In short, nothing in this report contradicts the perception that a troublesome teacher shortage has appeared in the last four years.
I get that the Teacher Shortage is a complicated issue, for reasons including the desire of everybody on every side of the education debates to use talk of the shortage to support whatever point they'd like to make. But this new report definitely doesn't make me think that everything's actually okay, and I look forward to seeing more data when it finally appears.
Accelerated Reader Research Part 2
A little while ago I took a look at this silly piece of faux research from the Accelerated Reader people. But there was one puzzle I couldn't quite solve.
The study was reported as concluding that just a few minutes more reading time would produce fabulous results, but I wondered exactly how the researchers knew how much time the readers had spent on their independent reading.
Much ado is made in the report about the amount of time a student spends on independent reading, but I cannot find anything to indicate how they are arriving at these numbers. How exactly do they know that Chris read fifteen minutes every day but Pat read thirty? There are only a few possible answers, and they all raise huge questions.
In Jill Barshay's Hechinger piece, the phrase "an average of 19 minutes a day on the software" crops up. But surely the independent reading time isn't based on time on the computer-- not when so much independent reading occurs elsewhere.
The student's reading minutes could be self-reported, or parent-reported. But how can we possibly trust those numbers? How many parents or children would accurately report, "Chris hasn't read a single minute all week"?
Or those numbers could be based on independent reading time as scheduled by the teacher in the classroom, in which case we're really talking about how a student reads (or doesn't) in a very specific environment that is neither chosen nor controlled by the student. Can we really assume that Chris reading in his comfy chair at home is the same as Chris reading in an uncomfortable school chair next to the window?
Nor is there any way that any of these techniques would consider the quality of reading-- intensely engaged with the text versus staring in the general direction of the page versus skimming quickly for basic facts likely to be on a multiple choice quiz about the text.
The only other possibility I can think of is some sort of implanted electrodes that monitor Chris's brain-level reading activity, and I'm pretty sure we're not there yet. Which means that anybody who wants to tell me that Chris spent nineteen minutes reading (not twenty, and not eighteen) is being ridiculous.
When I was working on the piece, I tweeted at the AR folks to see if they could enlighten me. I didn't get an immediate response, which is not significant, because it's twitter, not registered mail. But I did hear back from them a bit later, and they directed me to this explanation from one of their publications about AR (it's on page 36).
The Diagnostic Report also shows a calculation called engaged time. This represents the number of minutes per day a student was actively engaged in reading. To calculate this number, we look at the student’s GE score on STAR Reading and how many points the student has earned by taking AR quizzes. We compare that to the number of points we can expect the student to earn per minute of reading practice. Then we convert the student’s earned points to minutes.
For example, let’s say Joe Brown has a GE score of 6.5. Our research tells us that a student of his ability can earn 14 points by reading 30 minutes a day for six weeks. Joe has earned only seven points. Thus we estimate Joe’s engaged time to be only 15 minutes a day.
If a student’s engaged time is significantly lower than the amount of time you schedule for reading practice, investigate why. It could be that classroom routines are inefficient or books may be hard to access. Since low engaged time is tied to a low number of points earned, see the previous page for additional causes and remedies.
So, not any of the things I guessed. Something even worse.
They take the child's score on their proprietary reading skill test, they look at how many points the child has earned on quizzes, and they consult their own best guess at how long a student with that score would take to earn that many points-- and that's how much time the child must have spent reading!
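To make the arithmetic concrete, here's a minimal sketch of that back-calculation as I understand it from the passage quoted above. This is my own illustration, not Renaissance's actual code; the only numbers in it are Joe Brown's from their own example.

```python
# Sketch of AR's "engaged time" estimate as described in the quoted passage.
# My reconstruction for illustration only -- not Renaissance's actual code.

def estimated_engaged_minutes(points_earned, expected_points, scheduled_minutes_per_day=30):
    # AR's norms say a student at a given GE score "should" earn
    # expected_points by reading scheduled_minutes_per_day over the period.
    # "Engaged time" is just the earned points scaled back into minutes.
    return scheduled_minutes_per_day * (points_earned / expected_points)

# Joe Brown, GE 6.5: the norms expect 14 points for 30 minutes a day over
# six weeks. He earned 7 points, so AR reports 15 minutes a day of
# "engaged time" -- regardless of how long he actually sat with a book.
print(estimated_engaged_minutes(points_earned=7, expected_points=14))  # 15.0
```

Notice that an actual clock appears nowhere in that calculation; the minutes are manufactured entirely from quiz points and the norming table.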
What if something doesn't match up? What if the AR reverse-engineered time calculation says that Chris must have taken thirty minutes of reading to get that score, but you gave Chris an hour to read? Well then-- the problem is in your classroom. Chris is lollygagging or piddly-widdling. Or the books are on too high a shelf and it took Chris a half hour to get it. Whatever. The problem is not that AR's calculations are wrong.
And of course this doesn't so much answer the question as push it up the line. Exactly what research tells you that a student with STAR rating X must use fifteen minutes of reading to achieve Y number of points on the AR quiz?
My confidence in the Accelerated Reader program is not growing, and my confidence in their research skills, procedures, and conclusions is rapidly shrinking.
The study was reported as concluding that just a few minutes more reading time would produce fabulous results, but I wondered exactly how the researchers knew how much time the readers had spent on their independent reading.
Much ado is made in the report about the amount of time a student spends on independent reading, but I cannot find anything to indicate how they are arriving at these numbers. How exactly do they know that Chris read fifteen minutes every day but Pat read thirty. There are only a few possible answers, and they all raise huge questions.
In Jill Barshaw's Hechinger piece, the phrase "an average of 19 minutes a day on the software"crops up. But surely the independent reading time isn't based on time on the computer-- not when so much independent reading occurs elsewhere.
The student's minutes reading could be self-reported, or parent-reported. But how can we possibly trust those numbers? How many parents or children would accurately report, "Chris hasn't read a single minute all week."
Or those numbers could be based on independent reading time as scheduled by the teacher in the classroom, in which case we're really talking about how a student reads (or doesn't) in a very specific environment that is neither chosen nor controlled by the student. Can we really assume that Chris reading in his comfy chair at home is the same as Chris reading in an uncomfortable school chair next to the window?
Nor is there any way that any of these techniques would consider the quality of reading-- intensely engaged with the text versus staring in the general direction of the page versus skimming quickly for basic facts likely to be on a multiple choice quiz about the text.
The only other possibility I can think of is some sort of implanted electrodes that monitor Chris's brain-level reading activity, and I'm pretty sure we're not there yet. Which means that anybody who wants to tell me that Chris spent nineteen minutes reading (not twenty, and not eighteen) is being ridiculous.
When I was working on the piece, I tweeted at the AR folks to see if they could illuminate me. I didn't get an immediate response, which is not significant, because it's twitter, not registered mail. But I did hear back from them a bit later, and they directed me to this explanation from one of their publications about AR (it's page 36).
The Diagnostic Report also shows a calculation called engaged time. This represents the number of minutes per day a student was actively engaged in reading. To calculate this number, we look at the student’s GE score on STAR Reading and how many points the student has earned by taking AR quizzes. We compare that to the number of points we can expect the student to earn per minute of reading practice. Then we convert the student’s earned points to minutes.
For example, let’s say Joe Brown has a GE score of 6.5. Our research tells us that a student of his ability can earn 14 points by reading 30 minutes a day for six weeks. Joe has earned only seven points. Thus we estimate Joe’s engaged time to be only 15 minutes a day.
If a student’s engaged time is significantly lower than the amount of time you schedule for reading practice, investigate why. It could be that classroom routines are inefficient or books may be hard to access. Since low engaged time is tied to a low number of points earned, see the previous page for additional causes and remedies.
So, not any of the things I guessed. Something even worse.
They take the child's score on their proprietary reading skill test, they look at how many points the child scored, and they consult their own best guess at how long a student with that score would take to earn that many points-- and that's how much time the child must have spent reading!
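To make that arithmetic concrete, here is a minimal sketch of the proportion AR's own example implies. The function name and structure are my own illustration, assuming the calculation really is the simple ratio described above; AR's actual expected-points tables and formula are proprietary.

def estimated_engaged_minutes(points_earned, expected_points, scheduled_minutes):
    # Back out "engaged time" the way the AR explanation describes: scale the
    # scheduled reading time by the ratio of points actually earned to the
    # points a student at that STAR level is expected to earn.
    # (Hypothetical sketch -- note that no clock is consulted anywhere.)
    return scheduled_minutes * points_earned / expected_points

# The example from AR's own publication: a student with a GE score of 6.5 is
# expected to earn 14 points by reading 30 minutes a day for six weeks;
# Joe earned only 7, so his "engaged time" is reported as 15 minutes a day.
print(estimated_engaged_minutes(points_earned=7, expected_points=14,
                                scheduled_minutes=30))  # -> 15.0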
What if something doesn't match up? What if the AR reverse-engineered time calculation says that Chris must have taken thirty minutes of reading to get that score, but you gave Chris an hour to read? Well then-- the problem is in your classroom. Chris is lollygagging or piddly-widdling. Or the books are on too high a shelf and it took Chris a half hour to get it. Whatever. The problem is not that AR's calculations are wrong.
And of course this doesn't so much answer the question as push it up the line. Exactly what research tells you that a student with STAR rating X must use fifteen minutes of reading to achieve Y number of points on the AR quiz?
My confidence in the Accelerated Reader program is not growing, and my confidence in their research skills, procedures, or conclusions is rapidly shrinking.
Thursday, November 26, 2015
Thankfulness
Thanksgiving is, well, problematic as a holiday. At least as a celebration of anything historic, because the related history is complicated, and if there's anything Americans hate, it's sorting our way through complicated history. We like our history sorted out into nice clear good guys and bad guys; unfortunately, actual human beings are rarely all good or all bad.
So my preference when celebrating Thanksgiving is to chuck the historical element completely, and embrace the holiday's best element, which is taking a day to be thankful.
There is being thankful for the most immediate circumstances of our lives. Last night, my daughter and her husband and her son and my son and his fiance all slept here in this house, which is the first time that has happened in years (actually, if you count my one-year-old grandson, it's the first time in ever). To have family all here, to make them breakfast-- that's all a huge blessing, and I am grateful for it.
I'm thankful that I get to continue working at one of the greatest jobs in the world. I'm thankful that my work situation is so much less contentious and difficult than the situations of so many teachers throughout the country. I'm thankful that I live in a small town next to a river and only a few blocks away from family and the center of town. It really is a beautiful place, and I'm thankful that I have had the opportunity to participate in so many other enriching activities like playing in a band, working in community theater, and writing for the local paper.
Today many Americans are expressing similar sentiments, and that's a good start. But we Americans are not always good with the whole thankfulness thing, and when we're not careful, the day's expressions come out as some version of, "I am grateful that life/God/fate has provided me with all the benefits that I so richly deserve and have rightly earned." And that's not thankfulness; that's just self-congratulatory smugness.
It's not that our hard work and our efforts and our talents and the occasions on which we follow our higher virtues don't all have something to do with our successes. They do. It's important to Make Good Choices. No question about that.
But if hard work and smarts were all it took to become wealthy and successful, there would be millions of wealthy and successful people on every continent, and there aren't.
My success, such as it is, can be partly attributed to a handful of not-awful qualities that I occasionally managed to bring to bear on my situation. But my success is also the result of other factors. I'm successful because I was born in this country and not some other one with fewer resources and less stability. I'm successful because I was born into a family that could build a platform for me to stand on and build my own success upon. I'm successful because I've never been struck by cancer, never caught by a random act of nature, never hit by an out-of-control delivery truck or a piece of Skylab. I'm successful because none of the spectacularly bad choices I've made in my life resulted in setbacks from which I could not recover.
In short, my success (such as it is) is not just the result of my own brains and hard work. It is also the result of fortunate elements over which I had no control and advantages that came to me without me doing a thing.
This does not mean I'm "really" a failure or that I actually suck.
What it means is that I have much to be thankful for, and the proper response to that kind of thankfulness is a sense of gratitude for what has come to me that I did not "earn," but from which I benefit. And there's no way to feel actual gratitude and express that gratitude as, "Well, I've got mine, Jack. So screw the rest of you." Because that gratitude has to live right next door to a sense of indebtedness.
I owe the universe or God (take your pick, suit yourself-- neither the presence nor absence of religion changes my feelings on this) a huge debt. I owe individual human beings as well. I owe my parents for helping me stand up and get out into the world. I owe the teachers like Ed and Mike and Janet a huge debt for awakening certain understandings in me. I owe guys like Ed and John for showing me how to get things accomplished in this world. I owe folks like Diane and Anthony and Nancy and Susan and a really, really long list of people who pushed my work on this blog out into the world. I owe the country that provided me with the stable world in which to find my business, and I owe the entire institution of public education for providing both a foundation for growth and the chance to pursue my line of work.
And that's before we get to the list of things I owe a debt for because, by my choices, I made the world a little bit worse. The times I hurt somebody in ways that can't easily be made right.
I owe for those things (and others) because none of them are things that I made happen myself.
So for me, it's never enough on this day to just sit back and say, "Boy, I am grateful that my life is awesome." It's a day to do accounting, to ask, "What can I do to pay down this debt?"
If I'm sitting at a table with a dozen other hungry people and out of nowhere, a waiter brings me plates piled high with food, food that I didn't order, food that I didn't pay for, it's not enough for me to look around the table at those other hungry faces and say, "Boy, I sure feel thankful for this." And it's not okay to just avoid the awkwardness by not meeting their eyes at all.
We owe the world. We owe our friends and family. We owe people who came before us whom we have never met. We owe the God who blessed us or the world that dropped good fortune on us. We owe the people we have taken things from, even if we didn't mean to, even if we didn't know.
So that's where I am on Thanksgiving. Not just grateful, though I surely am that, but also mindful that I owe it to the people around me, to my students, to my family, to my God-- to all of them, I owe an effort to be a better man, a better person, a better user of whatever small powers and talents that life has put in my hands. I'm not a huge person, an important person, but I'm still a person, and I still owe it to the universe to make the best of what I've been given, because I've been given far more than I have any right to expect. I don't necessarily control what is given to me, but I surely control what I choose to do with it.
So Happy Thanksgiving to you. Enjoy the day with family, and use it well.
Wednesday, November 25, 2015
PA: Testing Good News & Bad News
This week the Pennsylvania House of Representatives voted to postpone the use of the Keystone Exam (Pennsylvania's version of the Big Standardized Test required by the feds) as a graduation requirement. The plan had been to make the Class of 2017 pass the reading, math and biology exams in order to get a diploma. The House bill pushes that back to 2019.
The House measure joins a similar Senate bill passed last summer. The only significant difference between the bills is that the House bill adds a requirement to search for some tool more useful than the Keystones. The bills should be easy to fit together, and the governor is said to support the two-year pause, so the postponement is likely to become law. And that is both good news and bad news.
Good News
The Keystone is a lousy test. It is so lousy that, as I was reminded in my recent Keystone Test Giver Training Slideshow, all Pennsylvania teachers are forbidden to see it, to look at it, to lay eyes on it, and, if we do somehow end up seeing any of the items, we are sworn to secrecy about it. But because I am a wild and crazy rebel, I have looked at the Keystone exam as well as the practice items released by the state, and in my professional opinion, it's a lousy test.
So it's a blessing that two more rounds of students will not have to pass the tests in order to graduate-- particularly as the feds bear down on their insistence that students with special needs be required to take the same test as everyone else, with no adaptations or modifications. The year the Keystones are made a graduation requirement is the year that many Pennsylvania students will fail to graduate, even though they have met all other requirements set by their school board and local district.
That will not be a good year.
Bad News
The tests will still be given, and they will still be used for other purposes. Those purposes include evaluating teachers, and evaluating schools.
Pennsylvania's School Performance Profile looks like it is based on a variety of measures (some of which are shaky enough-- PA schools get "points" for buying more of the College Board's AP products), but at least one research group has demonstrated that 90% of the SPP is based on test scores (one huge chunk for a VAMmy growth score and another for the level of the actual score).
So we will continue to have schools and teachers evaluated based on the results of a frustrating and senseless test that students know they have absolutely no stake in, and which they know serves no purpose except to evaluate teachers and schools. Get in there and do your very best, students!
Bonus Round
Of course, some districts tried to deal with that issue of getting student skin in the game by also phasing in a pass-the-test requirement as a local thing. So now a whole bunch of students who have been hearing that they'll have to pass the Keystones to graduate-- they'll be hearing that the state took that requirement away, except then someone will have to tell them that their local district did NOT take the requirement away. This should open up some swell discussion.
So How Do We Feel?
State Board of Education Chairman Larry Wittig (a CPA who was appointed to the board by Tom Corbett) is sad, because he thinks the whole testing program is awesome and well-designed. Wittig's reaction is itself a mixed bag. On the one hand, he thinks that the testing system is "well-crafted" and beneficial to students, which is just silly, because the test is neither of those things. On the other hand, he also said this:
If I'm a teacher and in part my evaluation is based on the result of these tests and now the tests are meaningless, I'm going to have a problem with that.
And, well, yes. "No stakes for students, high stakes for schools and teachers" kind of sucks as an approach.
And really, is there anyone in Harrisburg who wants to articulate the reasoning behind, "We don't have faith in this test's ability to fairly measure student achievement, but we do have faith in its ability to measure teacher achievement"? No? There doesn't seem to be anybody trying to explain the inherent self-contradiction in this position.
Perhaps the House-sponsored search for a Better Tool will yield fabulous results. But in the meantime, we've already signed Data Recognition Corporation, Inc., to a five-to-eight-year contract to keep producing the Keystone, even if we don't know what we want to use the tests for.
The good part of this news is undeniable. Two more years of students who will not have to clear a pointless, poorly-constructed hurdle before they can get their diplomas. That's a win for those students.
But to postpone the test rather than obliterate it, to keep the test in place to club teachers and schools over the head, to signal that you don't really have an idea or a plan or a clue about what the test is for and why we're giving it-- those are all big losses for teachers, for education, and for all the students who have more than two years left in the system.