Monday, November 30, 2015

CBE & The Data Bottleneck

Can you tell I've been doing a lot of reading about Competency Based Education lately?

While some proponents like to point to more human-friendly versions of CBE such as a personalized district in rural Alaska, the more common picture of CBE is of a huge data-mining monstrosity. And while CBE has been rolling steadily at us under various aliases for a few decades now, it is computer technology that has made it look both achievable and profitable.

In fact, CBE on the ground really does look like one more variation of the old and failed teaching machines, an attempt to convert the entire ed system to the failed model of Rocketship Academy or the very failed model of on-line schooling.

You don't have to dig very far for hints about where CBE is headed. One of the flagship groups leading the charge is iNACOL -- which stands for "International Association for K-12 Online Learning." You can find their logo on works like the report presented by CompetencyWorks (what the hell is it with these guys and smooshedtogether group names?) entitled Re-Engineering Information Technology, a report all about how to redesign your IT systems to accommodate CBE.

Like many CBE fans, these guys are wrestling with the challenge of collecting tons of data, crunching it, making it transparent to students and teachers, and using it to make quick decisions about what should happen next in the student's education.

I'm hearing and reading the stories from teachers on the ground, in classrooms that are in part or in full running CBE, and they all seem to be about getting data through the bottleneck. Teachers who spend hours plugging test/quiz/worksheet scores into their platform. Teachers who maintain data walls on steroids so that students can walk into the room and first thing in the morning see where they are on the standards matrix and task completion matrix. Teachers who are directed to keep the students on those iPads for a significant portion of the day.

Computers become attractive in a CBE approach not because they do a better job of teaching (they don't) or because they are more engaging for students (they aren't) but because nothing else can compare for the speed and efficiency of gathering up the data. To wait for a human to process, score, record, and do data entry on class sets of papers-- that's just too long, too inefficient (plus, if those teachers haven't been properly freed from the tyranny of a union, they might balk at being required to put in fourteen hour days just so they can handle their hours of data entry).

So once again, the technology isn't there to serve education or the students, but to serve the people who think their program is magical. Only computers can clear the data bottleneck and get that sweet, sweet data flowing, and if that means we have to design all tests and worksheets and lessons and objectives so that they are the kind of thing that a computer can easily handle as opposed to, say, the kind of things that actually educate students-- well, the needs of the system outweigh the needs of the humans involved in it.

That's why CBE is destined to be nothing but OBE dressed up as the biggest cyber-school ever. It may not be great education, but at least the data trains run on time.

Chugach, Alaska

As reformy advocacy shifts toward promotion of Competency Based Education (or Proficiency Based Learning-- they have really got to settle on the set of buzzwords they want to use), we are going to hear now and then about a magical place in Alaska-- the Chugach School District.

Back in the nineties, when Objective Based Education (the previous iteration of CBE) was all the rage, Chugach signed up in a big way. They developed an OBE system that now bills itself as the first competency based school district in the country. When Edutopia visited in 2007, they found a system that was the pinnacle of performance-based learning. The district had over a thousand standards, and students had to achieve mastery of each before moving on to the next. Students also design their own projects and a "school-to-life" plan. And the Voyage to Excellence program is a self-directed process with a big vocational-technical flavor. The leader of the district during the switch repeated one of the mantras of OBE:

"Time was the constant and learning was the variable -- that's the old model," says Roger Sampson, president of the Education Commission of the States, who led Chugach's transformation as district superintendent in the 1990s. "We switched. What's constant is learning. Time is the variable."

Or as is noted elsewhere in the article:

Even as globalization and media propel our culture -- and our classrooms -- toward modes of production that are bigger, faster, and more alike, Chugach has refocused on an approach to education that is smaller, personalized, and variably paced. As Douglas Penn, the districtwide principal, explains, "Our kids graduate when they're ready. We're not pumping them out the door with D's on their diplomas."

And "graduate when they're ready" means just what it says. When the district won a Malcolm Baldrige Quality Award in 2001, the write-up noted that students might graduate when they were fourteen or when they were twenty-one.

The accolades have been steady. Here's a piece from the Johns Hopkins School of Education, written by Wendy Battino, a teacher-principal with Chugach who went on to join the Re-Inventing Schools Coalition (as well as a career as a life coach). Here's an EdSurge paean. But, boy-- nobody loves Chugach like CompetencyWorks.org, which ran a five-part series in January of 2015 (here's part five).

In 2001, then-superintendent Richard DeLorenzo had this to say about the district's vision and their place in the educational firmament:

Education is in a crisis due to the fact that we must now educate all students regardless of their potential or socio-economic status to some degree of excellence. Relying on traditional methodology and practice will only lead to tinkering with mediocrity where we fail to meet the needs of individuals. In order to accomplish excellence we need to radically alter what we teach and how we teach. We at Chugach have undertaken this journey and have dismantled many of the barriers that were once thought unapproachable to reach excellence in education. We have endured many hardships and disappointments and yet we still proceed with this tiresome journey because every student deserves the chance to be successful and share the opportunity to reach their full potential.

So-- yay! Dismantling barriers. The end of "tinkering with mediocrity." I can see the appeal to reformsters. But after twenty years, the system seems to be working in Chugach Schools. Could it be a model system for the rest of us?

Well....

Here are some things to know about the Chugach School District.

* The largely rural district covers about 22,000 square miles, including some square miles which are islands.
* The number of students in the system has ranged from 150 to 300, depending on the year. The district markets itself to students outside its geographical boundaries.
* 77% of the students are homeschooled.
* The district generally employs fewer than twenty full-time faculty.

Let's set aside the argument about "mastery learning" for a moment (at exactly what point does one declare that a student has "mastered" reading?). We'll also set aside some questions about whether Chugach really did involve all stakeholders as their Baldrige write-up suggests, or whether this researcher was correct to conclude that the political maneuvering of a ham-fisted "visionary" drove the bus. Let's just check this idea for scalability.

Let's imagine, for instance, Chicago, where students (public and charter) run around 400,000. Exactly what would a system where 400,000 students pursued 1,000 objectives independently look like? Would we, like Chugach, have 300,000 of those students home schooled, so that their families are responsible for making sure the student stays on task? Chugach requires students working on certain types of projects to contact and get advice from professionals. So if 25,000 Chicago students decide they want to do a photography project, where will all 25K turn for advice?

The system allows students to finish whenever they get there. How would that play out in a poor urban setting where there are already so many obstacles to school completion? What does a bright fourteen year old who has breezed through all the performance tasks and graduated "early" do next?

How does a staff of teachers monitor 400,000 students all working at their own pace? And how do parents react when they learn, as Chugach parents have, that at any given point, every child's report card may look different?

What does it do to the cohesion and culture of a school when students must choose between moving forward to their next standard and staying with their friends? How badly does it crush a child's confidence to be among those "left behind"? I'm not asking because I'm afraid students might feel bad, but because I know these kinds of blows to the ego and self really interfere with learning. With a predominantly homeschooled population, Chugach provides no window on how this kind of system affects the culture inside a building.

For reformsters who love CBE, Chugach is a model of how paradisey the competency based model can be. But to me, it's just one more example of how one size doesn't fit all, and that the continued search for a magical school approach that can be applied to any district anywhere is a fool's errand. Chugach is a very unique system with very unique challenges that has landed on a very unique solution.

Chugach's approach may very well work for Chugach, a very rural district of a very few, predominantly home-schooled students.  But if someone starts telling me that Chugach is a reason to believe that CBE will be awesome everywhere, I'm going to assume that they are more interested in selling snake oil than helping schools.





Sunday, November 29, 2015

ICYMI: Some Sunday Edureads

It will be a quickie this week-- I have both of my children home and a grandson's birthday party to attend!

Eva Moskowitz Cannot Help Herself

Daniel Katz provides one of the best overviews of Moskowitz's ongoing meltdown. A study in how privilege, money and power can make you blind to how your behavior is playing in the real world.

How Twisted Early Childhood Education Has Become

Early childhood ed has arguably been more badly damaged by reformsters than any other segment of the education biz. Sometimes it helps to have someone take a step back, show how far off track we have gotten, and help you realize you're not crazy for thinking we're getting early childhood ed completely wrong at this point.

Competency Based Ed: The Culmination of the Common Core Agenda

A good collection of the many pieces and points of view springing up as CBE becomes the newest topic of the education debates.

Five Perspectives on Student Fragility

At Psychology Today, Peter Gray has been running a series about the increasingly fragile nature of our students, including theories about the source. This latest installment is interesting because it includes the many, many reactions from various stakeholders in that discussion.

Are You Being Served?

Nobody combines humor and actual journalism better than Jennifer Berkshire at Edushyster. Here's a look at the facts of which students Boston charter schools are actually serving.

Saturday, November 28, 2015

Remember Outcome Based Education?

Because of massive technological, economic and social changes, we are challenged to boost standards of student performance substantially, especially among those who in the past were least successful. The educational sector apparently will not have more money, so we cannot expect salaries to be more attractive or other resources more plentiful. The alternative, say thoughtful observers, is to restructure. 

That quote comes from Ron Brandt, the Executive Editor for the Association for Supervision and Curriculum Development. In 1994. It comes from Brandt's introduction to the book Outcome Based Education: Critical Issues and Answers, by William Spady. More about him in a moment.

If you are a teacher of a Certain Age, you remember Outcome Based Education. OBE started popping up in the US in the early 90s. One of its features was a certain vagueness (Brandt wrote in an ASCD overview that "OBE is more of a philosophy than a uniform set of practices"). But now that Competency Based Education is auditioning for Educational Thing Du Jour, pulling out the OBE notebook seems apropos.

OBE attracted my attention when it first appeared because it sounded suspiciously like Management By Objectives, a management technique developed by Peter Drucker. Watching old insights from MBO appear in OBE first led to my theory that when management consultants have finally saturated the business market, they go through their materials, cross out the biz buzzwords, and pencil in education jargon and voila!-- they are back in business.

But if OBE was transmogrified from the business world, so what? Was it any good?

The central philosophical shift was to move from time-based schooling to objective-based. In other words, the traditional constant in school is time, and the variable is learning. We only have 180 days-- how much can we get done in those days? OBE said, "Let's list what learning objectives we want the students to achieve, and time will be the variable."

The self-proclaimed father of OBE is the above-mentioned Bill Spady, a sociologist who started pioneering OBE in the mid-eighties. He became the director of the International Center on Outcome-Based Restructuring, and continues to work in education today. If you really want to know all about Spady, a John Anthony Hader wrote his dissertation about Spady and his work.

Spady was notoriously unwilling to give exact instructions for setting up OBE, insisting that objectives had to be locally developed. But he did lay down some guiding principles, some of which are listed here by his colleague Brandt. 

* Clarity of focus. Your outcome has to be focused and specific.
* Design down, deliver up. Work backwards from your objective to design programs, but work toward the objective from wherever the students are.
* High expectations. Specifically (if we heard this once, we heard it a million times) believe that all students can learn all.
* Expanded opportunities. Provide students many chances and many ways to show they have achieved the objective.

Additionally, OBE acquired various corollaries, implications, and add-ons. If we were going to insist that all students can learn all, then we had better settle on objectives that all students can learn (let the dumbing down begin). For some reason, cooperative learning became closely tied to OBE in many regions. And the prospect of wreaking havoc with the school year-- headaches! If Chris can meet all objectives by Christmas, can Chris then go home? Or does Chris just start the next "grade"? And what if Chris is still not getting it in July-- does Chris's school year continue until the last objective is met? Logistically, how does that even work? And how do you write a teacher contract that says, "Depending on how well you do, you are hired for something between 100 and 300 days." Or do you just pay teachers for piecework ($100 per every student objective met)?

Objectives themselves were problematic. This was the dawn of TSWBAT (the student will be able to...) which meant that every single objective had to be paired with some observable student behavior. This has eternally been an educational challenge (did Chris learn to understand the Iliad, or did Chris figure out how to act like Chris understands). But OBE threw its weight on the side of observable behavior, encouraging teachers to require student performance rather than teacher inquiry to assess.

OBE caught on big time, until-- and I say this with both pride and shame-- Pennsylvania broke it.

Pennsylvania was poised to weave OBE into the warp and woof of state education regulation. Many of us went to professional development sessions to prepare us for the Big Shift. But instead, this time, shift never happened.

Some of it was not Pennsylvania's fault. The OBE fans had missed one of the implications of their own work, which was that the objectives would need to be clearly measurable. Instead, various versions of OBE were peppered with what we now call non-cognitive objectives. And not just non-cognitive, but politically charged as well. Here are some contributions to the genre:

All students understand and appreciate their worth as unique and capable individuals, and exhibit self-esteem.

All students apply the fundamentals of consumer behavior to managing available resources to provide for personal and family needs.

All students make environmentally sound decisions in their personal and civic lives.

OBE programs had a variety of objectives like these, and conservatives freaked. Rush Limbaugh, Bill Bennett, Pat Robertson and most especially Phyllis Schlafly were sure that OBE was here to socially engineer your child into some bleeding heart gay-loving liberal twinkie.

OBE was also vulnerable because there wasn't a lick of evidence or research to indicate that it actually worked. And because it was focused on locally-selected objectives that could be met in a variety of ways, there wasn't even any way to tell if it was working at all.

Opponents were also taken aback by the electronic portfolio. OBE demanded a portfolio system in which the many and varied objective-meeting projects of students could be gathered, but then some computer-enamored mook decided that an electronic portfolio, that could be stored in perpetuity and could follow the students anywhere-- that would be cool! Is any of this starting to sound vaguely familiar?

And because Spady and his brethren refused to give specific instructions, OBE looked like a thousand different things, some of which seemed directly contradictory. In Pennsylvania, the initial version of OBE state education regs included roughly 550 objectives. According to Hader's oral history, Spady told them they were about 540 off; the education department rapidly backpedaled while begging Spady to come write the objectives for them. Then Peg Luksik activated her formidable army of conservatives to attack OBE, and the whole business started to collapse. Pennsylvania broke OBE, and it never quite recovered.

When I started to hear about Performance/Competency Based Education, I initially thought that it would be the reheated leftovers of OBE. I cringed, because I remember the training and the insistence that all students can learn everything and the crazy barrage of ever-shifting state directives. Pennsylvania's OBE initiative came at the end of my first decade in the classroom, and it marked the point at which I suddenly realized that the policy leaders and educational bureaucrats on the state level might not know what the hell they were talking about. But I also remembered OBE's complete and utter collapse and thought, "Well, this will die quickly."

But CBE turns out to be a different sort of OBE, an OBE with its holes plugged by sweet, sweet technology and its foundation shored up with Common Core college and career ready standards. Where OBE was all loosey goosey with whatever standards and objectives the locals wanted, CBE will hand you a list of standards/objectives already in place-- and some vendors will throw in the assessments and performance tasks, the software to measure them, the recording of the results, and the use of those results to decide which pre-packaged lesson your student should do next.

Technology also aids in the variable-time logistics problem. Now, instead of puzzling over whether behind-on-objective Chris must stay in school through July, we can just get Chris to use internet connections to make the school day fourteen hours long. Of course, we still have the puzzle of what to do if Chris completes an entire grade level's worth of objectives over the weekend.

Technology also ups the ante on that electronic portfolio, the data backpack that will follow your student throughout life. Of course, in some schools that currently means that a teacher's primary function is endless data entry. But since the performance tasks are on the computer, the teacher will be spending far less time teaching anyway.

Most of all, technology underlines the classic problem with OBE-- the notion that education is just learning to perform a series of designated tasks, like a team on the Amazing Race. Education is just working your way down a checklist, and once everything on the list is checked off-- congratulations! You're an educated person! That's all there is to it! Of course, that also takes us back to the problem that killed OBE the last time-- exactly who gets to decide which tasks go on that checklist?

As I've said, I have my doubts about CBE's chance to take over the education world. Its resemblance to OBE doesn't improve my estimation of its odds. 

Friday, November 27, 2015

Can Competency Based Education Be Stopped?

Over at StopCommonCoreNYS, you can find the most up-to-date cataloging of the analysis of, reaction to, and outcry over Competency Based Education.

Critics are correct in saying that CBE has been coming down the pike for a while. Pearson released an 88-page opus about the Assessment Renaissance almost a year ago (you can read much about it starting here). Critics noted way back in March of 2014 (okay, I'm the one who noted it) that Common Core standards could be better understood as data tags. And Knewton, Pearson's data-collecting wing, was explaining how it would all work back in 2012.

Every single thing a student does would be recorded, cataloged, tagged, bagged, and tossed into the bowels of the data mine, where computers will crunch data and spit out a "personalized" version of their pre-built educational program.

Right now seems like the opportune moment for selling this program, because it can be marketed as an alternative to the Big Standardized Tests which have been crushed near to death under the wheel of public opinion. "We'll stop giving your children these stupid tests," the reformsters declare. "Just let us monitor every single thing they do every day of the year."

It's not that I don't think CBE is a terrible idea-- I do. And it's not that I don't have a healthy respect for and fear of this next wave of reformy nonsense. But I can't shake the feeling that while reformsters think they have come up with the next generation iPhone, they're actually trying to sell us a quadrophonic laser disc player.

From a sales perspective, CBE has several huge problems.

Been There, Done That

Teaching machines first cropped up in the twenties, running multiple choice questions and BF Skinner-flavored drill. Ever since, the teaching machine concept has kept popping up with regularity, using whatever technology was at hand to enact the notion that students can be programmed to be educated just like a rat can be programmed to run a maze.

Remember when teaching machines caught on and swept the nation because they provided educational results that parents and students loved? Yeah, neither does anybody else, because it never happened. The teaching machine concept has been tried, each time accompanied with a chorus of technocrats saying, "Well, the last time we couldn't collect and act on enough data, but now we've solved that problem."

Well, that was never the problem. The problem is that students aren't lab rats and education isn't about learning to run a maze. The most recent iteration of this sad, cramped view of humans and education was the Rocketship Academy chain, a school system built on strapping students to screens that would collect data and deliver personalized learning. They were going to change the whole educational world. And then they didn't.

Point is, we've been trying variations on this scheme for almost 100 years, and it has never caught on. It has never won broad support. It has never been a hit.

Uncle Sam's Big Fat Brotherly Hands

Remember how inBloom had to throw up its hands in defeat because the parents of New York State would not stand for the extensive, unsecured and uncontrolled data mining of their children? inBloom tried to swear that the kind of data mining and privacy violation and unmonitored data sharing that parents feared just wouldn't happen on their watch. But the CBE sales pitch doesn't even pretend to protect students against extensive data collection and wide data sharing-- CBE claims the data grubbing is not only not a danger, but is actually a valued feature of the program.

The people who thought inBloom was a violation of privacy and the people who thought Common Core was a gross federal overreach-- those people haven't suddenly disappeared. Not only that, but when those earlier assaults on education happened, people were uneducated and unorganized-- they didn't yet fully grasp what was actually happening and they didn't have any organizations or other aggrieved folks to reach out to. Now all the networks and homework are already done and in place.

I don't envision folks watching CBE's big data-grabbing minions coming to town and greeting them as liberators. CBE is more of what many many many people already oppose.

No Successes To Speak Of

This has always been a problem for reformsters. "Give me that straw," they say, "and I will spin it into gold." They've had a chance to prove themselves with every combination of programs they could ask for, and they have no successes to point to. Remember all those cool things Common Core would accomplish? Or the magic of national standardized testing? The only people who have made a respectable job of touting success are the charteristas-- and that's not because they've actually been successful, but because they've mustered enough anecdotes and data points to cobble together effective marketing. It's lies, but it's effective.


Everything else? Bupkus. This will be no different. CBE will be piloted somewhere, and it will fail. It will fail because its foundation combines ignorance of what education is, how education works, and how human beings work.

Anchored to What?

A CBE system needs to be linked to some sort of national standards, but only those who have been very well paid or have a deep commitment to them are still even speaking the name of Common Core. To bag and tag a nation's worth of data, you must have common tags. But we've already allowed states to drift off into their own definitions of success, their own tests, their own benchmarks. Saying, "Hey, let's all get on the same page" is not quite as compelling as it once was, because we've tried it and it sucked. As the probable successor to ESEA makes clear, centralized standardization of education is not a winning stance these days. So to what will the CBE be anchored?

Expensive As Hell

Remember how expensive it was to buy all new books and get enough computers so that every kid could take a BS Test? You can bet that taxpayers do. Those would be the same taxpayers who saw programs and teachers cut from their schools even as there was money, somehow, for expensive but unnecessary new texts and computers (which in some cases could be used only for testing).

When policy makers announce, "Yeah, here's all the stuff you need to buy in order to get with the CBE program," taxpayers are going to have words to say, and they won't be happy, sweet words.

If every single worksheet, test, daily assessment, check for understanding, etc is going to go through the computer, that means tons of data entry OR tons of materials on the computers, through the network, etc etc etc. The kind of IT system required by a CBE system would be daunting to many network IT guys in the private sector (all of whom are getting paid way more than a school district's IT department). It will be time-consuming, buggy, and consequently costly.

Who wants to be the superintendent who has to say, "We're cutting more music and language programs because we need the money to make sure that every piece of work your child does is recorded in a central database"? Not I.

Program Fatigue

For the first time, the general taxpaying public may really get what teachers are feeling when they roll their eyes and say, "A NEW program? Even though we haven't really finished setting up the old one?!"  

Bottom Line

I think that CBE is bad education and it needs to be opposed at every turn. But I also think that reformsters are severely miscalculating just how hard a sell it's going to be. We can help make it difficult by educating the public.

There will be problems. In particular, CBE will be a windfall for the charter industry if they play their cards right. The new administration will play a role in marketing this, and I see no reason to imagine that any of the candidates won't help if they win. (Well, Sanders might stand up to the corporate grabbiness of it, and Trump will just blow up all the schools.)

But there will be huge challenges for the folks who want to sell us this Grade C War Surplus Baloney. It's more of a product that nobody wanted in the first place. We just have to keep reminding them why they didn't like it.

Is the Teacher Shortage Real?

We talk a lot about the current teacher shortage. I've posted about it numerous times. But the question remains-- is there really a teacher shortage?

A study released this month by the National Center for Education Statistics suggests that everything we think we know about the Great Teacher Shortage is wrong. Or at least, it was wrong as of four years ago. The study is pretty straightforward, and it's worth making a note of.

The writers are Nat Malkus of the American Institutes for Research with Kathleen Mulvaney Hoyer and Dinah Sparks of Activate Research, Inc. AIR is also in the test manufacturing biz (SBA is their baby) and Activate is a "woman-owned small business" in the metro DC area focusing on research and policy. They created the report under the aegis of NCES, an arm of the USED Institute of Education Sciences, so while none of these are without blemish, this is not another Gates-funded fake research project.

The report looks at four samples of data from the 1999-2000, 2003-2004, 2007-2008, and 2011-2012 school years, looking for answers to fairly straightforward questions:

1) What percentage of schools reported teaching vacancies or hard-to-fill spots?

2) What percentage of schools reported these hard-to-fill positions in particular subject areas?

3) Did persistent hard-to-fill spots correlate to any school characteristics?

The report is easy to read through and contains lots of charts, but the answers reached by the researchers are not necessarily what we might expect. Let me just hit the highlights.

The percentage of schools reporting vacancies and hard-to-fill spots in 2011-2012 was down from 1999-2000. In fact it's the lowest of the four years.

Those percentages were also uniformly down for all subject areas, including math and special ed.

High minority schools still experience more staff challenges than low-minority schools, but their 2011-2012 percentages were still dramatically lower than in earlier years.

Title I schools have a harder time than non-Title I schools, but 2011-2012 was still better than all other years (I'm just going to write "pickle" every time this is the case, to save myself some typing.)

Large schools have it harder than small schools, but pickle.

When comparing city, suburban, town and rural schools, the most staff-challenged schools have shifted over the four years. Cities used to lead the challenge, with rural schools having the lowest percentage of staff-challenged schools. In 2011-2012, suburban schools reported the fewest problems. Cities still had the most, but in all four categories, pickle. Big pickle.

I don't know what explains the pickle, and to their credit, the report's writers take a stance of, "We're just here to show you the numbers, not to make wild-ass guesses about why the numbers are what they are." The appendices give some number breakdowns and report on methodology, and while I am no trained stats cruncher, I don't see anything that sets off whopping alarms.

So am I thinking that I'll just stand down because the teacher shortage turns out to be all in my head? No. No, I'm not.

First of all, I have a certain amount of trust in my head, so I don't just throw away my head's ideas willy-nilly. I am, however, open to the notion that the teacher shortage is partly an artifact of the media's tendency to focus on a story thread and magnify it (e.g. the great shark summer of 2001).

In Pennsylvania, I know exactly why the numbers would reflect a not-shortage of teachers-- we've been shedding jobs left and right, dropping 2000-5000 teacher jobs (depending on who's counting) every year for several years. This is doing a great job of setting the stage for a teacher shortage, as college students repeatedly declare a major in Anything But Teaching. The ABT major is actually leading some college ed departments to shrink or collapse. The choking off of the teacher pipeline sets the stage for a combination of overcrowded classrooms and an actual teacher shortage.

My reading of teacher shortage bulletins is that teacher shortages are highly localized, and while the study's sampling of around 8,000 districts would ordinarily be plenty, I have to believe that the specific samples could make a huge difference.

But mostly what these results say to me is, "Holy smokes! We have plunged into a bad place very quickly over the last four years!"

Take for instance Scott Walker's Wisconsin. Here's a piece that lists the growing effects of Walker's gutting of the state's education system-- from November of 2011. In other words, the most recent data sampling in the study was being gathered just as Wisconsin schools were starting to feel the crunch. Quick quiz: have things gotten better or worse in Wisconsin since 2011?

Or North Carolina, another state that moved rapidly from a progressive education-supporting agenda to a state intent on driving teachers out.

Over the past four years, things have gotten far worse pretty quickly in schools across the country, from Race to the Top to Common Core testing. And in 2011, schools were seeing the last of the federal stimulus money that had allowed them to keep hiring. When the stimulus money ran out, many districts started cutting staff to match.


Take a look at this snip from the fed's chart on teacher employment. The first column is total teachers, column two is public, and column three is private (numbers are in thousands of teachers). 2011 is the last year for which we have hard numbers. Note that teacher employment peaked in 2008, and we've been declining since. Nothing like cutting 100,000 jobs to help reduce the number of vacancies you're trying to fill. Put another way, the study shows that 1999 was the worst in terms of unfilled jobs, but as we added more teachers, the vacancy percentages dropped. But then the last drop coincides with a drop in number of jobs to be filled. There are two ways to solve an unfilled vacancy problem, and we have now tried both. Which approach do you think is more likely to fix things in the long run?


I'm saving a link to this study, because I believe it sets the stage for what's to come. I expect that when the next data set is added from further inside the reformy abyss, we'll see charts with upward hooks. I believe that the story will be, "Well, things were getting better, but then ed reform switched into overdrive, and it all went to hell pretty quickly." In short, nothing in this report contradicts the perception that a troublesome teacher shortage has appeared in the last four years.

I get that the Teacher Shortage is a complicated issue, for reasons including the desire of everybody on every side of the education debates to use talk of the shortage to support whatever point they'd like to make. But this new report definitely doesn't make me think that everything's actually okay, and I look forward to seeing more data when it finally appears.

Accelerated Reader Research Part 2

A little while ago I took a look at this silly piece of faux research from the Accelerated Reader people. But there was one puzzle I couldn't quite solve.

The study was reported as concluding that just a few minutes more reading time would produce fabulous results, but I wondered exactly how the researchers knew how much time the readers had spent on their independent reading.

Much ado is made in the report about the amount of time a student spends on independent reading, but I cannot find anything to indicate how they are arriving at these numbers. How exactly do they know that Chris read fifteen minutes every day but Pat read thirty? There are only a few possible answers, and they all raise huge questions.

In Jill Barshay's Hechinger piece, the phrase "an average of 19 minutes a day on the software" crops up. But surely the independent reading time isn't based on time on the computer-- not when so much independent reading occurs elsewhere.

The student's minutes reading could be self-reported, or parent-reported. But how can we possibly trust those numbers? How many parents or children would accurately report, "Chris hasn't read a single minute all week."

Or those numbers could be based on independent reading time as scheduled by the teacher in the classroom, in which case we're really talking about how a student reads (or doesn't) in a very specific environment that is neither chosen nor controlled by the student. Can we really assume that Chris reading in his comfy chair at home is the same as Chris reading in an uncomfortable school chair next to the window?

Nor is there any way that any of these techniques would consider the quality of reading-- intensely engaged with the text versus staring in the general direction of the page versus skimming quickly for basic facts likely to be on a multiple choice quiz about the text.

The only other possibility I can think of is some sort of implanted electrodes that monitor Chris's brain-level reading activity, and I'm pretty sure we're not there yet. Which means that anybody who wants to tell me that Chris spent nineteen minutes reading (not twenty, and not eighteen) is being ridiculous.


When I was working on the piece, I tweeted at the AR folks to see if they could illuminate me. I didn't get an immediate response, which is not significant, because it's twitter, not registered mail. But I did hear back from them a bit later, and they directed me to this explanation from one of their publications about AR (it's page 36).

The Diagnostic Report also shows a calculation called engaged time. This represents the number of minutes per day a student was actively engaged in reading. To calculate this number, we look at the student’s GE score on STAR Reading and how many points the student has earned by taking AR quizzes. We compare that to the number of points we can expect the student to earn per minute of reading practice. Then we convert the student’s earned points to minutes. 

For example, let’s say Joe Brown has a GE score of 6.5. Our research tells us that a student of his ability can earn 14 points by reading 30 minutes a day for six weeks. Joe has earned only seven points. Thus we estimate Joe’s engaged time to be only 15 minutes a day. 

If a student’s engaged time is significantly lower than the amount of time you schedule for reading practice, investigate why. It could be that classroom routines are inefficient or books may be hard to access. Since low engaged time is tied to a low number of points earned, see the previous page for additional causes and remedies.

So, not any of the things I guessed. Something even worse.

They take the child's score on their proprietary reading skill test, they look at how many points the child scored, and they consult their own best guess at how long a student with that score would take to earn that many points-- and that's how much time the child must have spent reading!
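The back-calculation amounts to a simple proportion, and it's worth seeing just how thin it is. Here's a minimal sketch; the function name and structure are my own assumptions, and only the arithmetic comes from AR's own Joe Brown example:

```python
# Illustrative sketch of AR's "engaged time" back-calculation as described
# in their publication. Only the arithmetic is from their worked example;
# the function name and parameters are hypothetical.

def engaged_minutes_per_day(points_earned: float,
                            expected_points: float,
                            scheduled_minutes: float = 30.0) -> float:
    """Estimate daily reading minutes from quiz points.

    AR assumes a student at a given GE level would earn expected_points
    by reading scheduled_minutes per day over the period, then scales
    the student's actual points by that ratio.
    """
    return scheduled_minutes * (points_earned / expected_points)

# AR's Joe Brown example: a GE 6.5 student is expected to earn 14 points
# from 30 minutes a day over six weeks, but Joe earned only 7 points,
# so AR concludes he must have read 15 minutes a day.
print(engaged_minutes_per_day(points_earned=7, expected_points=14))  # 15.0
```

Note that the student's actual stopwatch minutes never enter the calculation anywhere; the "time" is entirely inferred from quiz points and AR's own expectations.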

What if something doesn't match up? What if the AR reverse-engineered time calculation says that Chris must have taken thirty minutes of reading to get that score, but you gave Chris an hour to read? Well then-- the problem is in your classroom. Chris is lollygagging or piddly-widdling. Or the books are on too high a shelf and it took Chris a half hour to get it. Whatever. The problem is not that AR's calculations are wrong.

And of course this doesn't so much answer the question as push it up the line. Exactly what research tells you that a student with STAR rating X must use fifteen minutes of reading to achieve Y number of points on the AR quiz?

My confidence in the Accelerated Reader program is not growing, and my confidence in their research skills, procedures, or conclusions is rapidly shrinking.