William Bennett appeared on Campbell Brown's reformster PR site to stick up for the Common Core, but he ignores some inconvenient truths in the process.
The first stretcher is in his thesis title: the GOP is wrong to run away from the Common Core-- because the standards are working. "Working" is a heck of a subjective term here, but let's see where he's going, shall we?
He starts with some history, noting that many GOP governors who used to love the Core have decided to dump the standards because it's politically expedient to do so. He is not wrong, but he conveniently ignores parts of the story. Perhaps most notable is that many of these states adopted the standards before the standards were even written. Bennett also gives a head nod to the notion that "federal overreach" sullied the otherwise beauteous standards, as if the standards would have had a chance of adoption without the full force of federal coercion and cash behind them (spoiler alert: they would not have).
So the story is not, "States adopted educational standards because they examined the standards and decided that Common Core would make education in their states great. Now those same state leaders are dumping the Core for crass political reasons."
No, the story is, "Some politicians adopted a policy because they thought it would be politically (and financially) advantageous to do so, and then dropped that policy when it became politically advantageous to do so." This is not a new story, and it is not a surprising story, and the degree to which career politicians pretend to be surprised by it is baffling.
Bennett correctly calls out Chris Christie for the hypocrisy of dismissing the Core without making any "substantive policy changes." That's fair, but again-- adopting the Core was a political gesture, and so is disowning it. I'm shocked-- shocked, I tell you.
Bennett then embarks on a journey of logic-chopping and baloney-slicing.
Christie recognizes that New Jersey still needs tough, internationally benchmarked standards that resemble CCSS.
Well, except that CCSS is not internationally benchmarked, and never has been. And the word "tough" is meaningless rhetoric. Something can be tough and still be a waste of everyone's time, like sitting through the film version of Les Mis or listening to twenty-four straight hours of heavy metal polka music.
Many polls indicate that the American people support higher and more rigorous standards and testing.
Let's pretend that those poll results aren't baloney in their own right. Let's pretend that the word "higher" means something when applied to standards. None of that means that the Common Core (Bennett carefully skips around how the brand name does in the polls) is a hit with anyone. I can say that I am really hungry and would like to eat, but if you bring me a plate of raw liver covered with fried kale, I will still send it back. "But you said you wanted supper," you might say, but you'd be silly to do so.
Bennett then repeats his titular assertion that the Core is "working," which is yet another very vague rhetorical flourish. Does he have evidence?
In a word, no.
Bennett instead brings up the Achieve Honesty Gap report, a report with all sorts of problems, such as treating NAEP as a benchmark test. Oddly enough for Bennett's argument, the Achieve report also doesn't mention the Common Core, ever. Bennett's point is that the Big Standardized Test results are getting more in line with NAEP results. This assumes a great many things, not the least of which is that BS Tests are giving us a real measure of how Core-tastic students are, but since there are many parts of the Core that will never be on the BS Test (collaborative learning, reading full works, and critical thinking, for starters), it seems unlikely that the Core tests are even measuring what they purport to measure.
But Bennett's baloney-fest isn't over.
Christie has every right to call for a review of the standards in New Jersey, in fact, most states review their standards every few years anyway.
(Yes, Bennett seems to want to mostly spank Christie in this piece). Bennett is also conveniently forgetting that the Common Core Standards were carefully constructed NOT to be reviewed every few years or even ever. Set in cement, copyrighted, and with states pledged to add no more than fifteen percent and to change not a whit or tittle, the Core also had no mechanism in place for review or revisit, and the architects left the scene quickly for pricey new gigs.
Given the Core pushback and the lack of any authoritative body to oversee anything, the copyright issue has evaporated. But CCSS was designed not to change a bit, and certainly not to be reviewed by the states. In that sentence, Bennett himself has made the case for dropping the Core.
Bennett also invokes the doctrine of Core Inevitability, a sort of sour grapes argument that says, "Fine, make your own standards. But they will inevitably look like the Common Core because CCSS is so close to the Platonic ideal of education standards that all standards must be a pale shadow of the Core awesomeness." This is a highly charitable and extra-fantastical view of the Common Core Standards, which remain the mediocre, poorly written product of educational amateurs.
Bennett finishes with one more hopeful eruption.
If a state ends up tweaking and renaming the standards, it will be acting in a way that is entirely consistent with how the Common Core was designed to function – as exemplar standards for states to improve and build upon.
Yeah, see above. That is very specifically NOT how the Common Core was designed to function. States were forbidden to improve or build upon CCSS. Bennett is entitled to be bitter and disappointed that the same political winds that once filled CCSS sails have now deserted the SS Common Core. He is not entitled to pretend that the SS Common Core was built to be some sort of mighty, nimble ocean vessel when in fact it was always, from day one, a wobbly, leaky dinghy with a brick for a rudder.
Friday, July 24, 2015
Thursday, May 21, 2015
Is NAEP Really a Benchmark?
The recent Achieve study (the one with the Honesty Gap) is just the most recent example of someone using the NAEP (National Assessment of Educational Progress) as a benchmark test, as the gold standard of Sorting Students Out.
But not everybody agrees that the NAEP (aka "the nation's report card") is a good measure of, well, anything. Google "NAEP fundamentally flawed" (people seem to love that Fundamentally Flawed verbiage when discussing NAEP) and you'll find lots to chew on.
Most debate centers on the test's achievement levels. Folks don't care for how the levels are set; many critics find them irrationally high. In 1999, the National Academy of Sciences released Grading the Nation's Report Card. I can't direct you to a free copy of that to read, but I can summarize second-hand the basic arguments brought against the NAEP.
1) The results don't match the results of AP testing, finding fewer top students than the AP test does.
2) The NAEP gives more weight to open-ended questions.
3) The cut score lines are drawn in vague and hard-to-justify ways. NAS gets as specific as, "You can't tell whether a kid just over the line will or won't answer a particular question correctly."
These arguments are not perfectly convincing. The Center for Public Education found, for instance, that NAEP's people had a pretty clear idea of how they were setting achievement levels.
A more damning report came from NCES way back in 2007, in turn looking back at students and test results in the nineties. That time span allowed researchers to do what folks looking at PARCC or SBAC still have not done-- follow up on the students' later outcomes. Here's a look at what the class of 1992 had done by the time eight years had passed.
| NAEP Score | No Degree | Certificate | Assoc. Degree | Bachelor's Degree or Higher |
|---|---|---|---|---|
| Below Basic | 61.6 | 9.9 | 10.5 | 18.0 |
| Basic | 37.7 | 3.8 | 9.0 | 49.5 |
| Proficient | 18.1 | 0.4 | 2.5 | 79.0 |
| Advanced | 7.5 | 0.2 | 1.3 | 91.1 |
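The table's arithmetic is worth doing explicitly. Here's a quick sketch (percentages transcribed from the table above) of what share of each NAEP level went on to earn any postsecondary credential at all:

```python
# Eight-year outcomes for the class of 1992, by 12th-grade NAEP level.
# Each tuple: (no degree, certificate, associate degree, bachelor's or higher),
# in percent, transcribed from the table above.
outcomes = {
    "Below Basic": (61.6, 9.9, 10.5, 18.0),
    "Basic":       (37.7, 3.8, 9.0, 49.5),
    "Proficient":  (18.1, 0.4, 2.5, 79.0),
    "Advanced":    (7.5, 0.2, 1.3, 91.1),
}

for level, (none, cert, assoc, bachelors) in outcomes.items():
    credentialed = cert + assoc + bachelors  # earned any credential at all
    print(f"{level}: {credentialed:.1f}% earned some credential, "
          f"{bachelors:.1f}% a bachelor's or higher")
```

Run the numbers and the "Basic" row is even more striking than the bachelor's column alone: over 60% of students that NAEP rated merely Basic walked away with some postsecondary credential within eight years.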
Note that nearly 50% of students judged Basic went to college and earned a bachelor's degree. It's almost as if they were, in fact, college and career ready. And in fact that is a frequent complaint about NAEP level setting-- that their "Basic" is everybody else's idea of "Proficient." Which would certainly explain the finding that state tests find far more proficient students than the NAEP does.
By 2009, deep into the reformy swamp, the government asked for another audit of NAEP, and got this report from the Buros Institute at the University of Nebraska. The report had some issues with NAEP as well:
1) No real validity framework, meaning no real framework for determining what the test actually measures nor what the data from the test can actually be used for.
2) The fact that no other tests, including various state tests, found the same results. This suggests that either NAEP has a singular unmatched vision, or it's out of whack.
3) There's no demonstration of alignment between NAEP and state standards and tests, which means using the test for matters such as, say, Achieve's Honesty Gap study, has no basis.
4) All this means that many "stakeholders" don't really know what they're looking at or talking about when it comes to NAEP scores.
My conclusion? The NAEP, like all other standardized tests, best functions as a measure of how well students do at the task of taking this particular standardized test. As soon as you start trying to figure out anything else based on the test results, you're in trouble. That includes writing fancy reports in which you suggest that states have an honesty gap.
Saturday, May 16, 2015
How Big Is The Honesty Gap
Sooo many folks are Deeply Concerned about the Honesty Gap. Just check out twitter:
Parents and educators deserve accurate data about how their students are performing in the classroom: http://t.co/FeIjLPlwZ4 #HonestyGap
— StudentsFirst (@StudentsFirst) May 14, 2015
.@EvanE4E: Gap between state expectations & NAEP confirms need for rigorous, consistent, clear standards http://t.co/P3WiuEt9a6 #HonestyGap
— Educators4Excellence (@Ed4Excellence) May 14, 2015
States are saying students are “proficient” when they're not actually well prepared. We need to fix the #HonestyGap: http://t.co/JbinzeO3aF
— CAP Education (@EdProgress) May 14, 2015
Awesome Products + Dubious Rewards = Bad Experience http://t.co/MdnibIcIZ7 #SurveySweepstakes #HonestyGap
— Customerville (@customerville) June 12, 2014
Oops! That last tweet was apparently about some other Honesty Gap.
The Gappers are repeatedly expressing concern that parents need to know the truth about how their children are doing, specifically whether or not students are ready for college. Apparently everyone in the world is lying to them. Schools and teachers are lying when they assign grades. Even college letters of acceptance are Big Fat Lies. Everyone is lying-- the only possible way to know how your child is doing is to have that child take a Big Standardized Test, and not just any BS Test, but one from our friends at PARCC or SBA. Only those profoundly honest tests will do.
I got into a twitter discussion about this because I asked: if NAEP is the gold standard by which state tests can be measured, why do we need the state tests at all? Because the NAEP only samples, and we need to test every single child so that parents can get feedback. Okay, I asked-- doesn't that mean that the tests serve two different purposes and therefore can't really be compared? No, they can be compared if NAEP disaggregates well. So then why can't we-- well, I don't blame the person on the other end. Trying to have a serious conversation via twitter is like having sex by semaphore.
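The sampling point deserves a number, because it's the crux of the "gold standard" problem. A back-of-envelope sketch, using the standard formula for the margin of error of a sample proportion and entirely hypothetical figures (NAEP's actual design and sample sizes differ):

```python
import math

# Hypothetical: estimate a state's percent-proficient from a NAEP-style sample.
true_proficient = 0.40   # assumed true proportion of proficient students
sample_size = 2500       # a plausible per-state sample, picked for illustration

# Standard error of a sample proportion: sqrt(p * (1 - p) / n)
se = math.sqrt(true_proficient * (1 - true_proficient) / sample_size)
margin = 1.96 * se  # roughly a 95% confidence margin

print(f"margin of error: +/- {margin * 100:.1f} percentage points")
```

A couple thousand sampled students pins down a state-level rate to within about two percentage points, which is exactly why a sampling test works fine for state comparisons and tells an individual parent nothing about an individual kid-- two different purposes, just as the twitter exchange conceded.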
I gather that proof of state honesty would be more students failing, because once again we have an argument that starts with, "We know states suck at education and that students are doing terribly, so we just need to design an instrument that will catch them sucking." It's so much easier to design the right scientific measure if you already know what the answer is supposed to be.
So where is the actual honesty gap?
Is it where Common Core promoters claim that the standards are internationally benchmarked? Is it when CCSS fans suggest that having educational standards leads to national success? Is it when they decry low US test scores without noting that the US has been at the bottom of international test results as long as such things have existed?
Is the honesty gap in view when these folks say that parents need transparent and clear assessments of their children's standing, but what they mean is the kind of vague, opaque score reports actually on offer? You know-- the report that basically gives the child a grade of A, B, C or D on a test whose questions nobody is allowed to see or discuss? Is the honesty gap cracking open even wider every time somebody suggests that a single math-and-reading test can tell us everything we need to know about a child's readiness for college and career?
Are we looking into the abyss of the gap when advocacy groups fail to mention that they are paid to support testing and the Core, or that they stand to make a ton of money from both? Does the honesty gap yawn widely when these folks fail to state plainly, "We think the world would be a better place if we just did away with public education, and we work hard to help make that happen." Is Arne Duncan's voice echoing hollowly from the depths of Honesty Gap Gulch when he suggests that telling an eight-year-old that she's on the college track either can or should be a thing?
It is ballsy as hell for the reformsters, who have been telling lie after lie to sell the CCSS-testing combo for years (oh, remember those golden days of "teachers totally wrote the Common Core"?), to bring up concerns about honesty. I admire their guts; just not their honesty.
They have a hashtag (because, you know, that's how all the kids get their marketing done these days) and I encourage you to use it to add your own observations about where the #HonestyGap actually lies.
Friday, May 15, 2015
Honesty: The Hot New Gap (With Anti-CCSS Bonus)
A new report from Achieve.org doesn't provide a lot of information, but it has opened up a great talking point Gap-- ladies and gentlemen, may we introduce the Honesty Gap!
The report, "Proficient vs. Prepared: Disparities between State Tests and the 2013 National Assessment of Educational Progress (NAEP)" -- well, actually, that title pretty well covers it. Achieve compared Big Standardized Test results to NAEP results.
Achieve, you may recall, was one of the groups instrumental in creating Common Core and foisting it on American schools. So we can't be surprised when their stance is somewhat less than objective.
Today’s economy demands that all young people develop high-level literacy, quantitative reasoning, problem solving, communication, and collaboration skills, all grounded in a rigorous and content-rich K-12 curriculum. Acquiring these skills ensures that high school graduates are academically prepared to pursue the future of their choosing.
Two sentences. The first one sounds lovely, if rather limited, and is an opinion that I'm sure many folks share (at least in part). The second is another iteration of the unproven belief that such a list of qualities will lead to academic preparation. But then, in the next sentence, in bold typeface-- we make a huge, huge leap.
Many state tests, however, continue to mislead the public about whether students are proficient. Parents, students, and teachers deserve transparency and accuracy in public reporting.
This statement assumes and implies that "proficient" is a measure of students' development of the list above. It is not. It is a score from one badly designed, non-validated Big Standardized Test that does not have a hope of measuring any of those high-function skills (not to mention "collaboration," which is of course expressly forbidden).
I do like the call for transparency. Does this mean that Achieve is going to call for an end to the Giant Cone of Secrecy around the test, and that states should no longer be required to serve as enforcement arms for protecting the proprietary rights of test manufacturers over the educational interests of students? No, I didn't think so.
BS Tests are measuring tools that have never been checked. It's like somebody holds up a length of string and says, "Yeah, that is what I imagine a yard should be, more or less" without ever grabbing a yardstick. Now, Achieve is shocked-- shocked!!-- to discover that the various states' pieces of string aren't exactly a yard long.
But their framing of it is, well, exquisite. States that have BS Test scores that come (somehow) in line with their NAEP scores are called the Top Truth-Tellers. The big gap states are not called Top Dirty Rotten Liars, but hey, if the shoe fits. This raises a few questions, such as how one compares the state-level BS Tests with the NAEP (maybe, it seems, just by counting the number who pass or fail).
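The gap arithmetic behind the "truth-teller" rankings, at least as far as one can tell, is simple subtraction. A minimal sketch with hypothetical percent-proficient numbers (not Achieve's actual data):

```python
# Hypothetical percent-proficient figures; the real Honesty Gap numbers differ.
state_test_proficient = {"State A": 72.0, "State B": 45.0, "State C": 38.0}
naep_proficient       = {"State A": 34.0, "State B": 40.0, "State C": 36.0}

# "Honesty gap" = state-test proficiency rate minus NAEP proficiency rate,
# in percentage points. A small gap makes you a "Top Truth-Teller" in the
# report's framing; a big one makes you, well, the other thing.
gaps = {state: state_test_proficient[state] - naep_proficient[state]
        for state in state_test_proficient}

for state, gap in sorted(gaps.items(), key=lambda item: item[1]):
    print(f"{state}: {gap:+.1f} points")
```

Notice everything this subtraction quietly assumes: that the two tests measure the same construct, that their cut scores mean the same thing, and that NAEP's "Proficient" is the right yardstick in the first place-- none of which the report establishes.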
More importantly, it raises this question: if the NAEP is the gold standard for measuring all that cool stuff about student achievement, why don't we just use it and scrap all the state-level BS Tests?
Reformsters are skipping right past that to The Honesty Gap. It's a more formal version of the old assertion that schools and teachers are just lying to their students and ed reform has to include telling parents and students that they and their schools and their teachers all suck.
Not surprisingly, the Honesty Gap has shown up in pieces by Mike Petrilli at Fordham and at the Reformster Website To Which I Will Not Link. And those pieces are not a surprise because the Honesty Gap has recently launched its very own website!! Woo hoo!! That website was launched by The Collaborative for Student Success, an advocacy group with most excellently reformy partners,
including the Fordham Foundation, the US Chamber, and even-- oh, look! Also Achievethecore.org. All of which explains why Honesty Gap uses much of the same rhetoric to highlight the data from the Achieve.org report.
[Update: Oh, wow. The full-scale product rollout includes a new hashtag #HonestyGap on twitter, where you can find all your favorite reformy hucksters tweeting about how parents deserve the truth!]
Man-- it's like the group is so loaded with money that every time they want to launch a new talking point, they give it its own glitzy website. Meanwhile, I am typing about it while eating my convenience store fiesta chicken wrap at lunch. It's an amazing world.
So what's the end game of this particular self-supporting PR blitz? Maybe the secret is here in the third of the Achieve report's "findings"--
A number of states have been working to address proficiency gaps; this year, even more will do so by administering the college- and career-ready-aligned Smarter Balanced and PARCC assessments.
The dream of a national assessment, a BS Test that waves its flag from shore to shore-- that dream still lives! See, states? You insisted on launching your own test and dropping out of PARCC/SBA and that's just cause you're lying liars who lie the giant big lies. Come back home to the warm bosom of a giant, national scale test!
Here's one funny thing about the Achieve report. There's a term that does turn up on the Honesty Gap website, but in twelve pages of the original Achieve report about being prepared and proficient etc etc, these words do not appear once-- Common Core.
It's funny. Even a year ago, I hated the Core pretty passionately. But I start to feel sorry for it-- given the need to choose between Core and charters, Core and political advantage, or Core and testing, people keep picking the Core last. Poor orphaned useless piece of junk.