
Sunday, May 10, 2026

ICYMI: Mothers' Day 2026 Edition (5/10)

A Happy Mothers' Day to those who are able to celebrate. There are many reasons not to want to join in today, and if that's you, then may the day pass with a minimum of hurt. 

Here's your reading list for the week. Remember-- if it speaks to you, share it.

Teachers! Appreciate Them Now. They Soon Could Be Gone!

Wrap up Teacher Appreciation Week with Nancy Bailey's list of the many ways that corporate reformsters have tried to make the lives of teachers miserable.

Is Moms for Liberty paying local chapters to attend monthly Zoom calls?

Moms for Liberty continue to fudge the definition of "grass roots organization." Kate LaGrone at WPTV reports that the Moms need a little help getting people to show up for meetings.

Voucher programs fail rural schools

The Economic Policy Institute explains why taxpayer-funded voucher programs are especially hard on rural school districts.

Congress Is Broken and Unpopular: Here Are 12 Reforms Children and Families Need

Bruce Lesley outlines some reforms that the current lousy version of Congress could pass to make life better for children and families.

The Blueprint for American Censorship

Mrs. Frazzled with some useful history on book banning in the USA, including Anthony Comstock.

Bill Lee's School Funding Formula Leads Tennessee to the Bottom in School Funding

When Tennessee's not busy making sure that voters in Memphis don't matter, they are making sure that they spend as little as possible on schools. Andy Spears explains.

Another death in the AI-in-education family

One more crappy AI educational "aid" appears to have bitten the dust. So long, OpenAI Study Mode. Benjamin Riley has the details.


Thomas Ultican is bugged by all the billionaires still trying to sell vouchers.

One Big Beautiful Bill’s Child Tax Credit Still Leaves America’s Poorest Children Far Behind

It's been a year and we still haven't tracked all the ways the big dumb bill is sticking it to the non-wealthy. Jan Resseger explores some of them.

Fabricated citations: an audit across 2.5 million biomedical papers

A research paper at The Lancet. And yikes.


From Mark Paglia at McSweeney's. “Simply wearing a small red letter A is no great burden, and it would infringe upon the free speech of the rest of the town were Hester Prynne not to wear it.”

It was a busy week at Forbes.com


- A complaint about the use of "student achievement" when what researchers mean is "test scores"

My mother always liked this song when we were all younger, so this goes out to her today.



Subscribe for free!

Sunday, May 3, 2026

ICYMI: Essay Contest Edition (5/3)

 Once a year, I'm the director of a local writing competition for high school students in the various school districts of the county. The competition is in honor of one of the giants of English teaching in our area; she graduated from here, worked in the original OSS, became a lady CEO, taught English, and left the classroom only because there was such a thing as a mandatory retirement age (you can read about her here). 

The contest has run for thirty-some years, and it is precisely the sort of thing that cheatbots make challenging, though historically our winners write way better than bots do, and I work hard to design a bot-resistant prompt. But it's a fun time for me-- part of my duties include being first reader and culling the hundreds of entries down to a manageable stack for table judges. 

So that has been my week. But I still have a reading list for you. 

The Atlantic Platforms Charter School Propaganda: Anti-Woke Edition

Paul Thomas responds to the Atlantic piece about how awesome charters are and how anti-racism is killing public schools. 

Oligarchs and Christian Nationalists Aim to Plunder Massachusetts Public Schools

Maurice Cunningham peels back the masks on another Massachusetts assault on public education, and reminds us that National Parents Union is none of those three things.

AI gives more praise, less criticism to Black students

Lots of implications to mull over in this finding, written up by Jill Barshay at Hechinger Report.


Center on Budget and Policy Priorities has a nifty bar chart that lays out in quick and simple manner where the taxpayer-funded vouchers are actually going.

Epic founders Harris, Chaney bound for criminal trial as 2-year preliminary hearing ends

One of the nation's major charter school scams might actually result in jail time for the scammers who pocketed $22 million of taxpayer dollars in their massive fraud.

Why We Are Suing the Department of Education

It's not just that the Office of Civil Rights in the Education Department has decided only the civil rights of white guys are being threatened-- it's that they're being anti-transparent about what they are and are not doing. ProPublica has sued, and here explains why.


Don't know how I missed this last week, but this New Yorker piece from Jessica Winter is well worth the read (if you can get to it).

The Big Tech Backlash

Jennifer Berkshire looks at some of the pushback against ed tech, including some of the surprising places it's turning up.

We Created the Lotus Eaters

Matt Brady writes about the students who are comfortable non-starters, and how to get them back into work.

I Write the Songs

On songwriting, music teaching, and mistakes. From Nancy Flanagan.

Broken Record

Audrey Watters finds herself writing about the same stuff, again, again. And yet, it is stuff that needs to be said, again, again.

Seniors and Kids as Profit Centers: Medicare Advantage and School Vouchers Exploit Both

Bruce Lesley explains how Medicare Advantage and school vouchers are manifesting the same philosophy to harvest profits (and provide minimum service).

Ohioans: Please Do Not Sign Petition to Get Referendum to End Property Taxes on the November Ballot

Jan Resseger has an important message for folks in Ohio.

Standardized testing and scripted lessons are failing both teachers and students

Johnathan Kantrowitz is talking about Australia in this post, but some of the description sure sounds familiar (including panic over declining test scores).

The Testing Ritual and the Steakhouse Reality

Testing, staffing, and working lunches-- TC Weber looks at it all with one raised eyebrow and more than a few questions.


There has been a lot of noise and wrangling over the New York City schools' attempt to craft AI guidance, and while I don't generally look to NYC for guidance on anything, these five objections from Leonie Haimson are an excellent guide to the sort of questions you should be asking about your local school district's attempt to cope with AI. If you want more, Chalkbeat covers the parent rebellion here.

Kent State President claps back at Vivek. It's about damn time.

A university leader actually calls out a politician's dumb ideas. More of this, please. Stephen Dyer has the details.

At Forbes.com this week, I wrote about some important characteristics of rural schools

I don't love the Black Eyed Peas, but I do like an unexpected team-up.


Monday, April 27, 2026

AI Is Not For Amateurs

Ben Riley has drawn a lot of attention lately for the story of his father, who turned to AI for advice on how to manage his cancer, and died because of it. Riley gets into the experience of being a New York Times story subject in a recent post, and looks into the reporter's idea of showing oncologists the advice the AI was providing. Riley shares their responses, and even for AI, it is shockingly, horribly wrong.

A trained cancer doctor would recognize that it was nonsense. An amateur might be fooled by how AI manages to mimic the look and feel of a real medical report.


This points to a recurring theme in AI use. The "human in the loop" principle is all about including a human being who can actually understand-- and check-- the AI output. Or consider one of the more popular AI assignments for students-- have an LLM write about a topic you know well, and count up all the mistakes it makes. In other words, experts.

Large Language Models can perfectly mimic form and confidence. They have, literally, no shame-- less shame than even the most shameless bullshit artist that ever sold you some Florida real estate or a White House super-duper ballroom. They are elegantly mechanized Dunning-Kruger machines.

I recently sat and talked to someone who works in the computer tech and coding world and describes himself as a power user of AI. AI does save him and his team time, but there are caveats. AI doesn't remember what it has done. "It's like talking to a smart person with Alzheimer's." And it is not trustworthy. The project has to be broken down into chunks, and then each chunk has to be run through testing, designed by and/or involving a human coder in order to determine if the code actually does what it is supposed to do. The resulting process is still faster than the old all-human approach, but it still requires the involvement of humans with expertise to check the work, go back, re-do, check again, and on and on. It is most definitely not "Press a button and an hour later a fully-completed project is ready to go."
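The workflow he describes-- break the project into chunks, then gate each chunk behind human-designed tests before trusting it-- has a simple shape. Here's a minimal sketch of that loop; the function and the test cases are hypothetical stand-ins, just to illustrate the process, not anything from his actual projects:

```python
# A minimal sketch of the "chunk it, then test it" workflow.
# slugify() stands in for one small bot-generated chunk of code;
# human_written_checks() is the gate an expert writes before
# accepting the chunk into the project.

def slugify(title: str) -> str:
    """Pretend this body was produced by an LLM for one small chunk:
    turn a title into a lowercase, hyphen-separated slug."""
    return "-".join(title.lower().split())

def human_written_checks() -> bool:
    """A human expert writes these checks, including the edge cases
    the bot was never explicitly asked about."""
    cases = {
        "Hello World": "hello-world",
        "  padded   input ": "padded-input",  # messy whitespace edge case
        "already-fine": "already-fine",       # no-op case
    }
    return all(slugify(raw) == want for raw, want in cases.items())

# Only a chunk that passes goes into the project; a failure means
# sending the chunk back to the bot (or a human) for rework.
print("chunk accepted" if human_written_checks() else "rework chunk")
```

The point of the sketch is where the expertise lives: the bot produces the chunk, but a human who knows what "correct" means has to write the checks, and the loop repeats for every chunk.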

The conversation raised lots of questions for me. If the AI is doing all the entry-level grunt work under the watchful expert eye of human accountability sinks, then where will the future expert eyes come from? 

I'm also thinking of all those folks happily burbling "I use AI to write my journalism-flavored content" or "I use AI to write my lesson plans," and wondering if their process looks similar, if they are taking the bot through building up a lesson plan step by step, carefully examining each product every step of the way with their own expert eyes. Because I'm betting not. 

Because while coding involves a lot of time-intensive grunt work hours that can be collapsed by AI, writing things does not. Doing the thinking work (outlines, brainstorming, etc.) is how you get ready for the writing work, and that includes writing a lesson plan. If you have the AI write the outline, you still have to do the thinking part. In short, if you use the bot to write your lesson plan in a responsible, professional manner, I don't see it saving you much time.

In fact, if you really are an expert, I'm betting that lesson plans or writing by bot, if done well, will actually take more time than just doing it yourself. The people happily botting their way through the work are, just like the students using cheatbots, the folks least qualified to use the bot without producing junk.

The central irony of AI is that it's really only safe to use if you are already an expert in your field. And that's a terrifying thought when you consider that AI has the potential to completely gut the pipeline that would ordinarily produce experts.

Mind you, expertise is not a guarantee of well-used bots. AI repeatedly encourages users to trust its illusory expertise. Last week CNN reported that a top-ranked lawyer at "one of the most prestigious firms on the planet" became the latest in a long string of lawyers tripped up by AI error. He had to send a letter of apology to a judge after submitting a filing loaded with errors-- it took three pages to highlight and correct all of them. The mistakes were caught by opposing counsel. 

All of this underlines one clear idea-- of all the people who shouldn't be using AI, students top the list. Jessica Winter, in her recent New Yorker article, cites a host of experts who point out the many ways that AI is not a useful, appropriate, or even safe tech to include in education. But it is already pushed heavily in all manner of K-12 education.

The Chromebooks, which the students use in every class and for homework, came pre-installed with an all-ages version of Gemini, a suite of A.I. tools. When my daughter, who is in sixth grade, begins writing an essay, she gets a prompt: “Help me write.” If she is starting work on a slide-show presentation, the prompt is “Help me visualize.” She shoos away these interruptions, but they persist: “Help me edit.” “Beautify this slide.” The image generator is there, if she’d ever wish to pull the plug on her imagination.

There are so many reasons to keep AI away from students. At the very least, we should be replacing all the cute little "become an AI expert" lesson plans helpfully provided by AI corporations with lessons about what AI is not and cannot do, and why children should avoid it like they avoid strangers in vans offering them candy.

Winter asks what it will take to push AI out of schools, and the answer, I think, is a whole hell of a lot, because a lot of very powerful people have bet a very large amount of money that they can push AI everywhere, regardless of what harms it will do. It is as if the wealthiest corporations in the world have bought a vast supply of very powerful crack and they now are desperate to move it into any market they can think of.

AI is not for amateurs in any field, and I only grudgingly accept that in some forms, it may have some use for some experts. In education, I think it will be awesome for cranking out lesson plans that administrators demand but don't read and teachers generate but don't use. For anything else, educators had better be prepared to use it like grown-ass experts in their field and not like a 14-year-old trying to generate a term paper ten minutes before it is due. And if using it like an expert in your field turns out to create a process that is longer and less productive than the non-AI version, well, experts should know how to get the job done.

 

Sunday, April 26, 2026

ICYMI: Soccer Edition (4/26)

The Board of Directors is trying soccer this spring and their first match was yesterday, in the rain. They have not yet revealed any special aptitude for the game, but it does involve a lot of running hard up and down a field, and that is their preferred sporty activity. It gets us all outside and moving around while breathing air and touching grass and just generally interacting with real things and other humans, and that seems like rather a huge win. 

We have been a low-screen household since the boys were born. They have no phone, no tablet, little tech at all, and watch only a tiny bit of tv. Most of their screen time happens, as you might guess, at school. I'm at peace with that, for now, because they do need some basic computer literacy to deal with the world, and confining it to school seems like an easy way to put guardrails around it. We'll see if my old district (where the board attends school) will get more restrictive about this stuff.

The hard part of a school's tech policy is parents, so I am hoping that the don't-give-my-kid-a-phone parents will grow in numbers (because if you want your child's school to have a policy restricting smart phone use, you could help by not giving your kid a smart phone).

Here's the reading list for the week. Enjoy it in good health.

School Vouchers Fail the Civil Rights Test. The Federal Program Is No Exception

The 74 invited some folks to write a response to a Derrell Bradford piece plugging the federal voucher program. Jenny Muniz, Nicole Fuller, Ashley Harrington, and Hal Smith replied with this piece that absolutely nails the point contained in this sentence--
“Choice” is a compelling slogan, but with private school vouchers, it’s the school’s choice, not the families.
The Blue State Voucher Express

Jennifer Berkshire notes that Arne Duncan and the usual gang of reform-loving nominally-Democrat privatizers have decided to shill for Donald Trump's federal voucher program. Shame on the lot of them. She writes, "Ten years later, they’re back, armed with another pig and plenty of lipstick."

Public Schools Form Democratic Citizens

Jan Resseger looks at a paper from education and law scholar Derek Black.
 
The America We Choose: Reclaiming the Promise of K–12 Public Education

Greg Wyman examines some of the classic pendulum swings and what the pendulum is doing to public education right now. 

Anti-Property Tax Issue Proponents are either extremely dumb or extremely deceitful

Well, Stephen Dyer is pretty sure they're dumb as rocks, and he uses some colorful language to explain why the guys trying to get rid of Ohio's property tax are absolutely and spectacularly in the wrong.

What’s Behind the Push to Make Schools Adopt the Science of Reading?

Rachael Gabriel is a professor of Literacy Education at U of Connecticut and co-editor-in-chief of The Reading Teacher, so it's likely that she knows what the heck she is talking about, which puts her ahead of so many people pushing the science of reading these days. So go ahead- read one more piece about SoR. This one's at The Progressive.

Privatizers Hijack Indianapolis Public Schools

I did cover this story, but let Shawgi Tell zero in on it from another angle.

Local entrepreneurs cashing in on state funds from homeschool parents

Oh so many ways to cash in on Florida's voucher program.

‘Schools of Hope’ charter operator is moving into 5 Miami-Dade high schools

Speaking of Floridian grift, don't forget Schools of Hope, the program that allows charter schools to just take buildings from the public school system. It was supposed to affect only the low-achieving public schools, but-- surprise!

If It's About Volcanoes, Teach Volcanoes

Lauren Brown offers ideas about favoring content over the vagueness of teaching "reading skills." Not sure I agree with every single word of this, but it's worth thinking about.

Bloodbath at Mark Zuckerberg-backed California school as tech titan and his wife strip funding

Well, there's no actual bloodbath, but this is the New York Post coverage of Episode #1,659,437 of Why Education Should Not Depend Upon The Kindness of Rich Guys.

Tennessee rolls back testing requirements in early voucher program

Look, no school should be held "accountable" via Big Standardized Testing. But Tennessee lawmakers decided that since tests weren't showing voucher schools to be doing better than, or even as well as, public schools, the solution was to just not make voucher schools take the test. So much for accountability via an informed free market.

A school program got millions in welfare linked dollars and now officials want answers

Hats off to Star Academy, a for-profit company run by John Alvendia. They figured out how to run an education scam and a welfare scam simultaneously!


Thomas Ultican talks about the need to avoid AI, and quotes some other folks, including Benjamin Riley, a "uniquely free thinker."

Pivoting Edtech Towards Humanity

Dan Meyer writes about the misalignment between humanity and edtech companies, as well as the misalignments between people who want to teach and people who want to learn.


George Evans reached out to me to say that he had written something I might like to read and by damn he was right. This is a layered essay about the reach of teachers and the ways things come back to us later. I am always happy to find new writers that I hadn't previously found and wish more folks would send me recommendations, self- or otherwise. One thing about the interwebs and the people who write about education on them is that those writers tend to cycle through quickly. Of the people I was reading and sharing with umpty-ump years ago, only a handful are still at it. So I'm always excited to meet new folks.

The Silent Surrender of Moms for Liberty Anchorage

Mathew Beck reports that one more Moms for Liberty chapter has quietly expired, this time in Anchorage, Alaska. Thoughts and prayers.

Why Is Lower Merion School District Ignoring Its Own Technology Policy?

James Horn reports on the Pennsylvania school district that has decided that students may not opt out of screens-- even though they have an opt-out policy.

How to Manufacture Crisis with Line Charts: NAEP Reading Edition

Paul Thomas shows us how to make everyone freak out with a chart (even if the chart isn't really very freak outable). With pictures.

Why are we holding third graders back in school?

Steve Nuzum looks at the problems with third grade retention, a policy that won't go away no matter how many times the problems are demonstrated.

America’s Students Need Great Public Schools for Science!

Wasn't it cool when the astronauts did that astronaut stuff? Nancy Bailey reminds us that those astronauts didn't just fall off an astronaut tree.

Kids don’t use augmented reality like adults, raising concerns for classrooms

Jonathan Kantrowitz reminds us that having small humans use devices designed for grown humans opens the door to all sorts of problems. So maybe let's rethink using those AR headsets with third graders.

This Scammer Used an AI-Generated MAGA Girl to Grift ‘Super Dumb’ Men

Have you heard of Emily Hart, the nurse who loves God, guns, and making "illegals" go away? She's a darling of the MAGA crowd, with a huge online following. And she doesn't exist, but she is helping put through school the 22-year-old med student who created her. From EJ Dickson at Wired.


Noah Hawley writing for The Atlantic tells of his strange encounter with the very rich. And while it's all worth reading, there is this--
It’s not that the wealthy become evil; it’s that their environment stops teaching them the things that nonwealthy people are forced to learn simply by living in a world that pushes back. When you can buy your way out of any mistake, when you can fire anyone who disagrees with you, when your social circle consists entirely of people who need something from you, the basic mechanism by which humans learn that other people are real goes dark.

This week at Forbes.com I looked to Ohio, where one more school board wants folks to understand that hate does, in fact, have a home in their district. And they're getting sued for it.  

Trombones and Danny Elfman-- what else could a person need. 



Subscriptions are free now and forever. Well, probably not forever. But as long as I'm alive and doing this. 

Friday, April 24, 2026

Los Angeles Resolves To Reduce Student Screen Time

It took over a decade, but the Los Angeles Unified School District may finally be getting smart about computer tech in classrooms.

It was way back in 2014 that John Deasy worked up a cozy deal with Pearson and Apple to spend taxpayer dollars on iPads for LAUSD students. Only he got caught striking a sweetheart deal for products so awful, so lacking in promised software, that it lost him his job. Deasy was a graduate of Eli Broad's Fake Grad School of Ed Management, the ultimate "treat schools like a business" training backed by a guy who in 2016 decided to just take over the district and privatize the whole thing.

Then there was that time in 2017 when the district decided they'd drop $80 or $90 million on a bunch of cool software, including a program (I am not making this up) recommended because it was a big hit in Uruguay.

But this school year something happened. Maybe it was the tide shifting nationally, maybe parents had had enough. Maybe it was the fact that LAUSD followed the national trend and completely banned cell phones from classes. But parents started putting pressure on the district.

Actually, I could believe it was the device ban. It took effect in February 2025, and it was one of the toughest in the country-- no personal devices, including not just smart phones but also smart watches. And that ban just made the presence of the iPads stick out. As NBC quoted one parent, who found their child was using the school-issued iPad to watch YouTube and play Fortnite--

“It makes no sense to me,” Byock said. “We’ve banned the cellphones, but it doesn’t matter, because the kids are using the school-issued devices in exactly the same way.”

Exactly. (Also, if you haven't come across this one yet: students figured out years ago that you can "chat" via a shared Google document.)

The district doesn't have a policy in place yet, but they have passed a resolution to get it done. Minimize the screen time, eliminate it entirely for the earliest grades, encourage the use of paper and pen assignments. It states in part:

While access to and developing skills in technology are critical in a digital world, excessive screen time can be associated with vision problems, increased anxiety and depression, addictive behavior, reduced attention span, difficulty managing emotions, lower academic achievement, and weaker cognition, according to the American Academy of Pediatrics. A growing body of research indicates that excessive and unstructured screen use can negatively impact student attention, mental health, and overall wellbeing and can be particularly harmful for younger students. Research indicates that children 8 to 11 years old who exceed screen time recommendations are at higher risk for obesity and depressive symptoms and have scored lower on cognitive assessments.

The ability to stand up to tech companies and computer-related FOMO is coming just in time, just as Silicon Valley is hell bent on trying to convince us that schools should be packed to the rafters with awesome AI crap. It's inevitable! You don't want your students to be left out! 

Well, yes. I kind of do. It is ironic that LAUSD, with its history of boneheaded tech moves, is the first major district to make a conscious attempt to dial back the screens for students. But it would be great if they were the leading edge of a new wave that minimized screen time for the next generation. 


Monday, April 20, 2026

A Big False Assumption About AI In Schools

Over on the dead bird app, you can find the AI Being Dumb account, an invaluable source. It recently highlighted the Andon Labs experiment in letting AI run a store. Two items of note. One is that the AI hired humans to do some of the work (though it didn't tell them when to show up). The other is that folks started using Google Reviews to try to get the store to stock products, like $260,000 worth of paper clips, tungsten metal cubes, barrels of oil, or 413,793 KitKats. 

This highlights one of the assumptions of every discussion about AI tutors and AI paper graders and AIs in place of humans in education. The assumption is that once we replace the human actor with an AI agent, everyone else will keep interacting with the AI agent as if they were still dealing with the human. 

That's a silly assumption, particularly in a school setting. Students do not even treat humans like other humans. Part of September is the annual Testing Of The Classroom Boundaries as well as the annual Mapping Of The Expectations. Students conduct these activities, sometimes augmented by the Existing Reputations of the adult humans, and use the collected data to make their choices for the remainder of the year. All of this testing and mapping is conducted within each student's personal rules for how one treats other human beings.

This is part of the rich web of human relationships that support and enrich education. The AI-in-education crowd seems to think that one can swap out any human node of that web and replace it with a bot and nothing important will change.

For the moment, I don't want to focus on the dehumanizing of a human activity and dynamic. I want to focus on this question-- how will young humans act when they find themselves educationally yoked to a robot instead of a human? Expect a couple of effects.

Erasing ethical boundaries. Most humans operate on the assumption that we owe other humans a good-faith attempt to communicate honestly. Yes, lots of people violate that assumption, but the fact that the boundary exists is why we have a whole language about lies and dishonesty that describes the transgressive nature of not making that good-faith, honest effort. But what do we owe a bot? Is there any reason to make a good-faith honest human effort in responding to or interacting with a non-human bot?

This may seem like esoteric philosophical noodling that young humans would not waste a minute pondering, but I assure you they get it on some level. Why do schools spend so much time hooting and hollering at the onset of Big Standardized Test season, trying to connect the test to students' relationships with their teacher and schools? Because students on some level understand that they don't owe any good-faith honest effort to whatever faceless unknown bureaucrats are behind the BS Test, so schools figure they'd better activate students' connection to teachers and school. "I know you don't owe it to Pearson or education reformsters to give this an honest try, but how about doing it for Mrs. Swellclass and the East Egg Battling Chickens?"

Do you think a student will give the same size and shape of effort to a bot that they would give to their beloved human teacher, or even their sort-of-don't-mind human teacher? Some will decide to see how entertaining bad-faith efforts can be; what kind of baloney will the bot accept? All will figure out how to deal with the bot-generated pressure to create human-crafted AI slop. They may fight back, give in, try to outsmart the bot, but only a few will keep trying to do their best as if they were working for a human.

It is worst for any instruction or assessment that involves writing. Writing is impervious to objective evaluation; everyone who grades writing assignments does so with their own set of biases in place. Another AI falsehood applies here; decades of fiction and years of marketing have primed us to think of robot intelligence as perfectly objective, strictly factual and "true." It is not. It reflects whatever biases are programmed into it (and it has some, deliberately or not). You can barely swap out human for human without changing the definition of "good" writing; you certainly can't swap out human for bot without blowing up the definition entirely.

There are a hundred bad assumptions and built-in problems with AI in education. But we have to include the way proponents ignore the effect AI will have on how folks interact with the school. Parents will not treat your AI slop letter the same way they will treat a human note. Students will not complete assignments for the AI the same way they would for a human. Taking the human out of human interaction matters, and the people who don't admit it are just too busy trying to sell some education-flavored slop.

Sunday, April 19, 2026

ICYMI: Sumter Edition (4/19)

This past week we sailed past the 165th anniversary of the bombardment of Fort Sumter, yet one more example of a piddly thing tipping a country over into major problems. By coincidence, I was reading Erik Larson's The Demon of Unrest, which covers the period between Lincoln's election and the attack on the fort. It's a good read, and like all of his books, does its history homework even as it reads like a novel. There were many striking things to note, not the least of which is once again the degree to which enslavers really thought they were the good guys, and weren't just angry about having their economic system threatened, but were really butthurt about being treated like they were in the wrong. They couldn't have been more upset if some Northerner had called them deplorables. It's also striking that the 19th century outrage machine, primitive though it was, performed the very modern trick of getting the South upset over their certainty that Lincoln, if elected, would outlaw slavery on day one. And of course, the belief that all men are not, in fact, created equal; some are more entitled to power and privilege than others and that's how a proper country should be run. That's a through line for many folks in American history, and the guiding principle of the current regime (along with the related idea that the Betters should not have to take care of the Lessers). 

Good book. Worth a read (particularly if you are someone who has to teach Mary Chesnut diary entries).

Here's the week's list.

Ten Commandments law is ‘distortion,’ ‘appropriation’

Rabbi PJ Schwartz calls out Alabama's new Ten Commandments In The Classroom law as distortion, appropriation, and just not right. (Also, that whole "Judeo-Christian tradition" thing is baloney, too.)

Let’s Pay Teachers Overtime

Nancy Flanagan reflects on the timeless question recently asked-- again-- by EdWeek.

Missouri's Parental Rights Bills: Treating Children as Property

Bruce Lesley examines a Missouri version of the Parents Rights bill that strips children of rights and protections and instead views them as property like a toaster or a couch.

RIP Khanmigo & Edtech Industry Dreams of AI Tutors

Matt Barnum's piece about Sal Khan sure unlocked a lot of feelings. Dan Meyer here comes to bury Khanmigo, not to praise it. Really.

Sal Khan’s Coming for Higher Ed

Is it time to stop piling on Sal Khan? No, it is not, most especially because his newest bad idea is to replace higher education.

‘Whoa, What Are You Doing Here?’: Why This Professor Subs in K-12 Classrooms

EdWeek runs a piece from education professor Nathan Stevenson, who writes about doing what every single education professor ought to do-- get into an actual K-12 classroom.

Highlands County faces the harsh realities of public funding across the state being diverted to religious and private schools

Florida continues to gut its public schools to fund the private, and it is making real problems for the public system. Eileen Kelley reports for WGCU.

Sorry, Stephen Miller: Immigrant kids have a right to an education, too

Raul Reyes at The Hill argues that Miller is wrong and cruel for his crusade against educating immigrant children.

Doctors and education experts who studied AI’s impact on the young call for a 5-year moratorium in schools

A call to just hold our cyber-horses. Including friend of the institute Leonie Haimson.

EdChoice ESA voucher study does not back up its claims about misspending

12News has been digging deep into the Arizona taxpayer-funded voucher system, and choicers have been trying to defend the voucher scheme, but as Joe Dana points out, the EdChoice defense doesn't hold up.

'I feel like we were used': Some Moms for Liberty leaders resign, claiming group’s focus has shifted

Tampa Bay 28 has the completely unsurprising story of how many of the grass roots supporters of Moms for Liberty are feeling as if they were just used for props in a political game. 

Ohio taxpayers directly fund more private than public school districts

Ohio's GOP dreams of being the Florida of the Midwest. Stephen Dyer explains how the education funding system is completely upside down.

An illustrated guide to resisting "AI is inevitable" in education

Ben Riley offers this handy guide. With pictures! Just the thing for the next time some yahoo tells you that AI in schools is inevitable and "here to stay."

Trump Administration Persists in Multi-Pronged Attack on D.E. I. and Civil Rights

Jan Resseger tracks some of the many steps taken by the Trumpers to roll back civil rights protections for everyone except oppressed white males.

Reading the Tea Leaves

Jennifer Berkshire continues to collect the evidence that voters actually want to protect public education from the worst politicians.

AI-Powered Tractor Startup Burns Through a Quarter Billion Dollars, Fires All Employees in Epic Implosion

AI tractors were going to transform agriculture. Instead, they transformed a lot of money into dust. I don't suppose anyone is going to learn a lesson here.

Meanwhile, over at Forbes.com, I looked at a report about Wisconsin's trouble with teacher retention, and a Senate bill intended to undo federal vouchers. 

This song kicks hard, but I think it's extra impressive performed live.



Saturday, April 18, 2026

More Edu-AI-Robot Ideas

While nobody seems ready to jump on board with Melania Trump's silly robot teacher idea, some folks just can't help trying to tweak it a little. The results are a reminder of just how little the AI Bot revolution has to offer schools. 

Over at the Fordham Institute blog, Dale Chu (who calls Trump's idea "dazzling," so you know we're off to a bad start) says it's hard to imagine schools using robot teachers "so long as human teachers are available." But what about some other stuff?

Maybe it would be "plausible" to use humanoid robots for jobs that are hard-to-staff, low-paying or "more transactional in nature." For example, "In most districts, school support staff (e.g., cafeteria workers, bus drivers, custodians, playground monitors) represent a substantial share of school employees."

So, a robot lunch lady? Lunch room monitor? Playground monitor? Good lord, how would that even work? Is it really an improvement for a child to get their lunch from a robot-powered vending machine? And if there is a robot that can pick up the subtle clues that shit is about to go down on the playground or at a cafeteria table, that robot could be put to work on far more sophisticated and important tasks. AI has not yet demonstrated any sort of ability to read the room. 

And bus driver??!! Seriously? Self-driving automobiles are still short on safety and dependability-- how much more complicated is the job of driving a school bus, with its large ungainly body and the need for multiple stops? I suppose we can replace a custodian with a Roomba, but I can't help feeling that, again, the work is a little complicated for a robot.

Plus-- and this matters a lot-- these jobs all represent another level of student support, another layer of adult humans that students get to interact with as they do the daily work of learning how to be fully human in the world.

"To be sure, a robot that can monitor a hallway, supervise a lunchroom, or assist with routine logistics may not be ideal," says Chu in a sentence that should end with "may not actually exist." But his argument is that robots are "arguably better than leaving those functions, understaffed, unsupported, or shifted onto already stretched teachers." Unspoken is the rest of his argument-- that placing inadequate robots in those roles is arguably better than spending the money necessary to attract and retain humans for these jobs. 

The odd thing about this article is that Chu knows this is all wrong, and he knows why:
At the same time, caution is warranted. Schools are social institutions that help shape norms, relationships, and a sense of community. The presence of adults in hallways, cafeterias, and playgrounds contributes to a culture of supervision, care, and belonging that cannot be easily replicated by machines. Replacing too many human roles risks eroding the very fabric that makes them work. 

And then this--

The same logic that makes robots attractive in moments of scarcity can, if applied too broadly, lead to a gradual hollowing out of human institutions. What begins as a practical response to labor shortages can, without discipline, evolve into a default preference for automation, even in contexts where human interaction matters most.

So maybe Chu is just constructing a very subtle straw man so that he can make a point about automating education:

However far the technology advances, the image of a robot instructor entering a classroom captures both the appeal and the limits of the idea. It is easy to imagine machines layered into the routines of schooling, especially where tasks are repetitive and predictable. It is far harder to imagine them displacing the relationships at the center of teaching and learning. The more likely future in education is not one in which robots replace humans, but one in which they remain peripheral tools—useful in discrete ways, but never central to how children are taught or how schools are run.

I'm not so sure it's easy to imagine for people who actually work in schools, but of course that's not the Fordham audience. Maybe Chu is just trying to sneak up on them. Maybe the audience is supposed to be alarmed by the future he posits in the beginning. But I am afraid too many people will only read the first half of this piece when it's the last few paragraphs that include the actual points worth listening to.  

 

Thursday, April 16, 2026

Schools' Unpaid Labor Pains

So this week, I was the asshat.

A local school district shared a digital poster advertising some job openings. I shared it with a glib comment on Facebook, and then other folks piled on. And then the person who made it joined in, and she was hurt. The poster was something she had put together over the weekend, on her own time. And she is someone who has been a professional colleague of my wife, so I know that she's a dedicated educator. So, yeah. I had forgotten Rule #1.

I had also forgotten the rule about being clear, because I never meant to shame her. I meant to shame her district, because while I didn't know who exactly had created the poster, I was pretty sure I knew that it was a person who was not specifically hired to do that work, nor given the time, support or pay to do it. And I was correct. 

But isn't that the dynamic in education way too often? Some educator takes on a task they feel needs to be done. Administration provides no time to do the work, no serious compensation for the work, no support or resources for that work, and then somehow, when the results are less than perfect, everyone, including the person who was trying to do the job, assumes that any critiques of the result should be and are directed at the person who did the work rather than the district that sent them out to try to get it done. The district tapes a pair of paper wings onto an educator's back, shoves her off the top of the building, and then folks gather around to critique her crumpled form. Or--almost worse--give her an award and a heartwarming news profile focusing on her heroic work flapping those paper wings. But nobody asks the district, "Why the hell were you making her try to fly off the building with paper wings??!!"

You can see it in every single discussion of every single school-related issue. 

We have umpty-zillion studies and anecdotes to tell us that the time needed for administrative stuff is a big issue for teachers, a major stressor in the work. We dither and discuss as if the solution is some mystery. Restructure the profession? Or maybe magical AI assistants! But the obvious solution is to provide teachers with the amount of time required to do the amount of work they're given. 

EdWeek's 2022 survey found that the median number of hours per week worked by teachers was 54-ish. There isn't a school in the country that wouldn't grind to a halt if teachers worked to the contract-- in other words, only worked the hours for which they're actually paid.

And there are the other extra unofficial duties that teachers take on. Every building has lead teachers, whether they are titled and paid or not. There are the people in the building who coordinate things like cards for birthdays, get-well wishes, and condolences. There are the people who are the unofficial IT helpers. For years I was The Guy Who Runs Graduation Rehearsal and Ceremony, and I had the job because I inherited it from the last person who had it. It wasn't official or compensated, but there I was because somebody had to do it, and I didn't mind because I loved my school and the traditions connected to it. 

And that's all on top of the extra classroom work itself. Special Ed teachers and their mountains of paperwork. English teachers up till late because they have a stack of papers to grade. The most fundamental hard part of teaching is that you do not have enough paid time to get the job done; to do the job even sort of the way you know it needs to be done, you will have to volunteer hours.

Schools function on all those volunteered hours. It keeps things cheap, and administrators like it because it takes one more flaming possum off their desk. But if you are one of the people providing this unpaid labor, you really need to be motivated by your love for the school because otherwise you'll start to wonder about how unimportant this job must be if nobody is providing resources or support to get it done.

The digital era has provided more new examples. Your district should have an active and professional online and social media presence-- and the district should be paying someone and providing that person with hours in which to do the job. Some staff member shouldn't be left to do it for free during lunchtime or on weekend afternoons. (Odds are that the charters, cyber charters, and voucher-accepting schools you're competing with have hired someone to do the job.)

What I should have posted was something like "Well, this is a good try, but why hasn't the district hired someone to handle this kind of work?" We should be yes-anding these discussions. "Yes, Mrs. McTeachlady did a fine job on that project, and what is the district going to do to provide her with resources, time, and support to do it next time?" God bless the people who try, and shame on the people who let them struggle on their own.

The answer to all of this will always be A) money and B) convenience. "We could do this the right way, but it would be hard, and cost money!" Meanwhile, in the background, we have the usual chorus of folks saying, "Well we spend more and more on education, and yet test scores don't go up, so we should spend no more." I have a response for all these people, but I have already broken Rule #1 once this week, so it will have to wait. 

Wednesday, April 15, 2026

AZ: Charter Shenanigans From Primavera

Meet Damian Creamer. His LinkedIn profile lists him as a "courageous innovator" and "future-ready leader," and if nothing else, he seems to have mastered the innovation of becoming a future-ready profiteer in the charter biz.

Creamer graduated from Brigham Young with a BA in Spanish back in 1995. There's a six-year gap in his LinkedIn CV, but in 2001 he landed on a pair of big ideas, launching Primavera Online School and Strongmind (which for some of us may have unfortunate echoes of classic Homestar Runner). 

Primavera is the cyber charter; Strongmind is the company that provides the actual education stuff. Primavera, owned by Creamer, collects the tuition payments for the students, then pays Strongmind, owned by Creamer (who is apparently the only shareholder), to provide the instructional programs. This is what we call a "related party transaction," and Arizona is a wild west playground for such shenanigans. A 2017 report from the Grand Canyon Institute found that 77% of Arizona's charter schools operate with related party transactions. It's supposed to be "efficient," and if your goal is to efficiently enrich the folks running these operations, well, then, sure. Maybe you imagine a guy like Creamer having tough negotiation sessions with himself ("I'm not going to buy this service from me unless I can offer myself a 20% cut in costs").

A decade after launching his con-joined business children, Creamer did go back to school to the Thunderbird School of Global Management (ASU) and Harvard Business School Executive Education. 

Grand Canyon Institute gave Creamer his own report last year, explaining just how complicated this self-enrichment shell game can become.

Damian Creamer, an entrepreneur, oversees and profits from multiple entities: Primavera online charter school, a.k.a. American Virtual LLC, which operates under the for-profit management group American Virtual, LLC; StrongMind, the software entity that he contracts with; and Verona Learning Partnership, which was built on the nonprofit assets built by Primavera and now has Valor Preparatory Academy charter school under its umbrella. StrongMind even has another LLC in the Philippines.

Creamer also gets to play educational expert. Here he is opining about "learning science" with some great argle bargle like 

At the heart of this is StrongMind Intelligence, a foundational infrastructure layer that Creamer and his team are developing to enable learning systems to operate in accordance with learning science. “StrongMind Intelligence is not a feature. It is not a chatbot,” he explains. “It is the intelligence layer that models the learner continuously and supports real-time adaptation.”

Or this glowing interview on IdeaMensch with the "visionary education entrepreneur" praising how "His work is grounded in the belief that autonomy, competence, and connection are essential psychological nutrients for learning, and that technology—when designed ethically and intentionally—can amplify, not replace, human impact."

In several interviews Creamer talks about how he makes all important decisions by 2 PM. He is just an action guy-- this next bit turns up in more than one Creamer interview:

Ideas are easy. Execution is everything. I bring ideas to life by pressure-testing them early and grounding them in purpose. If an idea doesn’t clearly improve learning, empower people, or move the mission forward, it doesn’t make the cut. Once the “why” is solid, I focus on the simplest possible version that can create real momentum. From there, it’s about getting the right people in the room—product, engineering, UI/UX, learning, marketing, operations—and creating shared ownership. The best ideas get better when they’re challenged.
I like to move quickly, but not recklessly. I believe in shipping, learning, and iterating. Progress beats perfection every time. We launch, we listen, we adjust and we keep moving.

 He is just loaded with tech bro bromides. He likes to stay "ruthless about priorities and intentional about focus." He stays productive "when I'm in flow and designing my environment to support it." His early career failure? "Not trusting myself to make decisions." He screwed up by listening to other people. He likes to listen to Joe Rogan.

And he's right there with the AI revolution. ChatGPT is his "thinking partner," and he's counting on agentic AI to help keep his learning software focused and adaptive. There is so much more, although none of it is about the actual nuts and bolts of education. Which is a real choice, given that cyber charters perform so poorly that even charter fans like the Fordham Institute scold them.

The nuts and bolts of making money, however, are well addressed by Creamer, according to a 2025 report from the Grand Canyon Institute by Dave Wells and Curtis Cardine. Some of their findings are stunning.

Arizona cyber charters get a whopping 85% to 95% of the funding that brick and mortar charters get, and yet cyber charters, including the Arizona variety, get much worse results than brick and mortar schools. Primavera was great at hanging onto money; by 2015, they had accumulated over $45 million in assets. 

At the same time, Primavera was spending millions on curriculum, software and support purchased from Strongmind. Creamer has been passing a lot of money between his left and right hands.

Transitioning from non-profit to for-profit involved all sorts of new entities that appear to be simply Creamer putting on a variety of party masks. In the meantime, Grand Canyon computes that Creamer accumulated over $75 million in profits.

Meanwhile, the investigative team with Craig Harris at 12News found other ways that Creamer has been raking in the money. They found that 78% of all taxpayer dollars that went to Primavera went to "management." Meanwhile, there's a big fat "stockholder equity fund" with around $10 million parked in it-- that fund benefits "exactly one person: Damian Creamer." Since 2017, 12News computes that Creamer has paid himself $24 million. He did, however, manage to give some hefty gifts to Congressman Andy Biggs and State Senate President Warren Peterson, who have been vocal defenders of Primavera. 

But, hey-- if Creamer can actually deliver on all his talk about brilliantly leading to awesome conclusions, maybe he's worth all that money. So, is Primavera accomplishing great educational achievements with its students?

No. No, it is not.

Primavera is in the news these days because State Superintendent Tom Horne just rescued Creamer from the Arizona State Charter Board, which was about to shut Primavera down based on three straight years of a D grade. Creamer started laying folks off in anticipation of the impending charter revocation. Not that Creamer didn't have a Plan B-- Primavera can just switch to a private school and cash in on Arizona's taxpayer-funded school voucher program. They'll even give out free laptops, and no state tests to take, either!

The state board started the process of shutting Primavera down in March of 2025, and they were just about there when Creamer got Horne to back his "administrative error" argument. Which said that the school should have been judged as an "alternative school" rather than a "traditional" one. The school had been rated as an alternative school for many (but not all) years since the designation was created in 2012. Creamer says he made a mistake and didn't catch it because he was busy caring for his wife, and also COVID. 

I am wondering why Arizona has different standards for "alternative schools." Arizona defines such schools as those that "serve specific populations of at-risk students" or a student who "is unable to profit from a traditional school setting" and I guess the idea here is that alternative schools are expected to fail, so we'll lower the bar for them so that their failure is less obvious on paper. But if alternative schools are designed to create success in ways that traditional schools don't, that seems different than just saying "this is a school for students who probably won't succeed."

"We have the students who are already in academic trouble" is the common refrain of cyber charters, and it's a legitimate observation (though arguably less so since COVID drove some traditionally successful students into cyber settings), but that's the business you're in, so shouldn't you be better at it? Primavera has been at this for twenty-five years-- shouldn't they have figured out how to do better than the same failure rate that a traditional public school would experience with these students? Especially as I am led to believe, by cyber charter teachers who would rather not go on the record, that cybers work hard to cook the books for that special smell of success.

Oh well. Horne has retroactively declared Primavera an alternative school for the years its performance was subpar, and now that same performance is hunky dory. The state charter board is pissed, but Creamer is still rich and well-connected. Just another day in Arizona. 

Monday, April 13, 2026

AFT Shares Bad AI Advice

AFT made plenty of folks sad when they decided to jump on the AI bandwagon last year (kind of reminds me of the days they were resolutely on the wrong side of Common Core). They haven't shown any signs of slowing their enthusiasm, with items like this puff piece from the AFT site about "Harnessing the best of AI," which is right up there with "Embracing the advantages of cholera." 

In this piece, three teachers share some of their "tips for saving time and boosting creativity" with AI, and oh boy. 

The three teachers have 23, 6 and 30 years of experience. They teach K, 4th and 5th grade special ed, and math. I'm not going to call them out by name because they are out there trying to do the work, and teachers take enough crap for doing that, anyway. But I do want to highlight some of these highly dubious ideas.

LV regularly uses "ChatGPT to create curriculum and lessons, as well as to differentiate lessons." And while "uses" can mean many things here, I cannot say hard enough that there is no chatbot that knows more about content or instruction than a teacher does. None. Because a chatbot doesn't "know" anything. It doesn't understand the content, doesn't know what would be the best instructional approach, doesn't know anything about bridging information and young human brains. Lesson planning is the perfect time to conceptualize chatbots like this-- if you ask it to write a lesson plan about the Civil War, it processes that as "what would a Civil War lesson plan look like?" It can create something that looks like a sort of average of every lesson plan that it has been trained on, but it knows nothing about good or bad instructional design, and it is making up everything it says about the content (some of it will be accurate, some of it won't, but all of it is made up). 

LV also uses Google's GEM tool, which is basically a tool that lets you set up your own chatbot which only accesses what you have fed it. LV is the math teacher, and I think that matters a lot here. They say the GEM can be "limited" to provide hints for the next step rather than the answer, which strikes me as more functional for solving an equation than exploring themes in Song of Solomon.

CS uses AI for small tasks, including differentiation, building rubrics, and to "refine the wording" on IEPs. And to make substitute plans. Differentiation comes up a lot, and I can see the appeal, but the time it takes to really, precisely prompt the bot strikes me as canceling out the time saved. If you are letting the bot determine what the differentiation should be, that's malpractice.

EL is in a school district that adopted Copilot as its "AI platform," which is its own kind of dopey idea. But this teacher thinks it's cool for creating games and scavenger hunts. And this comment --

As a veteran teacher, it’s easy to teach the same thing over and over again; AI is helping me get outside of my comfort zone and do some different things with the kids.

If you only get outside your comfort zone because some bot has made it super easy, have you really gotten outside of your comfort zone? I try not to be too judgy (just judgy enough) but I really do judge people who can't even work up the ambition to scroll down the search engine page past the often-wrong AI results to see the actual search results. Is Googling for lesson ideas (which was never a great approach) really too hard for some folks now? 

But that's not as alarming as this quote about a colleague who is "not very good with technology" but just "bloomed" with ChatGPT:

He became interested in having it generate passages for his students to read and then started adding topics that they like, such as dinosaurs and Power Rangers. Now these AI passages are a reward in his classroom—when kids complete their work, they can ask for a personalized passage. One child asked for a story about playing soccer with the Argentine star Lionel Messi.

Just stop it. Stop. It. Is he checking every one of these for accuracy? Because I'm betting a teacher who doesn't have the time to hunt down real pieces of writing by live human authors also doesn't have the time to make sure some child isn't getting a "bonus" reading about how dinosaurs used to help cavemen work at Mr. Slate's gravel pit. There are so many bad messages here, leading with the devaluing of human writing. Just stop!

LV uses a GEM to help students edit an end-of-semester project, and that has led to a concern that students "may start to optimize their writing to please AI instead of writing for a human reader." You think? Of course they will. LV has a solution-- "I'm trying to instruct the Gems to give objective, rubric-based feedback without altering the students' voice, tone, or style. I want AI to support their thinking and not reshape their writing." Good luck with that, given that the bot is not well-equipped to identify voice, tone, or style, let alone preserve it.

And if it seems as if writing the instructions for a GEM would be rather intricate and time-consuming, well, can you guess how LV solved that? By having ChatGPT do it. 

All three use AI to communicate with parents, and some of what they have to say is surreal, like EL explaining that when they're tired and frustrated, the bot "helps me send notes to families about behavior challenges that are clear and kind." I don't know what to do with the notion that a human needs a bot to help them be more human, but I do know how I would feel as a parent if I were getting notes from a bot instead of my child's actual teacher (the answer is "pissy"). EL also uses AI to generate activities for families to use at home. 

The "interviewer" does ask the three if they have concerns. EL is concerned that middle and high school students use it too much, to the detriment of their thinking, but she doesn't have those concerns as a kindergarten teacher, and I am wondering if her students see her having AI do parts of her job for her, because that might matter. And she's sure that AI won't replace her, because AI can't hug five-year-olds or meet their social and emotional needs. Sigh. First, if you think that's the only thing AI can't do, you need to rethink what you bring to the job. Second, if that's all you do that AI can't, you can in fact be replaced with an AI augmented by a minimum-wage aide who handles the hugging and social-emotional stuff.

LV correctly notes that AI is actually worse than plagiarism for students because AI can do the whole job without students even glancing at the work. AI can oversimplify and push formulaic patterns. "Students miss out not only on building knowledge but also on developing curiosity and their voice." LV makes students do handwritten assignments twice a month for an "authentic picture."

CS notes concerns about environmental impact of AI and replacing human joy, like art. CS calls these "on a personal note" and I am wondering why that's not a professional note. Fears about student learning are valid "But the more I use it, the more I realize that if educators don’t know how to use it, then we can’t help our students learn to use it responsibly." I am imagining a high school coach explaining, "Yes, I use steroids, because how else can I help my students learn how to use steroids responsibly."

And language like this really concerns me:

My last thought for my fellow educators is that getting started with AI is a lot like having a conversation with a new colleague. You introduce yourself and your goals, and it provides suggestions—sometimes good, sometimes bad. But unlike a colleague, it has no feelings, so I can say plainly that I like one section of a lesson plan but not another. Plus, it works instantly; I can provide a critique and get a revision immediately. The key for me has been treating AI as a partner in the creative and planning process, not a replacement for my judgment.

No no no. AI is not a partner, not a colleague, not a thing you can have a conversation with. It's a tool.

Look, I get the need for finding more time. I really do. One of my most widely-read pieces was about exactly that. But I am suspicious of AI time savings; given the amount of time needed to craft a prompt and run the result through multiple revisions, I'm unconvinced. 

Nor am I convinced that getting into a do-as-I-say-not-as-I-do situation with students will end well. "Today, I had AI generate a lesson plan so that you can learn to not use AI to just do your work," is going to be a hard sell. And this line from CS--

As professionals, we use AI to save time and enhance our work—but we’re still doing the thinking and using a mix of resources. Too many students are using AI to think and do their work for them.

You can tell yourself that, but I'm not so sure. Yes, AI is worse for someone with no background of knowledge at all, but how many teaching muscles are you not using when you use AI to take care of all these various functions? Maybe you're still doing some of the thinking, but you definitely aren't doing as much as when you hammered out lesson plans by yourself. 

There are some worthwhile cautions folded into this AFT puff piece. There are plenty of professional conversations to be had (with other humans) about these issues, but when they come wrapped in big-tech-financed AFT packaging, they aren't a conversation-- they're an advertisement, and an advertisement designed to swoop us right past the whole Should We Do This At All question. I expect better from the second-largest teacher union in the country. 

Sunday, April 12, 2026

Dollars And Cents And AI And Sense

A few weeks back, I wrote a piece for the Bucks County Beacon in which I suggested some questions that you should ask your local district when they start making noises about incorporating AI into your district's schools. But I realized afterwards that I left a big question out, so I'd like to amend that earlier piece right here. 

How committed is your district to paying the actual price?

This hit me in the midst of one of those online conversations in which a journalist tried to explain that she doesn't use AI for, you know, the important parts of the writing, but just for things like research and fact-checking and proofreading and editing. I suggested that this seemed like a bad idea, that AI was not particularly good at any of those things, and then I heard, from many posters, a new counter-argument.

I was thinking of 2023 AI, they told me. The new, advanced super-duper bots are so much better. I needed to get my head out of the old, free bots.

The AI that just anyone can use, they seemed to be admitting, is inadequate. You have to step up to get the good stuff.

Now, I'm inclined to disbelieve assertions that newer, better AI can do human thinky stuff. But let's pretend the newer better AI is really newer and better in ways that matter. 

This is, of course, a well-established computer tech model. You can have the free version, but it's, you know, broken. Here's a cool new app that will only work, sort of, for the 90 minute trial period. Here's a game that is really an ad delivery system. Here's photo editing software with no features. 

Outside the software world, this model is a relatively new invention. It used to be unimaginable that a dealership would sell you a new car with some cool features that stay broken until you pay extra to unlock them. Imagine buying a house and then discovering that none of the doors actually work (unless you hire some carpenters to come in and fix them). 

Or, in a school setting, imagine buying a new set of textbooks, then discovering upon delivery that they are all missing several chapters, which you can purchase from the publisher for an extra fee. 

So here's what you need to know from your district. When the super-cool features that sold your superintendent or tech procurement committee on this AI whiz-bangery in the first place turn out to carry an extra price tag, is your district committed to paying the new fees? When the teachers who are supposed to actually use this AI tool discover that real utility comes with an extra cost, will the district cough up the money? Or is the district's expectation that teachers will somehow make use of a piece of broken software?

Or will wealthy districts get the fully unlocked programs, while poorer districts will have to limp along with the demo model? 

And when this year's model is supplanted by next year's hot new thing, will the district be committing to throwing more money at it? And what other funding will they cut in the district to get the money to feed their new AI habit? Because once FOMO gets in your blood, it's hard to kick the habit, and you can bet that vendors will keep right on warning that schools dare not get left behind by the newest inevitable shiny thing of tomorrow.

So that's the other question to ask your district when they start gazing longingly at AI-- just how far are they willing to go? Do they intend to keep shoveling money into the program, or will they ask teachers to get on the cutting edge with the broken version of last year's already-cooling-off Hot New Thing?

Mind you, that's not the only question to ask (there are more here), but you cannot get a real answer to "What do we expect to actually get out of this, and is it worth the cost" if you don't take an honest look at the cost. Because whatever your district thinks the cost is, it's way more than that. 

ICYMI: Spring Arrives Edition (4/12)

Spring does not officially arrive in Northwest Pennsylvania until we've had at least one snow after Easter, and this year that milestone arrived quickly. So now we're into the days of Spring, when one needs a coat in the morning and shorts in the afternoon and an umbrella and mud shoes all the time. Not my favorite season, but it has its charms. 

Here's some reading for the week. If you do not do so already, consider subscribing to some of these folks. 

Inside the Latest MAGA Attack on Undocumented Children in Public Schools

Josh Cowen takes a look at Stephen Miller and his targeting of immigrant children as a way to punish them and their parents, because Stephen Miller just does not want those brown people around here. What a miserable man.

Old Dog, Old Tricks

Teacher Kate Roberts with a wonderfully eloquent argument for remembering and humanity.

If Astronauts Can Attend Public Schools. . .

Dear Bubbie reminds us about the connections between astronauts and our public schools, and the threats to those schools (particularly in Florida).

DeSantis signs Florida law to label groups as terrorists and expel student supporters

You may remember when Florida tried to declare the Council on American-Islamic Relations and the Muslim Brotherhood terrorist organizations. Then a federal judge told them to knock it off. So now DeSantis and company have passed a new law that lets them label anyone they want a terrorist supporter, and throw students out of the state. The AP has the story.

When a teacher ditched screens, class got harder. That may be why it worked.

Bookmark this piece by Matt Barnum at Chalkbeat. A teacher got rid of his computer assistance, and it made his job harder--but it worked better. Almost like speed, efficiency, and ease are not critical needs for educational achievement.

Schools across America are quietly admitting that screens in classrooms made students worse off and are reversing years of tech-first policies

Marco Quiroz-Gutierrez at Fortune with the story of ed tech regret.

It’s Not about Cheating

Nancy Flanagan explains-- it's not about the cheating, but about the learning.

Primavera Online Charter School avoids shutdown for abysmal grades after State Superintendent Tom Horne steps in for multi-millionaire owner

The fairy tale that free market forces will provide accountability and excellence in the school choice world takes yet another hit. Turns out if you are a billionaire donor in Arizona, you can get an official to run interference for your crappy cyber-charter. Craig Harris at 12News continues to do exceptional work.

The Federal Voucher Program Is a Costly Illusion

Denise Forte at EdTrust explains why the federal voucher program is a snare and a delusion. Share this with your friend who keeps asking about the free federal money.

Legislators Imagine that Teaching the “Success Sequence” in Schools Will Stamp Out Poverty

Some legislators just can't fall out of love with the Success Sequence (aka "if you're poor it's because you made bad choices") and in Ohio, they'd like to make it mandatory teaching in public schools. Jan Resseger explains why that's not such a great idea.

Earlier ADHD diagnosis linked to better education

Not sure there's a big surprise here, but this study from Finland is worth noting. Johnathan Kantrowitz explains what they found.

Robert Sweet’s Early Influence on The Science of Reading

Nancy Bailey with a valuable explainer of one of the early influencers on the "science of reading." Along with a whole bunch of other folks who weren't reading teachers, either.

The Mississippi Reading Reform Multiverse (And Lessons Ignored)

Paul Thomas responds to yet another attempt to lionize the Mississippi not-exactly-a-miracle.

And I Would Have Gotten Away With It Too If It Weren't For Those Pesky Kids

Audrey Watters looks at the Matt Barnum piece about Sal Khan and his failed revolution.

I Don’t Want to Be Teacher of the Year

Matt Brady on why some of the folks doing the best work are not going to be winning the awards.


Thomas Ultican on science, edtech, rich amateurs, and the freedom to teach. 

Scientists invented a fake disease. AI told people it was real

A case study in how swiftly and easily AI can pollute the information ecosystem. Chris Stokel-Walker, writing for Nature. 

This week at Forbes.com, I wrote about a Louisiana court case that ended up okaying a charter school's power to discriminate against students with special needs.

I am not generally a fan of folks showing off their kids on YouTube, but this classic is just so sweet, and the father so centered on the child. They had just watched fireworks, the story goes, which is why she keeps stopping, just in case. And this song was built for ukulele. 



Subscribing to my newsletter is free, and always will be.