Saturday, September 24, 2022

Another Plug For The Robot Teaching Future

"Can robots fill the teacher shortage?" This is what some corners of the industry are thinking. It's dumb, but it's out there.

That dumb question headline ran in Digital Future Daily, a "newsletter" under the Politico banner that is "presented by" CTIA-The Wireless Association, an industry group that "represents the U.S. wireless communications industry and the companies throughout the mobile ecosystem." The piece is written by Ryan Heath, a Politico staffer, so we are somewhere in the grey mystery area of newsvertisement (which we're in far more often than we realize, but that's a topic for another day).

The piece is essentially an interview with Cynthia Breazeal, who recently became dean for digital learning at MIT, where she's been with the Media Lab since getting her PhD there about twenty years ago. Breazeal has a long-time interest in personal aid robots.

She's also an entrepreneur and co-founder of Jibo. Jibo started out as an Indiegogo project in 2014. The first home social robot was supposed to release in 2015, then 2016; it finally shipped in 2017. Even though Time profiled it as one of the best inventions of the year, things did not go well. It was less powerful and more expensive than Alexa or Google Home, and the software kit for developers was never released. Jibo was released in November, and in December layoffs began at the company. Wired wrote a thorough eulogy for the little robot in March of 2019.

In 2019 the disruption division of NTT (Nippon Telegraph and Telephone) bought up the scraps of the failed company and product and has been "working on preparing jibo for an enterprise scenario mainly focusing on healthcare and education," but only as a business-to-business product. They proclaim that Jibo provides the "perfect combination of intelligence, character, and soul." Sure.

None of this comes up in the Politico piece, which presents Breazeal as an academic and "evangelist" for AI who has come to the UN to pitch educational robots. In particular, the piece notes the international needs of students dealing with covid, students with special needs and "displaced children" (aka refugees). This is the second time in a week I've come across someone plugging an ed tech solution for these children, and I get the interest in what is, unfortunately, an emerging market--when you have students who have been ripped away from their home country, how do you give them some continuity of education while they live in a place where their usual educational system does not reach?

The UN expects a global teacher shortage of 70 million by the end of the decade. But folks have concerns about AI and "socially assistive robots." Given AI's previously demonstrated ability to turn racist on top of the fact that these constructs are not actual humans, those concerns seem appropriate. 

How do you test AI with children? “We actually teach the teacher and the parents enough about AI, that it's not this scary thing,” Breazeal said of plans for a pilot project in pro-refugee Clarkston, Georgia — the “Ellis Island of the south.”

“We want to be super clear on what the role is of the robot versus the community, of which this robot is a part of. That's part of the ethical design thinking,” Breazeal continued, “we don't want to have the robot overstep its responsibilities. All of our data that we collect is protected and encrypted.”

How do parents and teachers react to the role of a robot in their children’s lives? “It's not about replacing people at all, but augmenting human networks,” Breazeal said. “This is not about a robot tutor, where teachers feel like competing against the robot.”

Breazeal said the children she’s studied are “not confusing these robots with a dog or a person, they see it as its own kind of entity,” almost “like a Disney sidekick that plays games with you, as a peer.”

How, exactly, does a robot enhance human networks without replacing humans? If the robot is not an actual teacher or even a tutor, then exactly what role is it filling, and how, exactly, is it filling it?

Granted, Breazeal is at the mercy of Heath's editing of their conversation, but it's striking how much of her pitch focuses on what the robots won't do, rather than things like, say, what they are actually capable of doing, or how she plans to solve the issue of what teaching materials the robot can manage and where those materials will come from. And before I let Breazeal handle any of this sort of thing, I'd want to hear what she learned in her previous attempt to launch such a business--the attempt that failed, and about which she has apparently remained pretty quiet.

Like the vast majority of ed tech stories, this one is aspirational--dreaming as marketing, prognostication as advertising. It sure would be great if more ed tech was about things we can actually do to help educate children and less about fantasies about products we might be able to move, someday, if we can just convince people they're inevitable. 

2 comments:

  1. People are communal animals, pack animals. They have been that for hundreds of thousands of years. Learning from other people is part of their DNA, and this jerk from MIT appears to be unaware of that fact. People learn from other people much better than they do from machines.