Friday, September 12, 2025

Narcissus and AI

In Adam Becker's must-read book about our AI overlords, More Everything Forever, one chapter opens with futurist Ray Kurzweil's plan to resurrect his father. Kurzweil has fifty boxes of his father's possessions, his letters and music. AI will send nanobots to extract DNA from the grave. Nanobots will extract memories from Kurzweil himself. AI will put it all together, and a program will reproduce the father's behavior, even in situations he never encountered in his life.

"Ultimately it will be so realistic it will be like talking to my father," Kurzweil claims. "You can certainly argue that, philosophically, that is not your father, but I can actually make a strong case that it would be more like my father than my father would be, were he to live."

So much yikes. But my first thought was that maybe--maybe--it will seem to you like your father is there, but it certainly won't seem like that to him.

AI "resurrection" is alarmingly commonplace, to the point that it only attracts attention when it crosses a new threshold of eww, as when Jim Acosta interviewed an AI construct of Joaquin Oliver, a student killed almost eight years ago in the Parkland school shooting. The interview happened with parental permission, I guess partly because it helps promote the family's gun control advocacy, but also, as the father said, so he and his wife could hear their son's voice again. Which is different from giving him the chance to speak again.

AI avatars of real people are disturbing. SchoolAI caused a stir by unleashing an AI avatar of Anne Frank, just one of its offerings of zombie historical figures for the classroom. In fact, there are now more outfits offering AI avatars for student use than I can even delve into here. Some are especially terrible; Wisdom of the Ages lets you chat (text only) with some big names of history, and within the first sentence, the Einstein avatar was talking about "he" rather than "I." Their "Adolph Hitler" also lapsed quickly into third person. Humy offers a Hello History app that promises all sorts of "engaging historical simulations" and an "in-depth and personal interaction with the historical figure of your choice." And don't forget the company that offers you the chance to take a writing class taught by a dead author.

There are numerous problems here, not the least of which is simple accuracy. One historian noted that the Anne Frank avatar was reluctant to say anything mean about Nazis. Imagine if PragerU trained its own set of historical avatars, giving students the chance to see and hear a realistic simulacrum of a colonial enslaved person explaining why they actually kind of enjoyed being enslaved. 

Historical simulations are nothing new, from movies to that person who dresses up as Lincoln and visits your third grade class. But those simulations come with a built-in distance. It's just a movie, and nobody thinks that guy with the fake beard is really Lincoln come to life. But AI avatars promise to be easily mistaken for the real thing.

The idea of using AI to resurrect dead loved ones really brings home the inadequacy of this whole exercise. 

The premise of Kurzweil's resurrected father and the Olivers' resurrected son is that the survivors know enough about their lost family members to faithfully and fully reconstruct them. I have my doubts. With a famous historical figure, maybe the many scholars who have pored objectively over that person's life have unearthed enough information that we could reconstruct a fully detailed and nuanced portrait of the person. Maybe, but I doubt it.

But I double doubt that for ordinary people. I've known my parents and my children for a long time. Am I arrogant enough to imagine that I know them so well, so completely, that I could perfectly reconstruct them? 

No, what I know about them is my own impressions, my own feelings, my own memories of my own perceptions of them. But that's as much about me as it is about them. 

There is, of course, a whole industry set up to let you "resurrect" your loved one. It's creepy. And it does not give the departed another chance to talk to you-- it only gives you another chance to talk to them. Except it doesn't really do that because they are not there. The AI does not bring them back; it takes your own memories and impressions and pushes them into a screen.

Chatting with a bot is playing ping-pong with yourself. The software extrudes a probable string of words, but you do all the work of injecting meaning into them.

When you face an AI avatar of a famous person, you are likely facing a mask slipped over someone else's software-expressed agenda, wrapped around an incomplete and shallow imitation of a real human, waiting for you to give that silicon golem meaning. But when you use the technology to create an avatar built out of your own incomplete memories, you are simply talking to yourself. You have not given that person another life; you have only given yourself another way to imagine they are still here.

None of this is the same as talking with another living human who is actively trying to convey meaning and intent to you. In real life, projecting your own ideas into another's words gets in the way of actual communication, of actually reaching to understand. In the world of chatbots, your projection is necessary for the "conversation" to continue; you have to take care of both sides.

Narcissus gives us "narcissism," currently on the list of Top 5 Favorite Amateur Diagnoses. But the story of Narcissus was of a person who sat by a pool of water, gazing at his own reflection and imagining it was another person, until he eventually melted away. We would do better to try to hear and see and understand the live human beings who are still around us than to sit down by the silicon pool, gazing into a reflection that we imagine is another actual human. 
