Tuesday, July 2, 2024

AI and Disengaging Reality

I'm fully aware that I'm going to sound like an old fart here, but I think much of what AI is promising to deliver is not just an advance in technology, but a lurch in the wrong direction, fundamentally bad in ways that all other technology up to this point was not.


For virtually all of human history, technology has helped us extend our reach, sometimes in incredibly powerful ways. Reading and writing allowed us to listen to the insights and ideas of people separated from us by time and space. The printed word multiplied that miracle a thousandfold.

Centuries of advance in various media have extended our human reach. Just think of the ocean.

When my students would complain about the verbosity of authors from an earlier time, I would point out the limitations of the age. Back then, I'd say, the only way to see the ocean was to physically travel to the ocean and look at it with your own eyeballs. Maybe you could see a painting of it, though you could only see the painting if you were physically standing in front of it. So an author who wanted to evoke the ocean would have to do a lot of work to create that picture in the minds of readers who really didn't have much on which to draw. 

Then there came photographs. Then print, so that you could see copies of photographs and paintings. Then movies. Then photographs and movies in color. Then television. Then color television, on which to watch the movies of the ocean, or maybe even live broadcasts from the ocean itself. Then the internet and the capability to share depictions or even live feeds of the ocean on a device you carry in your pocket, any time you wish.

The march of media technology has brought the ocean closer and closer to every human being.

Ditto for areas of human knowledge. Go back far enough and you're in an era in which the only way to learn some piece of information is by talking to a person who already knows it. Then writing made it possible for many people to get that piece of information, even if the person who originally possessed it has since died. The printing press meant an incalculably large number of persons could reach out and grab that information, and digital technology increased the number exponentially.

Even the internet, for all the demonization and pearl-clutching about Kids These Days tied to their screens, has made it easier to connect with other human beings. Left-handed basket-weaving aficionados can now find each other and share ideas. My daughter and her family are on the other side of the country, and while they can only travel back here a couple of times a year, my grandchildren are growing up knowing my face and voice.

The long march of media technology is toward increasing engagement, making it easier and easier to find and grasp and grapple with ideas and people and the world.

Media tech has steadily built bridges between individual human beings and the larger world around them. But AI promises something else.

AI builds a wall around the world, then sits on top of the wall and promises to tell us what it sees, more or less, kind of, with maybe some extra made-up stuff thrown in.

This is not always a bad thing. If I want to know how many sheep are in my yard, I could go out and count them myself, or I could ask software to count them then report the number back to me. Useful.

But other wall-building work is more troubling. I deeply love that for any question that occurs to me, I can google a variety of sources, look through them, and learn the answers for myself. Or I could skip actually engaging with the sources and just let the terrible AI from Google (or Microsoft or whoever) give me a quick summary of whatever it has scraped off the interwebs, more or less, with the right, the wrong, and the fictional all dumped into one big stew together.

Or take the ad that promises I can use AI to write up the notes from the meeting. I don't really need to pay attention. In fact, I don't even need to check in at all. Just let the AI monitor the meeting and then get back to me with the notes it compiles. No need for me to engage on my own.

Or the many AI applications that boil down to "let AI deal with these persons so you don't have to" (or don't have to pay money to hire a human to do it). Los Angeles public schools paid $6 million to have a chatbot talk to students who needed academic and mental health help, building a wall around those students instead of a bridge between them and another helpful human. The company just tanked.

In the classroom, I can skip reaching out to engage with the research and materials about the topic I want to teach. Just have AI look at the stuff and tell me what it found, mostly, kind of.

Or the ultimate AI disengagement-- an AI writes the assigned essay for a class, and then an AI assesses the essay that the other AI manufactured, and no actual living humans engage with anything at all. 

AI threatens to foster a misunderstanding of what research and critical thinking are for. These mutated descendants of Clippy are predicated on the notion that the point is to look for a single answer which one then pours into one's noggin. Research should involve searching, collecting, evaluating, processing, and fitting together the bits of information, a process by which the researcher both fine-tunes the results and sharpens and deepens their own understanding. Students have forever attempted to short-circuit that process ("Can't I just find the right answer and hand it in without all this mucking about?"). AI makes that short-circuiting simpler.

Tech and media have made it progressively easier to engage with the world; AI is a big bold step toward disengaging. AI tells humans, "Don't get up. I'll go look for you, and you just sit there and I'll bring you something." AI is not just a plagiarism engine, but a disengagement engine. A tool that moves its users away from the world instead of toward it, and there is nothing desirable about that.

Yes, maybe I am just a cranky old fart (okay, not "maybe"), and perhaps there are ways that AI can be used to build bridges instead of walls. But my gut-level aversion to AI (and I have indeed played with it) is about this retrograde drift, this movement away from the world, the promise to build walls instead of bridges, the whole "You just stay on the couch and I will pretend to engage with reality for you" of it. I will yell at my own clouds myself, thank you.
