John Warner, whose book Why They Can't Write should be required reading for all teachers of writing, just took a look at GPT-3, one of the new generation of algorithms being sold as artificial intelligence capable of writing. GPT-3's capabilities are perhaps being oversold, and it has exhibited some of the usual problems with generating gibberish and "learning" to say terrible things, plus a tendency, like many such algorithms, to get stranded in a linguistic uncanny valley.
Warner was surprised by the discovery that GPT-3 does not know grammar, or any of the rules about how words can be put together meaningfully. I suspect many people would be surprised to learn this. But it's really, really important when talking about computer algorithms for handling language to grasp that computers do not "understand" language in any regular sense of the word.
The best metaphor I've seen for this kind of program is a weather forecasting model (I'm going to grossly oversimplify here--people who are knowledgeable about weather forecasting can take me to task in the comments). Think of weather forecasting as data crunching. When your infallible weather app tells you that there's a 50% chance of rain, what it means is that of all the past occasions when these same weather conditions were in place, half of them produced rain. To do this kind of predicting, you need two things--a big bank of data of all the weather conditions from the past, and a good model for picking out which pieces of that data are important.
That's what language generating algorithms do. The algorithm crunches a whole lot of language "conditions" from the past--examples (sentences) scraped from a variety of sources--and uses them to make predictions about what words could go together now. Or, as Warner aptly sums it up:

As GPT-3 is “composing,” it is not referencing a vast knowledge of rules for grammatical expression. It’s simply asking, based on the word that it just used, what’s a good word to use next.
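For the curious, here's a toy sketch of that "what's a good word next?" idea--a simple bigram word-counter, which is nothing like GPT-3's actual neural-network architecture (that looks at far more context), but it shows the basic move: tally what followed what in past examples, then pick from the tallies, with no meaning involved anywhere.

```python
import random
from collections import defaultdict

# A tiny "bank of data"--three example sentences standing in for
# the billions of scraped sentences a real system would use.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat chased the dog",
]

# For each word, tally every word that ever followed it.
following = defaultdict(list)
for sentence in corpus:
    words = sentence.split()
    for current, nxt in zip(words, words[1:]):
        following[current].append(nxt)

def next_word(word):
    """Pick a plausible next word purely by frequency--no grammar, no meaning."""
    options = following.get(word)
    return random.choice(options) if options else None

# "Compose" by repeatedly asking: based on the word I just used,
# what's a good word to use next?
word, out = "the", ["the"]
for _ in range(5):
    word = next_word(word)
    if word is None:
        break
    out.append(word)
print(" ".join(out))
```

Run it a few times and you get different, mostly grammatical-looking strings--not because the program knows grammar, but because the examples it counted were grammatical.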
As Warner also notes, this has some parallels to how human beings write (raise your hand if you start composing a piece of writing by thinking about grammar rules). We do compose by stringing words or phrases together. But while computer algorithms compile their data, they are missing one major data point that humans include--the meaning or sense-making component of the language. These algorithms are being "taught" to write in different styles, a feat accomplished by simply tweaking the bank of data examples that are used.
Warner finishes his piece with a question:

I’m thinking it might be fun to ask GPT-3 to write on a prompt to compare and contrast how GPT-3 and Joan Didion employ grammar in their writing, based on Didion’s famous quote, “Grammar is a piano I play by ear, since I seem to have been out of school the year the rules were mentioned. All I know about grammar is its infinite power. To shift the structure of a sentence alters the meaning of that sentence as definitely and inflexibly as the position of a camera alters the meaning of the object photographed.”
I wonder what it would say?
Jarvis is a commercial program for churning out ad copy, blog posts, essays, but mostly ad copy. Jarvis (via a marketing email that it probably wrote itself) claims 50,000 clients, which would make the program pretty profitable, because the economy version runs $49/month. Jarvis has only been on the market for about a year; you can read more about it here.
Jarvis offers free five-day trials, and unlike many such dodges, it's not hard to unsubscribe from the service, so I hopped on and gave Jarvis a shot at Warner's topic.
The basic version of Jarvis lets you write a topic and then lets Jarvis go to town, though it requires you to chime in with a contribution here and there. In the following examples, my contributions will be in bold.
To start with, I swapped GPT-3 out for Jarvis and asked:
Compare and contrast how Jarvis and Joan Didion employ grammar in their writing.
Here's what I got back:

Joan Didion and James Jarvis both use grammar in their writing, but they employ it differently. Joan Didion uses grammar to create short, declarative sentences that pack a lot of meaning into a small amount of space. On the other hand, James Jarvis uses grammar to create long, complex sentences that often meander and take a long time to end. Didion uses grammar as a piano she plays by ear; Jarvis uses grammar as sheet music he meticulously follows.
Both writers use grammar to create their own unique effects in their writing. Joan Didion's short, declarative sentences often have a staccato quality that creates a sense of unease or tension. This can be seen in her essay "Slouching Towards Bethlehem," where she actually wrote a book, not an essay. The effect is achieved by the choppy, staccato quality of her sentences:
In Eureka, California, in the late fifties, I worked for a while as a waitress in a coffee shop called Roy's. It was on Fourth Street then; it is now on Broadway and is called Roy's Restaurant.
First, as near as I can tell, James Jarvis is not some obscure author who had slipped my memory. I guess the algorithm found that Jarvis is most likely to be a last name, and James is the first name most likely to go with it. This highlights one of the problems with these algorithms--when you don't grok meaning, it's very hard to maintain any level of accuracy. Is Didion known for her "short, declarative sentences"? I'm not so sure. "Slouching Towards Bethlehem" is the title of an essay by Didion, but it's also the title of a book collection of essays, so I tried nudging Jarvis, but it was unmoved. That last graph is supposed to be a Didion quote; I cannot find it anywhere on the interwebz at all, and I'm not enough of a Didion scholar to know.
So next I returned the prompt to its original form:
Compare and contrast how GPT-3 and Joan Didion employ grammar in their writing.
That yielded this:

Joan Didion employs grammar in a very straightforward way. She uses simple, declarative sentences and her syntax is usually very clear and easy to follow. In contrast, GPT-3 often employs more complex sentence structures and its grammar can be more difficult to parse. However, both writers use grammar to effectively communicate their ideas. Even though GPT-3 is computer software and not actually a writer. It is still able to generate writing that is clear and concise. This demonstrates the power of grammar in conveying meaning.
didion's Joan Didion employs grammar in a very straightforward way. She uses simple, declarative sentences and her syntax is usually very clear and easy to follow. In contrast, GPT-3 often employs more
At that point I was supposed to take over. Jarvis often leaves you hanging mid-sentence (curiously, it also has definite ideas about where it is willing to take over or not). The random possessive at the beginning of the second graph is a mystery.
As Warner and others have noted elsewhere, these algorithms have become able to produce work that sounds like it came from a human, but that human is one who is bored, uninspired, and focused on cranking out enough topic-adjacent word salad to fulfill the assignment.
Next, I offered Jarvis the full version of Warner's prompt:
Compare and contrast how GPT-3 and Joan Didion employ grammar in their writing, based on Didion’s famous quote, “Grammar is a piano I play by ear, since I seem to have been out of school the year the rules were mentioned. All I know about grammar is its infinite power. To shift the structure of a sentence alters the meaning of that sentence as definitely and inflexibly as the position of a camera alters the meaning of the object photographed.”
Here we go:

Joan Didion says, "Grammar is a piano I play by ear, since I seem to have been out of school the year the rules were mentioned. All I know about grammar is its infinite power. To shift the structure of a sentence alters the meaning of that sentence as definitely and inflexibly as the position of a camera alters the meaning of the object photographed."
GPT-3, on the other hand, says: "The rules of grammar are there for a reason: to ensure that communication is clear and unambiguous. To break those rules is to invite confusion and misunderstanding."
So, we can see that Joan Didion has a more relaxed attitude towards grammar, whereas GPT-3 is less relaxed, probably because it is a machine. However, both recognise the importance of grammar in communication. Playing a piano by ear, however, is something that only humans can do; GPT-3 may be able to follow the rules of grammar perfectly, but it lacks the creativity that Joan Didion has. Playing trombone by ear, however, would be harder for GPT-3. Even if it could follow the rules of grammar perfectly, the nuances of music are something that machines struggle with.
So, in conclusion, we can see that while GPT-3 is very good at following the rules of grammar, Joan Didion is better at playing by ear and picking up on the nuances of
Finally, just for giggles, I asked it to write about John Warner and computerized writing.
John Warner's approach to computerized writing is all about making the process as simple and streamlined as possible. He believes that the best way to do this is to focus on the content, rather than the form. This means that he pays attention to things like structure, organization, and clarity, rather than worrying about things like grammar and style.
His work focuses on certain critical aspects of the writing process, such as understanding your audience, developing a clear purpose, and organizing your thoughts. By taking these things into account, he believes that you can produce much better writing, regardless of the medium.
One of the main goals of Warner's approach is to help users write more effectively for their specific audiences. To do this he recommends that writers should do the following steps, as outlined in his book "The Elements of Effective Writing":
1. Know your audience
2. Develop a clear purpose
3. Organize your thoughts
4. Support your ideas with evidence
5. Use appropriate language
6. Edit and proofread your work
While all of these steps are important, Warner emphasizes the
To all the attributes listed for the imaginary human student who could have written this, we should add ballsiness. I'm not aware that Warner ever wrote a book entitled "The Elements of Effective Writing," or that it would have included these highly generic instructions. This is uncanny valley stuff, where it seems to mean something until you start looking. He doesn't pay attention to form, so he values structure? Those six steps are about writing for a specific audience?
GPT-3 likely has a much bigger sample base than Jarvis's, and perhaps a more sophisticated model for crunching it, so it might do a better job with these assignments.