The Rhetorical Desert of Chatbots
My first job in journalism was writing obituaries for a daily newspaper in Woodbridge, New Jersey. I was a sophomore in college, and I had just landed a summer job at the paper. The experienced reporter who trained me explained that for some people, their obituary would be the only instance of their name ever appearing in the newspaper. We have a duty, she said, to get the facts right. When she was done explaining the job, I felt like I was being entrusted with a sacred responsibility. Even though writing obits was formulaic, as the writer, I would be participating in a solemn ritual that was right at the heart of life-and-death matters.
This story was on my mind the first time I sat down to use ChatGPT. Like many people, I was astonished by what the AI could do, and I immediately began to make a mental list of the various kinds of writing a chatbot might be able to do competently. Writing obits, I thought. Yeah, a chatbot could do that.
Yes, they can, but should they? Vanderbilt University recently apologized for sending out an email about the mass shooting at Michigan State that was written by ChatGPT. The outrage was understandable. Rituals of care around death are among the oldest of human traditions. Many of us would probably be offended, for instance, to learn that a robot had dug the grave for a loved one or that an AI had written the eulogy (or the obituary), and yet we are not far off from these scenarios. The Vanderbilt story illustrates just one of the many ways in which widespread use of chatbots will unintentionally violate societal norms.
With the rise of the chatbots, those of us who care about writing are facing nothing less than an existential moment. What does writing even mean anymore if humans are no longer at the center of it? Do we need a new word to describe the syntactically sophisticated zombie paragraphs that roll out of the Big Black Box that OpenAI has gifted to the world, something other than "writing"? What will this new technology do to the writing professions? To the teaching and learning of writing? To civilization itself? I don't think anyone knows the answers to these questions, but in what could be the tagline for any Big Tech rollout these days, we're about to find out.
The ability to write is entwined with the lived reality of civilization. Writing grew up with civilization, and civilization is arguably made of text, from Sumerian cuneiform tablets used to catalog grain storage to the novels discussed in seventeenth-century Parisian salons to the millions of workaday emails generated every day all over the world. What happens to our society when we unwind human agency from writing? Will this pillar of civilization soften and buckle? Collapse altogether? Or will we barely even notice?
When I think about the interconnections between writing and civilization, my mind goes to my job as a college professor. I work on a new college campus, so there are still people working there who remember its inception. Every building, every course, every academic program and student organization and sports program was birthed in a proposal, and the institution is kept alive through a glorious dialogue of reports, memos, announcements, newsletters, and emails--so many emails. Each of these missives--long or short, carefully crafted or carelessly dashed off--carries with it the intention of the humans who built this school over the past seventeen years. On any given day, the place is humming with written language. Professors typing away on their monographs and (hopefully) peer-reviewed articles. Students clacking away on keyboards as they write a research paper or agonize over how to email their professors to ask for an extension. In the various offices around campus, the work of documenting things is frenzied and never-ending. It is a community of writers.
My campus is just one tiny rock in the vast bureaucratic archipelago that makes up our civilization, and there are certainly many people working within this network of institutions, agencies, and companies whose writing tasks can and will be automated by chatbots, but there are many others for whom writing is literally the vehicle of creativity and decision-making. Decision-makers work with written language at an intimate level. The people who write these emails, reports, and proposals do not need ChatGPT to show them what an email, report, or proposal looks like because they have already mastered these forms and use them in rhetorically sophisticated ways.
As a professional writing teacher, I shudder at the thought that people will be asking chatbots to provide the templates for the documents they write. I have seen some of the proposals that ChatGPT produces when prompted. They are dull, generic pantomimes at best, organized to look like proposals but lacking any of the most important elements. An excellent proposal is not a template-able thing; it is a rhetorical act, an artful confluence of research and rhetorical strategy and painstakingly written sentences, all assembled with a very specific audience and purpose in mind. The chatbots can't touch this kind of writing, but they can spit out slick simulations of it.
The best workplace communication is highly rhetorical, which means that it is done with a keen awareness of audience, situation, and genre. The current iterations of AI chatbots are relatively good at simulating genre, in a mechanical sense of the word, but they lack the ability to assess audience. They cannot "read the room" or size up the complex political or interpersonal dynamics underlying projects. They are trained on large data sets, but they are utterly lacking in institutional memory or prior experience. A human prompter can certainly suggest an audience, but it will always be generic rather than specific. A chatbot can handle "Write a proposal for decreasing carbon emissions by increasing solar panel production," for instance, but if you ask it to write your particular proposal, it won't be able to do it.
I am chairing a hiring committee right now, and I've already sent out dozens of emails to candidates, hiring committee members, faculty, and administrators as part of my role. It is difficult for me to imagine how ChatGPT could have assisted me in this process. Chatbot emails are bland, generic, and impersonal, each one utterly lacking in humor, style, or even a passing understanding of anything that came before or after it. I've had to solve a dozen or more problems already and did so through email. I'm able to do this because I have decades of experience as a writer (another thing the chatbot lacks: experience). For me, the chatbot is not a potential helpmate, because while it does write faster than I do, it cannot do any of the other things required of a human charged with chairing a hiring committee.
And most important: it lacks the human touch. The chatbot doesn't care if my emails are carefully worded for a particular audience. It possesses no distinctive style or flair. Any sentence or phrase approximating charm or caring or sympathy is aggregated from the many thousands of emails that it has digested. The ability to use email for a particular purpose--to be rhetorical--is beyond its programming.
Steven Pinker has observed that humans did not evolve to know how to write, which makes writing a challenging activity, but we can learn how to do it and even master it. I would add something else: by learning how to write, we are learning dozens of other skills as well. The art of persuasion. The discipline of rational thought that comes packaged in the act of composing an essay or writing a set of instructions or puzzling over how to compare and contrast two different things. The vicissitudes and delicacy of humor. The author of cuneiform tablets had to consider how to organize the data about how much grain was in the city's silos, and by doing so, he created a rational organizational system in his mind. The Parisian salon-dwellers grappled with the ways in which a storyteller could capture the city's rich heteroglossia in long, beautiful passages of descriptive prose, thereby inventing a language to describe culture itself. The modern email-writing bureaucrat struggles to write a paragraph that adequately conveys urgency without sounding threatening or angry, learning as she does the art of nuance.
What happens when a generation of young people learn how to be "prompt engineers" rather than writers, adept at feeding functional questions and prompts into the Black Box but not intelligent enough to formulate any of the really big or important questions themselves? It is possible that chatbots will create a new intellectual class divide in American society. At the top will be the young people who attended private schools and prestigious colleges and universities where they were taught how to write despite the ubiquity of chatbots, for the same reason that many Silicon Valley entrepreneurs send their children to schools with no screens in the classroom. They will be learning how to think, pose interesting questions, and solve problems. At the bottom will be the mass of credentialed but otherwise inferior thinkers who were trained to use chatbots to write everything for them--an army of dullards who know how to fill in the bracketed spaces in documents but don't know how to write the documents themselves--a servant class of chatbot aficionados.