When the company OpenAI launched its new artificial intelligence program, ChatGPT, in late 2022, educators began to worry. ChatGPT could generate text that seemed like a human wrote it. How could teachers detect whether students were using language generated by an AI chatbot to cheat on a writing assignment?
As a linguist who studies the effects of technology on how people read, write and think, I believe there are other, equally pressing concerns besides cheating. These include whether AI, more generally, threatens student writing skills, the value of writing as a process, and the importance of seeing writing as a vehicle for thinking.
As part of the research for my new book on the effects of artificial intelligence on human writing, I surveyed young adults in the U.S. and Europe about a host of issues related to those effects. They reported a litany of concerns about how AI tools can undermine what they do as writers. However, as I note in my book, these concerns have been a long time in the making.
Users see negative effects
Tools like ChatGPT are only the latest in a progression of AI programs for editing or generating text. In fact, the potential for AI undermining both writing skills and motivation to do your own composing has been decades in the making.
Spellcheck and now sophisticated grammar and style programs like Grammarly and Microsoft Editor are among the most widely known AI-driven editing tools. Besides correcting spelling and punctuation, they flag grammar issues and offer alternative wording.
AI text-generation developments have included autocomplete for online searches and predictive texting. Enter “Was Rome” into a Google search and you’re given a list of choices like “Was Rome built in a day.” Type “ple” into a text message and you’re offered “please” and “plenty.” These tools inject themselves into our writing endeavors without being invited, incessantly asking us to follow their suggestions.
Young adults in my surveys appreciated AI assistance with spelling and word completion, but they also spoke of negative effects. One survey participant said that “At some point, if you depend on a predictive text [program], you’re going to lose your spelling abilities.” Another observed that “Spellcheck and AI software … can … be used by people who want to take an easier way out.”
One respondent mentioned laziness when relying on predictive texting: “It’s OK when I am feeling particularly lazy.”
Personal expression diminished
AI tools can also affect a person’s writing voice. One person in my survey said that with predictive texting, “[I] don’t feel I wrote it.”
A high school student in Britain echoed the same concern about individual writing style when describing Grammarly: “Grammarly can remove students’ artistic voice. … Rather than using their own unique style when writing, Grammarly can strip that away from students by suggesting severe changes to their work.”
In a similar vein, Evan Selinger, a philosopher, worried that predictive texting reduces the power of writing as a form of mental activity and personal expression.
“[B]y encouraging us not to think too deeply about our words, predictive technology may subtly change how we interact with each other,” Selinger wrote. “[W]e give others more algorithm and less of ourselves. … [A]utomation … can stop us thinking.”
In literate societies, writing has long been recognized as a way to help people think. Many people have quoted author Flannery O’Connor’s comment that “I write because I don’t know what I think until I read what I say.” A host of other accomplished writers, from William Faulkner to Joan Didion, have also voiced this sentiment. If AI text generation does our writing for us, we diminish opportunities to think out problems for ourselves.
One eerie consequence of using programs like ChatGPT to generate language is that the text is grammatically perfect. A finished product. It turns out that lack of errors is a sign that AI, not a human, probably wrote the words, since even accomplished writers and editors make mistakes. Human writing is a process. We question what we originally wrote, we rewrite, or sometimes start over entirely.
Challenges in schools
When undertaking school writing assignments, ideally there is ongoing dialogue between teacher and student: Discuss what the student wants to write about. Share and comment on initial drafts. Then it’s time for the student to rethink and revise. But this practice often doesn’t happen. Most teachers don’t have time to fill a collaborative editorial – and educational – role. Moreover, they might lack interest or the necessary skills, or both.
Conscientious students sometimes undertake aspects of the process themselves – as professional authors typically do. But the temptation to lean on editing and text generation tools like Grammarly and ChatGPT makes it all too easy for people to substitute ready-made technology results for opportunities to think and learn.
Educators are brainstorming how to make good use of AI writing technology. Some point to AI’s potential to kick-start thinking or to serve as a collaborator. Before the appearance of ChatGPT, an earlier version of the same underlying program, GPT-3, was licensed by commercial ventures such as Sudowrite. Users can enter a phrase or sentence and then ask the software to fill in more words, potentially stimulating the human writer’s creative juices.
A fading sense of ownership
Yet there’s a slippery slope between collaboration and encroachment. Writer Jennifer Lepp admits that as she increasingly relied on Sudowrite, the resulting text “didn’t feel like mine anymore. It was very uncomfortable to look back over what I wrote and not really feel connected to the words or ideas.”
Students are even less likely than seasoned writers to recognize where to draw the line between a writing assist and letting an AI text generator take over their content and style.
As the technology becomes more powerful and pervasive, I expect schools will strive to teach students about generative AI’s pros and cons. However, the lure of efficiency can make it hard to resist relying on AI to polish a writing assignment or do much of the writing for you. Spellcheck, grammar check and autocomplete programs have already paved the way.
Writing as a human process
I asked ChatGPT whether it was a threat to humans’ motivation to write. The bot’s response:
“There will always be a demand for creative, original content that requires the unique perspective and insight of a human writer.”
It continued: “[W]riting serves many purposes beyond just the creation of content, such as self-expression, communication, and personal growth, which can continue to motivate people to write even if certain types of writing can be automated.”
I was heartened to find the program seemingly acknowledged its own limitations.
My hope is that educators and students will as well. The purpose of making writing assignments must be more than submitting work for a grade. Crafting written work should be a journey, not just a destination.
Written by Naomi S. Baron, Professor of Linguistics Emerita, American University (The Conversation)