Are you in a writing critique group? I run one, I'm in a second, and I’m curious to learn how others handle their members submitting AI-generated content. As of this writing, I have no idea how to create AI-generated content. I’m uncharacteristically not curious about learning how. Uncharacteristic, because normally new tech makes my inner nerd all giddy. This tech, when applied to writing at least, makes my inner ethicist want to vomit. I am trying to be balanced. I am often wrong about things. I need to learn more, clearly.
If you’ve seen the size of my journals, you’ll know that one of my disabilities is that I barf out great volumes of writing. I never have writer’s block. This creates another problem: removing big chunks of text. Also, listening to my beta readers wail about how much content I’ve assigned them to read.
You can always find something to write about. Even if your writing isn't any good! Just let 'er rip. Did you have dreams last night? Challenge yourself to write them down. That's a start. Then, keep going. (Link: Are Your Dreams an Overlooked Resource?)
I write at least 2K words a day just journaling. Each of these represents one year of my journaling.
Don't hate me because I don't suffer from writer's block. Pity me for all the editing I need to do after the fact.
My gut feeling about allowing AI writing into writing critique groups is a Hard No. So many reasons. The anger I feel toward people using a tool to write for them reminds me of my anger over students cheating. I was a college student for 14 years, and I’ve taught college students for longer than that. In both roles, when I learned there was cheating in a class, I felt sick.
When I caught some of my students cheating, they usually gave the excuse that they had no time to do the work themselves. Welcome to the club, I thought. Lots of people have no time. What makes you special?
At least two reasons not to submit AI-generated content spring to mind.
First, in addressing someone using AI to write, I would ask: have you no pride?
You are unique in all the universe. You have thoughts that only you can convey. And you want to outsource that? Doing so is disrespectful to your own extraordinary uniqueness. Also, how can you trust the material AI is creating?
Second, when it came to cheating in my science classes, the idea that some hard-working student’s work was being stolen by a lazy thief caused me outrage. In the case of AI, the stealing is indirect: the scraping of our digital identities.
For the past seven years, my husband Tim has worked on programming an end-to-end encrypted system that prevents any shared material from being scraped by AI. Tim once had some of his programming appropriated; when I met him 20 years ago, the poor guy was suing Microsoft over it. It was a long and expensive battle that ended in a draw. We know how it feels to have your work stolen, and we feel passionate about protecting original content.
My aversion to AI seems centered on using it to generate creative content: art, music, writing. AI is not all bad. AI is driving scientific breakthroughs. One of the biggest and most underappreciated breakthroughs of the decade is the AI-driven AlphaFold 2. I’m staggered by what AlphaFold 2 has accomplished. The developers certainly deserve the 2023 Lasker Award, and I wouldn't be surprised if they got a Nobel after that. Here's a little bit about why:
I can remember in grad school, in the 1990s, learning the painstaking process of elucidating protein structure. I remember thinking, wow, using X-ray crystallography or nuclear magnetic resonance spectroscopy (I did the latter for small molecules, not proteins) is so much labor, and it might still give the wrong result. (This is because some proteins, like transmembrane proteins, exist in environments that can be hard to duplicate under experimental conditions.)
Then, in the past few years, I started reading about how the shapes of countless proteins are being solved at record rates by AI. My jaw dropped. Determining the shapes of proteins is critical to developing new drugs, understanding diseases, and developing new technology. AlphaFold 2 is truly revolutionary. I’d place it right up there with sequencing the human genome. It’s a game-changer in medicine and biology.
So I am not opposed to AI. I just squirm at the idea of people submitting content that isn’t theirs.
I’m aware of the argument that AI-derived content might help us learn to write better. Maybe. But transparency is essential.
Here, I have an even better idea. Study really great writing by real people, instead.
This argument reminds me of my resistance to adopting Go-To telescope mounts. These are devices that move your telescope for you to home in on whatever you’re interested in (provided it is above the horizon). I’ve been an amateur astronomer for most of my adult life and feel I learned the sky far better without having my telescope move all on its own, thank you. Yet here I am, fiddling with the damn scope with frozen fingers while fellow observers' Go-To-mounted scopes instantly find what I’m looking for. But I'm sure I know the sky better for doing it the hard way!
Another analogy might be the use of a satellite navigator to find your way around town (guilty). I have a bad sense of direction. I live on a peninsula and I still get lost. I suppose in some cases using certain tech might be a matter of practicality. I’m not stealing anyone’s content using a satellite navigator, but I’m also aware that I’m less likely to learn to navigate using my own brain because I’m so dependent on my satellite navigator. I don’t want to be dependent on a tool to write for me.
I’m not sure how I would recognize AI-derived writing (any tips?)*, though tools to do so are evolving. In one critique group I run, a man I’ll call Mr. B, whose intelligence and integrity I respect, playfully submitted an AI-generated story without letting us know. He simply told us to “look out” for anything unusual. Some of my group caught on. I didn’t.
I thought the two-page story was well written, and I was floored when Mr. B told me it came from ChatGPT. I think he did this to show us what is going on in the writing world. I’m grateful Mr. B played this little prank on our group. He was testing us, and I failed the test.
My group, being made up of honorable people, runs on an honor system. My impression is that my writers feel the same as I do. If someone were to use AI, they would say so.
I’d love to hear what guidelines other writing critique groups have set in place concerning AI-derived content.
*After writing this, I did find some online tools where you can paste in text and have it analyzed for AI-generated content.
ZeroGPT is one. I pasted in a chapter from my book and was vastly relieved to be told that I am 100% human.
Thanks for the link to the AI writing detector! I run a writing group and used it on submissions. One person had 10 percent AI content detected. Everyone else was less than 1 percent, which I guess means none. What does the 10 percent mean? What do you think I should do?
Posted by: Cheryl | 09/29/2023 at 12:13 PM
Hi Cheryl,
So far (2024), these checkers are not perfect. A small percentage (I do not know how small) likely means it's an error and not actual AI writing.
Having run my own writing through this AI checker, knowing it contains no AI content, I find it usually flags a little, 5 percent or so. It might help to run a bunch of tests through the checker to see how accurate it is.
Unfortunately, AI writing will get better and better, and these detection tools will be increasingly strained to keep up.
Posted by: H P Erskine | 06/03/2024 at 07:26 PM