Answering the Biggest AI Objections (and Fallacies) Authors Are Falling For
AI vs. Authors: Debunking the Myths, Fallacies, and Fear-Mongering
What if most of the hot takes about AI and writing… are missing the point? In this post, let’s go over 12 of the biggest objections authors have about using AI—and why most of them just don’t hold up.
If you’ve ever felt overwhelmed or shamed by the AI debate, this is for you. I’m here to help you think differently (maybe more clearly), use better tools, and keep the focus on what actually matters: telling your story, your way.
Because writing isn’t just about output—it’s about who you’re becoming.
We’re in a weird time. Writers are caught in a cultural riptide between two loud extremes:
On one side, the techno-utopians tell us AI is the future and we should outsource our creativity to machines to "write more, faster, better."
On the other, the techno-doomsayers warn us that using AI is plagiarism, soul-erasing, and the fast track to the collapse of literature as we know it.
Neither camp gives creatives much room to think critically. You're either a sellout or a purist. A tool-user or a tool of the system.
And with the fallout from NaNoWriMo’s AI statement, thousands of authors are left wondering what the smart or prudent way forward is.
At LegendFiction, we believe in a third way. Authors can thoughtfully integrate new tools, preserve their artistic integrity, and build meaningful worlds.
I’m a big advocate for responsible AI use, and I believe artists are the best people to be using AI. That’s why I help creators find their own freedom, eliminate tedium, and preserve their creative energy for the thing that matters most: their actual writing. Everything else is busy work.
Here’s my breakdown of the most common anti-AI arguments in writing circles, with responses aimed at helping authors avoid emotional manipulation, spot logical fallacies, and make informed, confident decisions about their craft.
And I’m going to use em dashes, because I like them. Deal with it.
1. “AI is unethical. Period.”
“These LLMs only exist because they stole data... They trained their machines on our books without our consent.”
The Fallacy: Black-and-white thinking + Loaded language
This argument frames the entire field of AI as inherently immoral. The truth is, the ethics of AI depend entirely on how it's developed and used. Just as libraries and search engines index content to make it more accessible (and aren’t considered thieves), AI models learn patterns—they don’t replicate content verbatim unless prompted to.
Training on public data is not new. Humans do it too. Every time you read five novels in a genre and imitate their structure or style, you’re “training” yourself.
When are artists going to start crediting and sharing payouts with their sources? …I’ll wait.
I won’t deny that unethical practices went into how these tools got started. But if you know anything about some of the ingredients in the Great Wall of China, you’d rethink your vacation selfies.
Most of us have no control over how something gets made and added to the market—like a highway run through someone’s historical treasure by eminent domain. But as much as we regret it, a point comes where we have to get back to living, working, and doing stuff. The real question is: what are you learning from the past, and what kind of future are you building to make it better?
LegendFiction’s Third Way: We advocate for responsible AI use, like any tool. But we reject the idea that using AI is, by definition, unethical. Instead, we teach authors how to use it responsibly—as an assistant, not a replacement.
2. “Saying AI helps the disabled is insulting.”
“Generative AI is not a tool to help those with physical or mental disabilities… I know this because my friends said so.”
The Fallacy: Appeal to personal anecdote + Dismissal of lived experience
The writer invalidates a category of assistance because it didn’t apply to her circle of friends. That’s not an argument. People with dyslexia, ADHD, chronic fatigue, language barriers, or neurodivergence have already reported using AI to help brainstorm, organize, and express themselves.
No one is saying AI should replace accessibility tools—but for some, it can be one. And writing off a whole group’s tool because it doesn’t fit a specific narrative is exclusionary.
Plus, it’s up to each creator how much they want to augment their process with AI. Some people carve ice with ice picks. Some use chainsaws. Both are valid… just don’t bring a chainsaw to an ice-pick competition. It’s a different league.
LegendFiction’s Third Way: We honor the diverse ways people create. If AI helps someone outline their book, that’s a win. And if someone never wants to touch it? Also valid.
For our community, the point is to skill up in the actual discipline of writing. So don’t use AI to write your stories; otherwise, why are you with us?
3. “AI can’t help people find their voice. It replaces it.”
“You cannot find your voice by having a machine do the work for you.”
The Fallacy: False dilemma
This assumes AI use = outsourcing your voice. But voice isn’t something a machine gives you—it’s something you choose. Every writer who’s ever rewritten a line generated by AI, tweaked a sentence, or tossed out a bland suggestion is developing their voice in real-time.
Your voice gets stronger the more you interact with different styles, tones, and rhythms—AI can act like a sandbox for that exploration.
Plus, let’s face it: a book is essentially a transcript. Writing is an invented form of communication, not dogma or canon. It can evolve, and it has. The tools we use evolve, too.
LegendFiction’s Third Way: Don’t let the machine dictate your stories. Use it to help you interrogate, adapt, and own every sentence—making AI a prompt, not a puppeteer.
4. “AI won’t make anyone a better writer.”
“You learn to be a better writer by doing the work… that will never happen if you’re plagiarizing.”
The Fallacy: Equivocation + Guilt by association
Equating AI use with plagiarism is a category error. Writing tools have always helped with grammar, brainstorming, and structure. What matters is how you use them.
Heck, ghostwriting is a long and noble tradition.
A first draft generated from a prompt isn’t a finished book. AI can help structure thoughts, outline scenes, or suggest transitions—but you still have to do the work to refine, personalize, and finish it.
That being said, some people will never have the time or the ability to learn the skill of writing. Maybe time is tight, or life is impossible. Should their insights never be heard? Should they never have a chance to communicate? No. Finally, everyone has a chance to communicate more coherently, no matter their background.
Yes, I know students are using ChatGPT to write their college papers. That’s a different conversation, one that’s more about how education is structured and how it needs to improve or evolve anyway.
LegendFiction’s Third Way: Authors can use AI like a sketchpad, not a copy machine. If it helps you move forward and you revise it into your own, it’s a tool—like using Photoshop instead of a paintbrush.
5. “Getting feedback doesn’t require AI. You can get it free in critique groups.”
“There are absolutely free ways to have humans help you improve your writing.”
The Fallacy: Red herring + False equivalence
This assumes that AI is replacing human critique—and that writers are lazy or misled for using it. But AI isn’t trying to replace your writing group. It’s a first-pass editor for your 2am draft. A practice partner. A warm-up.
Also: not everyone has time, energy, or social access to consistent critique partners. AI can offer feedback immediately, anytime.
So use one or the other. Preferably both.
LegendFiction’s Third Way: We encourage community—but we also recognize that AI can be a lifeline for isolated or time-strapped writers. Both can co-exist.
6. “Creative writing has value even if it’s unpublished. AI devalues the creative process.”
“It removes creativity from the process. It makes you an assembler, not an artist.”
The Fallacy: False dilemma + No True Scotsman
This assumes that creativity only exists when everything is made from scratch and that using tools disqualifies you from being “an artist.” But creative work has always included repurposing, remixing, and iterative work. Assemblers are artists—just look at collage, remix culture, or screenwriters.
Some authors never need to share their work. They write as a form of active meditation, or creative contemplation. That has value even if no one ever sees it.
Publication is one possible metric of valuing creativity. Personal satisfaction is another.
But AI doesn’t devalue all creativity. If you’re just generating images and calling yourself an artist, then check yourself. If you’re dropping a prompt and running off an entire book on a topic you never researched, and calling yourself an author, yeah, check yourself too.
If you’re using AI to speed up and eliminate tedium in your process? That’s what every creator does. How would Leonardo da Vinci use AI? It’s worth thinking about. I doubt he’d dump it. Creators have always used every stepping stone possible to see higher and farther and deeper. And that includes faster.
Frankly, that’s why we have schools.
Whether you use AI in your creative process says nothing about whether you’re creative. It may simply help some people write better, faster, or with more support.
LegendFiction’s Third Way: Creativity isn’t the absence of tools. It’s what you do with them. We train writers to maintain authorship and intentionality, even when using AI as a creative partner. You still choose the direction, the values, the voice. You’re not assembling—you’re architecting.
7. “The publishing industry is broken, and AI is not the solution.”
“The problem is not a lack of output. The problem is a lack of equitable access to publishing avenues.”
The Fallacy: Red herring
There probably is an issue with inequity in publishing—but that’s a separate problem. Dismissing AI just because it doesn’t fix everything is like dismissing medicine because it doesn’t cure poverty. AI doesn’t solve systemic injustice—but it does give individuals more tools to work around gatekeepers.
LegendFiction’s Third Way: Publishing is probably/definitely broken. That’s why AI matters—because it allows under-resourced, under-networked, under-confident authors to write, revise, and publish faster and smarter, without waiting for legacy gatekeepers. It’s not a fix—it’s a force multiplier.
8. “AI-generated books are flooding the market and devaluing real writing.”
“Amazon has had to pull AI-generated books. Bookstores are facing floods of these products.”
The Fallacy: Guilt by association + Slippery slope
This conflates spammy, low-effort AI garbage with all AI-assisted writing.
Of course there are bad actors using tools to pump out content—but that’s a human problem, not a tech one. We already have this issue with ghostwriters, plagiarists, and Kindle scammers.
LegendFiction’s Third Way: We don’t support junk production. We support disciplined creativity. Spam is spam, no matter the source. We teach creators how to stand out with quality, heart, and purpose—whether they use AI for brainstorming or not.
9. “AI is an existential environmental threat.”
“AI models consume absurd amounts of electricity and resources to function.”
The Fallacy: Tu quoque + Whataboutism
This is worth addressing, but it must be placed in context. Data centers also power cloud storage, video streaming, and online banking. AI use should be measured and regulated—but blaming individual creatives for global energy use is a deflection.
Plus, authors using AI account for a minuscule fraction of its use. We’re a polyp on the hide of a tech race between two world superpowers that will define the next 500 years of human history.
This is not going away, and we need smart people learning how to amplify, augment, and use these tools.
LegendFiction’s Third Way: We care about sustainability. But the environmental burden of AI is not borne by individual authors using GPT for 15 minutes. If we’re serious about change, we need smarter policies and infrastructure reform—not creative paralysis. Use your tools. Stay informed. Don’t carry the world on your WIP.
10. “The only people defending AI are those with a financial interest in it.”
“When you hear people defending generative AI, check if they’re being paid.”
The Fallacy: Ad hominem
This undermines valid arguments by attacking the motivations of those making them. Just because someone benefits from a technology doesn’t mean they’re lying about its value. By that logic, no author can endorse traditional publishing either—because they “profit from the system.”
It can go the other way, too: “Check whether someone is being paid to obstruct something.” Hiring actors to show up at rallies and hijack the message is a matter of public record.
But let’s not stoop to that. Get away from personal attacks, and let’s deal with the topics and ideas.
LegendFiction’s Third Way: Our community has no financial stake in OpenAI or Amazon algorithms. What we care about is creative empowerment. We’re defending agency—the right to think clearly, choose your tools, and build your voice without being shamed into silence.
11. “Writers who use AI don’t respect the craft.”
“If you respect the work, the art, the value of it—then you should not support these technologies.”
The Fallacy: Appeal to purity + No True Scotsman
This is a purity test masquerading as a moral high ground. But respecting the craft doesn’t require staying stuck in past methods. Writers respected the craft when they adopted typewriters, then word processors, then self-publishing platforms. Respect shows up in how you use a tool—not whether you use it at all.
LegendFiction’s Third Way: We believe craft is sacred. And we think it deserves access to the best tools available. It’s not a betrayal of tradition—it’s an evolution of it. We teach authors to elevate their work using whatever helps them grow.
12. “AI won’t go away unless we refuse to use it.”
“I won’t support NaNoWriMo if they won’t take a firm stand. Neutrality is betrayal.”
The Fallacy: False cause + Moral absolutism
Refusing to use AI doesn’t dismantle the tech industry. It just isolates the writer. And demanding that communities excommunicate tool-users to prove their virtue is moral absolutism, not leadership.
LegendFiction’s Third Way: We support dialogue, not dogma. We won’t punish people for being curious or cautious. We invite every writer—traditional, tech-savvy, and everything in between—to co-create a future worth writing about.
Your Craft Is Bigger Than Your Fear
Let’s be honest: creative work has never been safe. It’s always been risky, messy, misinterpreted, and open to theft, failure, or flat-out silence. And yet—we keep writing. Not because it guarantees control, but because it guarantees meaning.
The rise of AI doesn’t change that.
Yes, there are ethical questions worth wrestling with. Yes, we need boundaries, standards, and deeper conversations.
But the answer is to stay human. Stay curious. Stay committed to the work.
LegendFiction exists to help authors think critically, use tools wisely, and tell legendary stories—whether you use a pen, a prompt, or a model trained on terabytes. We’re not here to replace writing. We’re here to protect it from becoming irrelevant.
In the end, your creativity isn’t threatened by technology. It’s only threatened by fear—especially the fear of changing how you think, grow, and tell the truth.
So ask better questions. Use better tools. And write like it still matters—because it does.
Thank you for the note about disabled people. I have a neurological movement disorder, which, as you can imagine, makes writing hard. I can't read my own handwriting, and I have twitches in my tongue that prevent speech-to-text all the time. And typing hurts when done too much.
I don't like to use AI when I'm drafting, but all the notes and work before writing? All the editing notes I want to make but find hard to put on paper? AI is easy to work with as a brainstormer, summarizer, or even an editing tool if it's trained right.
Thanks for the nuance. A lot of anti-AI discourse is in the form of memetic text or recycled talking points (slop, stolen, soulless).
Which is obviously and bitterly ironic.