Over the last few years, I’ve heard a wide variety of pro-AI arguments. They generally fall into a few common themes:
- “My work isn’t selling and I’m willing to try anything.”
- “I’m broke and AI tools are free.”
- “AI looks and sounds just as good as the real thing, so who’s going to know?”
- “Everyone’s using it. If you aren’t, you’re getting left behind!”
- “What can it hurt to try?”
AI tools have been marketed as the Easy Button to overcome every challenge and shortcoming. It’s a hack to get ahead that doesn’t have a downside. It might even be a real-life version of God Mode.
Unfortunately, you’re being lied to. Here’s why I’ve chosen to go AI-free, and why I think you’ll want to join me.
The Environmental Impact
At this point we should all be aware of how bad AI data centers are for the environment. They suck up scarce water resources for cooling and place massive demands on the local energy grid, often in places where utilities already struggle to meet the needs of local residents. This drives up costs for people already struggling to make ends meet. It also increases pollution: according to the U.S. Energy Information Administration, fossil fuels still accounted for 58% of U.S. electricity generation in 2025.
Even when these data centers aren’t drawing their power from the local grid, it’s often not because they’re utilizing on-site renewable sources like wind or solar. Instead, as in the case of this xAI facility in Southaven, Mississippi, they’re operating dozens of portable gas turbines.
Obviously this is not the kind of industry the world needs more of.
Theft of Creators’ Work
After we humans got done gushing over a machine that could seemingly create something from nothing, we looked under the hood and realized the reality was a lot less grand. Generative AI learns by sucking up all the art, writing, video, audio, and general information it can, then regurgitating it in the medium of the user’s choice.
This is a simplified explanation, but it’s not an inaccurate one. While gen AI’s proponents argue that their models are just “learning to create” the same way a human would, that’s not accurate. A data center is not a brain, and gen AI isn’t learning what art, writing, music, or videography is so it can develop its own personal interpretation of the medium… it’s learning how to mimic existing work as faithfully as possible.
You can tell it to create an image of your favorite actor in the style of an Impressionist painting, and it will churn through everything it has digested to give you what its training tells it is the most faithful simulacrum of that. But there’s also a reason you can prompt gen AI to create something specifically “in the style of” a real creator: it’s plagiarizing. For example, Grammarly is facing a class-action lawsuit over its “Expert Review” feature, which presented editing suggestions as if they came from established authors and academics, without their consent.
The other huge problem with gen AI is that it not only trains on the work of millions of human creators without their consent and without compensation, but is often used to actively rob them of work. While some use AI out of boredom or curiosity, many use it to avoid paying a professional or having to learn how to create a professional-quality product themselves.
Generative AI is, at its core, a sophisticated plagiarism machine. When you use it, you’re stealing the work of others, and those others are often working-class artists, authors, musicians, and content creators. People in these fields who make enough money to live comfortably are few and far between. Stealing from them is indisputably wrong.
I’ve heard it argued that it’s somehow ableist or classist to say it’s unethical to use AI. I find that an interesting stance to take. I don’t think that marginalized communities deserve or are entitled to what amounts to a vastly shittier product that simultaneously steals from other working class people. It’s like saying that the best way– nay, the only way– to fix the housing crisis is to bulldoze low-income neighborhoods so they can be replaced with high density housing. Yes, it’s sort of a solution, but there are far better ways to accomplish that goal while doing far less harm.
Quality… Or Lack Thereof
If you think the work produced by AI is so good no one will ever know you’re using it… you should probably think again.
I’ve noticed that AI-generated images tend to have weird textures, strange composition, dead eyes, and extra or missing limbs. Supposedly high-resolution images look weirdly pixelated up close, hinting at their origins. (Digital images that have been re-compressed too many times also start to look grainy and pixelated.)
I find many AI images to be simultaneously weirdly gritty and strangely smooth. These are stylistic choices, yes, but not ones an artist would generally combine in the same image. It’s like someone roughed out a digital image and then laid a beauty filter over part of it.
Writing created by AI is similarly off. Word choices wobble between boringly bland and wildly overenthusiastic. Some AI agents get stuck on certain words, sprinkling them liberally through everything they generate. Any decent writer or editor would have caught this; gen AI, for all its supposed superiority, does not. For a more detailed look at the significant shortcomings of AI-generated fiction, check out Sucked In By AI, an excellent article by author Audrey Driscoll.
Never mind all that, though: nonfiction is where this digital research assistant really falls short. AI agents are well known for inventing facts out of thin air, even including fictitious citations. Models like ChatGPT, Gemini, and Claude work by predicting the next likely word based on patterns in their training data. The result often sounds convincing, but it isn’t necessarily true.
They’re also not exactly great at telling human-generated misinformation, even of the most obvious kind, from fact. An excellent, and appalling, example is a case where a researcher invented a fake eye condition called bixonimania, uploaded two obviously fraudulent papers about it to an academic server, and watched major AI systems present it as real medicine within weeks (Nature: “Scientists invented a fake disease. AI told people it was real.”).
Temporary Gain, Permanent Cost
A recent study showed that using an AI writing assistant decreased creativity and retention (“MIT Study Finds Artificial Intelligence Use Reprograms the Brain, Leading to Cognitive Decline”). Essentially, it’s a shortcut that sabotages your ability to write unassisted. The same study showed that the damage to participants’ ability to come up with ideas and write them out remained long after they stopped using AI assistance. Perhaps worst of all, the use of AI made participants’ ideas increasingly similar to one another.
Enjoyment and Authenticity
I write because I enjoy writing. It’s not a chore, and it isn’t simply a means to an end. It’s a hobby that brings me great joy, and a skill that I’ve been honing for about two decades. I wouldn’t give up that process for anything. While I’ve struggled with burnout in the past and I sometimes get stuck, having someone or something else do the imagining and writing for me isn’t even remotely appealing.
Writing gets easier with time and practice. You’re never going to be perfect on day one or even year one, and if you try to use AI as a shortcut, you’re not going to improve at all.
The other problem with outsourcing your work to gen AI is that it’s not going to sound like you; it’s going to sound like gen AI. Now, I like my voice. My readers do too, and I get invited to participate in collections and collaborations because the organizers like the perspective I bring. If I outsourced the work to AI, that would be gone. I would not get those amazing opportunities, because AI-generated work is interchangeably generic and absolutely everywhere.
When you market your writing, you’re selling something unique. They say every story has already been told– but not by you. Your perspective and creativity are unreproducible. Don’t sacrifice that for a cheap shortcut.
In Closing
The argument can be made that one of the reasons for the popularity of gen AI, beyond simple laziness, is desperation. Capitalism is a treadmill, and whether you’re an artist, a writer, or someone whose job requires you to punch a timeclock, you’re never quite able to keep up. This is especially true for content creators: the sites we use to get our work in front of new eyes heavily incentivize churning out new creations at an insane rate.
Most of us can’t keep up with the demands of the algorithm, and we fall off the treadmill. We get burnout. We sometimes lose our ability to create, and to keep a roof over our head. Enter the tech billionaires with a solution to the problem they created: What if there was a magic button that could generate content for you??
I think it says something rather dark about humanity that many of us would rather find more efficient ways to run on the treadmill, even if it means stealing the work of others and destroying our future, than break the cycle. In a future post, I’ll lay out some suggestions for how to break the treadmill.
Until then, I hope you’ll join me in going AI-free.
