Part 7: Don’t Make Me Come Over There: F*ck AI
Content Warning: This post contains a much higher than average number of curse words. I will also try to make this a little shorter than usual. I’m sure you’ll get the point by the end of it.
* drags out the soap box, steps up *
* clears the throat *
* cracks the knuckles *
AI is not actually “artificial intelligence” in that there is literally nothing about it that includes intelligence at all. In the realm of writing, the AI applications are actually Large Language Models (LLMs) that only put together words in a certain order because those words statistically seem to go together. LLMs have no fucking clue what they’re writing. They are throwing shit against the wall in a way that the algorithm predicts will stick. These approximations of language into “better” formats look convincing, but there are oh so many problems with even using them in passing.
And don’t start with that “I’m just using it to check my work” bullshit because if you’re trusting an LLM to “check your work”, you’re telling on yourself and the big, fat hole in your education.
The Writing Event That Shall Not Be Named really put their foot in it this past year by suggesting that using AI to write was an “accessibility device” because “some people don’t write well.” (This is outside of their Other Problems.)
My good bitches, the whole point of writing is to learn to write well. If you have some kind of natural disability that prevents you from writing well, get a writing buddy or do literally anything else as a hobby.
Me? Alarmist?
Well, no shit, I was raised on science fiction that frequently explored the question of whether or not an artificially created intelligence could gain self-awareness (sentience) and possibly sapience (able to logic in a similar fashion to humans), and then realize how absolutely shitty humans are and try to destroy us.
But – and here’s where the cautionary tale probably goes a little awry for me – I never identified the problem as “AIs are dangerous and we have to prevent their development because destruction bad.” My take-away has always been that humans, especially in vast civilizations, are mostly trash, and we never learn our lessons about being ethical and compassionate, so of course if we’re going to build machines that emulate us, they’re going to suck at being decent entities. I would even posit that this idea of trash-humans creating trash-AIs is holding up pretty well.
Take the employment industry. It is notoriously difficult for completely qualified individuals to get (legitimate) jobs because companies that employ AIs in their hiring process auto-reject applications on a wide variety of sometimes arbitrary metrics. And then there’s the problem of Cat I Farted (say it in French, you know what I mean) making up fake legal cases and the perpetrators facing negligible legal repercussions for it. The list of egregious and potentially dangerous mistakes that often manifest as discriminatory and bigoted biases goes on and on. (Hey, trash humans wrote it, trash human behavior comes out, right?)
And, yes, I acknowledge the little triumphs such as the pastry identifier that turned out to be really good at detecting cancer cells, but that’s one of those cases where AI was accidentally used for the one thing that it should be doing, the one thing that it should be better at than humans, and that is detecting anomalies in patterns.
But, instead of using AI to diagnose complex diseases, AI is being used to steal the thing that makes humanity bearable. Writers and editors are being “downsized” (as in, completely replaced) with AI apps at such an alarming rate that people who have been the backbone of documentation and content for decades are out on the street – all for the sake of “saving money” (for companies that are almost always obscenely in the black).
Remember how I mentioned that KDP doesn’t care about your success, just your volume? The fucking idiots who replace entire editorial teams with AI aren’t doing so because they think that AI writes better copy; they’re doing it because they don’t have to pay AI a salary, give it paid time off, or wait for it to get back from paternity leave. They can crank out ridiculous amounts of content in a fraction of the time, and they don’t give a fuck that it’s just objectively awful.
“Oh, but then that makes a whole industry for people to fact-check AI and learn how to build prompts—”
Bitch, that’s just adding seven extra steps to what we already did, except now instead of writing our own creative copy, we’re checking the homework of a sociopathic liar with serious hallucination problems, and that takes our energy away from being actually creative.
Using AI Makes You Dumber
I know, big threats, but hear me out:
Especially in our field as writers, our capacities are expanded by using our skills, by honing them, by learning new things, by forcing ourselves to look at things differently. We become stronger and better and express our ideas more effectively when we let ourselves struggle and make mistakes.
I am not the only person who feels this way. Students in the USian education system, facing little to no negative repercussions for using AI, have essentially replaced actual learning with generative technologies. When given the opportunity to offload cognitive tasks, humans notoriously rely on external tools to a fault. Over-reliance on any technology can be a major problem, but especially with the tendency to not fact-check AI-generated results – it’s supposed to be saving time, right? – the opportunities to use critical thinking and analytical problem solving decline rapidly.
I’ve noticed this even with people I know personally. One person who shall remain nameless (you know who you are) started using AI to generate ideas because they felt they’d hit a creative wall. That gave them a few ideas, but Cat I Farted wasn’t super-clear on the parameters, so they asked for a sample. It came back with a fantastic block of prose – that was also 90% plagiarized. They didn’t catch it at first, but then an article about Cat I Farted lifting scenes from books inspired them to run it through a plagiarism checker, and sure enough, stolen goods. The problem wasn’t just that the snippet was nicked from someone else; it was that the entire process of trying to use AI to solve a creativity problem got in the way of them finding their groove again.
What could they have done differently? As I mentioned in the last part of this series, when your word bag is empty, read a book or twelve to fill it back up. Walk away from the keyboard. Make some sourdough starter. Paint something. Play a video game. Get away from the pressure and let your brain incubate ideas.
My Two Cents (Adjusted for Inflation)
For the sake of your long-term sanity, even in the face of growing fields of AI-positive propaganda, please do not rely on AI for anything. Yes, I know that tools like Grammarly use AI to point out your errors and suggest improvements to your writing, but remember what I said about holes in your education? The USian school system isn’t exactly a glowing bastion of forward thinking and critical cognitive skills, so I totally understand if you don’t know your adjective from your adverbs, but you are a human being with a human brain, and you can learn.
Going back to my point last week, especially with creative writing, you need to know the rules so that you know how to break them without losing the flow of your story. You can fill in these gaps so easily, and once you know them, you know them forever. Use them all the time or only when you like – either way, they’re part of your brain now.
The glorious Noam Chomsky started his illustrious career as a linguist by arguing that language’s first role in human development is cognitive structuring, and only then communication. In practical adult life, that means that the better you become at expressing yourself on the page, the better you are at organizing your thoughts, at understanding the world around you, and at grasping your own personal experience.
Don’t let those fucking tech bros take that away from you.
EDITED TO ADD: Also in the spirit of my beloved Noam, insisting on organic, non-AI artwork is a way of undermining the capitalist system and returning the means of production to the people.
EDITED AGAIN TO ALSO ADD: After I finished penning all this, Freya Holmer released an excellent hour-plus-long video called “Generative AI is a Parasitic Cancer”, and I recommend you give it a gander.
Thank you all for going on this long journey with me! I know this last one doesn’t seem to apply directly to self-publishing, but it does relate directly to your capacities as a writer. If anyone is looking to partner up in the Fort Worth area for a slightly subversive and generally hilarious writing group, hit me up.
Get caught up: Part 1 – Part 2 – Part 3 – Part 4 – Part 5 – Part 6 – Part 7