I’m not alone in seeing how the internet has been degenerating over the years. The first poison was capitalism: once money became the focus of content, and content was rewarded for volume rather than quality, the flood of noise began to rise. Then came the “algorithm”, initially a good idea for managing the flow of information, quickly corrupted by people gaming the rules. SEO became a career in which people engineered that flow to their own benefit. And Google smiled on it all, because they could profit as well.
The latest evil is AI, which is nothing but a tool to generate profitable noise with which to flood the internet, an internet that is already choking on garbage. Now AI is beginning to eat itself.
Generative AI models are trained on massive amounts of text scraped from the internet, which means the consumer adoption of generative AI has brought a degree of radioactivity to its own dataset. As more internet content is created, either partially or entirely through generative AI, the models will become increasingly inbred, training on content written by their own predecessors, models that are, on some level, permanently locked in 2023, before the advent of a tool specifically intended to replace content created by human beings.
This is the phenomenon that Jathan Sadowski calls “Habsburg AI”: “a system that is so heavily trained on the outputs of other generative AIs that it becomes an inbred mutant, likely with exaggerated, grotesque features.” In reality, a Habsburg AI will be one that grows increasingly generic and empty, normalized into a slop of anodyne business-speak as its models are trained on increasingly identical content.
After all, the whole point of AI is to create slop that will be consumed because it looks sorta like the slop that people already consumed. So make more of it! We’re in competition with the machines that are making slop, so we can outcompete them by just making more of it. It’s what biology would look like if there were no natural selection, and if energetic costs were nearly zero — we’d be swimming in a soup of goo. As Amazon has discovered.
Amazon’s Kindle eBook platform has been flooded with AI-generated content that briefly dominated bestseller lists, forcing Amazon to limit authors to publishing three books a day. This hasn’t stopped spammers from publishing awkward rewrites and summaries of other people’s books, and because Amazon’s policies don’t outright ban AI-generated content, ChatGPT has become an inoperable cancer on the body of the publishing industry.
That’s a joke. Limiting authors to three books a day? How about limiting it to one book a month, which is more in line with the human capacity to write? You know that anyone churning out multiple books per day is not investing any thought in them, or doing any real research, or even aspiring to quality. Amazon doesn’t care; they exist only to skim a few pennies of profit off each submission, so sure, they’ll take every bit of hackwork you can throw at them. Take a look at the Kindle search page sometime: it’s nothing but every publisher’s slush pile amplified ten thousandfold.
The Wall Street Journal reported last year that magazines are now inundated with AI-generated pitches for articles, and the renowned sci-fi publisher Clarkesworld was forced to close submissions after receiving an overwhelming number of AI-generated stories. Help A Reporter Out used to be a way for journalists to find potential sources and quotes, but requests there are now met with a deluge of AI-generated spam.
These stories are, of course, all manifestations of a singular problem: that generative artificial intelligence is poison for an internet dependent on algorithms.
The only algorithm I want anymore is “Did PZ Myers subscribe to this creator? Then show the latest from them.” I don’t want “X is vaguely similar to Z that PZ Myers subscribed to” and I sure as hell don’t want “Y paid money to be fed to everyone who liked Z”, but that is what we do get.
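Just to spell out how little is being asked for, here is a minimal sketch (in Python, with a made-up Post record and subscription list; no real platform’s API is implied) of that subscriptions-only feed: keep what the reader actually subscribed to, sort newest first, and inject nothing else.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical post record for illustration only.
@dataclass
class Post:
    creator: str
    title: str
    published: datetime
    promoted: bool = False  # paid placement

def subscription_feed(posts, subscriptions):
    """The only 'algorithm' asked for above: did I subscribe to this
    creator? Then show me their latest, newest first. No 'vaguely
    similar to', no paid placements."""
    return sorted(
        (p for p in posts if p.creator in subscriptions and not p.promoted),
        key=lambda p: p.published,
        reverse=True,
    )

posts = [
    Post("creator_a", "New post", datetime(2024, 3, 12)),
    Post("creator_b", "Vaguely similar thing", datetime(2024, 3, 13)),
    Post("advertiser", "Sponsored slop", datetime(2024, 3, 14), promoted=True),
]
for p in subscription_feed(posts, subscriptions={"creator_a"}):
    print(p.creator, "-", p.title)
```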
One hope is that all the AI-based companies will eventually start cannibalizing each other. That may have already begun: two AI image companies, Midjourney and Stability AI, are fighting because Stability skulked into the Midjourney database to snatch up as much of their data as they could.
Here’s a prompt for you: two puking dogs eating each other’s sick and vomiting it back up again, over and over.