How to Stop the Slop

6/13/2025 • 5 minute read

Author: Atanas Tonchev

AI Slop

The "Dead Internet" theory stipulates that online communities have become so full of bots that real humans have left them long ago, leaving behind themselves an endless loop of AI-agents prompting each other until they reach their daily token limit.

Even if this is not yet quite the case, automatically generated content has been hard to avoid on social media recently: from advertisements with elaborate jingles about mental health (!) to bots vying for our attention on messaging apps. The term 'slop' has been coined for this now-omnipresent filler content.

This means we can rarely be certain that an online interaction is not wasting our time, feeding a database, or setting up a scam. More importantly, the Internet is ceasing to be the place with all the answers. Or rather, it has become the place where, to get to the answers, you first have to plow through a swamp of slop.

...must stop.

While not the sole culprit in the Internet's death, AI has certainly lent a hand in it. What is less commonly known is that it can also help eliminate the slop. Indeed, it is precisely an LLM's capability to understand language that makes it such a fitting tool for judging text quality. In other words, an LLM-powered search engine could take us straight to the thing we are looking for.

Seen through this lens, it is astonishing that, given the enormous amount of content already on the Internet, we decided to use AI to generate even more. Generation is the LLM capability that acts most like us, but it is not the most powerful one. LLMs can select texts that match a given query and, more importantly, they can separate well-written, concise content from slop.
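As a minimal sketch of the retrieval half of that idea, here is how an embedding model could rank candidate pages by how closely they match a query. The model name (all-MiniLM-L6-v2), the query, and the sample pages are all illustrative assumptions, not anything prescribed above:

```python
# Sketch: rank candidate pages by semantic similarity to a query using a
# small open embedding model. Model and sample texts are illustrative.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

query = "how to fix a leaking kitchen tap"
pages = [
    "Step-by-step guide: replacing the washer in a compression tap.",
    "Top 10 AMAZING tap facts you won't BELIEVE (number 7 is wild)!",
    "Plumbing basics: diagnosing and fixing common tap leaks.",
]

# Embed query and pages, then compare with cosine similarity;
# a higher score means a closer semantic match.
query_vec = model.encode(query, convert_to_tensor=True)
page_vecs = model.encode(pages, convert_to_tensor=True)
scores = util.cos_sim(query_vec, page_vecs)[0]

for score, page in sorted(zip(scores.tolist(), pages), reverse=True):
    print(f"{score:.3f}  {page}")
```

The quality half of the argument, separating well-written content from slop, would need an extra signal on top of this, say a small classifier trained on labelled examples, but the ranking machinery would stay the same.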

So why not?

Yet no matter how advanced Google's Gemini gets, Google Search seems only to erode, serving up whichever page manages to repeat what we are looking for the most times (a metric an LLM can game with ease). There are several potential reasons why this may be:

  • All hands on AI: The recent hype around LLMs has left many other areas unattended. While the AI arms race takes the spotlight, other technologies are being neglected. Ironically, this has restricted many potential AI applications.
  • Cost: Indexing the Internet through LLM embeddings rather than Google's current algorithm would likely be somewhat more expensive. Nevertheless, querying already-indexed pages would be orders of magnitude cheaper than running every new request through Gemini's entire model (see the sketch after this list).
  • Marketing: Perhaps the most likely reason: AI is more easily sold as a human replacement than as an abstract linguistic tool. Thus, its ability to organize is ignored in favour of its ability to generate.
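To make the cost point concrete, here is a hedged sketch of embedding-based indexing, assuming a small open embedding model and FAISS for the vector index (both illustrative choices, not a description of any existing search engine). Pages are embedded once, offline; each query then costs one small forward pass plus a nearest-neighbour lookup, with no generative decoding anywhere:

```python
# Sketch: why embedding search is cheap at query time. Pages are embedded
# once, offline; each query costs a single small-model embedding plus a
# nearest-neighbour lookup. FAISS and the model name are assumptions.
import faiss
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")

# --- Offline indexing: the expensive, one-time part ---
pages = [
    "A concise tutorial on configuring nginx as a reverse proxy.",
    "In this article we will explore the topic of proxies in depth...",
    "Reverse proxy setup: nginx config examples with explanations.",
]
page_vecs = model.encode(pages, normalize_embeddings=True)
index = faiss.IndexFlatIP(page_vecs.shape[1])  # inner product == cosine here
index.add(np.asarray(page_vecs, dtype=np.float32))

# --- Online querying: cheap, no generative model involved ---
query_vec = model.encode(["nginx reverse proxy setup"], normalize_embeddings=True)
scores, ids = index.search(np.asarray(query_vec, dtype=np.float32), 2)
for score, i in zip(scores[0], ids[0]):
    print(f"{score:.3f}  {pages[i]}")
```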

The AI hype is bound to recede some day, hopefully leaving space for more sensible, engineered applications of AI. Let's hope the Internet is not completely dead by then!
