By: Samuel Guerra
I opened Facebook for the first time in a while and saw this image: a golden retriever wearing aviator sunglasses, floating on a neon surfboard over a sunset ocean. “This made my day!” the user captioned it. The image was generated by AI.
It said nothing. It meant nothing. And yet, judging by the reactions and shares, it probably traveled farther than most human thoughts ever do.
We are drowning in “slop.” In case the term is new to you, the Cambridge Dictionary defines it as “content on the internet of very low quality, especially when it is created by artificial intelligence”: AI-generated content stripped of intention, history or consequence. It’s filler. And it’s everywhere.
From AI-written birthday wishes to synthetic landscapes captioned “peaceful vibes,” this content floods our feeds. And I have to admit: some memes have been entertaining. But what I find most pointless is when AI images are used to represent things that exist in real life: generated photos and videos of baby animals, of which real footage exists in abundance, or an AI rendering of a well-documented historical event instead of an actual historical photo.
There is something called “dead internet theory,” which warned years ago that most online activity would come from bots, algorithms and AI-generated noise rather than from real humans. This theory used to sound like paranoia. Today, it feels like an observation.
On Sept. 30, OpenAI, the company behind ChatGPT, launched Sora, its own “social media” app, often described as the “TikTok of AI.” Users scan their faces as input, and the app generates AI videos on request, placing the user’s face in any scenario.
Many users describe it as “peak AI slop” that merges mindless scrolling with algorithmically generated clips. After its release, it was the most downloaded free app in the App Store for 21 consecutive days. Days after the launch, OpenAI CEO Sam Altman announced plans to allow “even more erotica for verified adults,” further pushing the boundaries of what AI-generated video will be used for.
We were originally promised that AI would be an extraordinary new technology, one that would revolutionize many aspects of daily life, from big scientific questions to annoying daily tasks. But now the largest AI company in the world is using it to create and monetize explicit content. Beyond erotica, the new app is designed for nothing but dopamine surges. No casually messaging your friends or sharing about your day. It’s sensory bait, engineered to keep users hooked while their attention spans fry. That’s why I don’t think the term “social media” even applies to it.
And who’s consuming “slop” most? Often, it’s older generations. It turns out that the people who grew up reading newspapers and watching broadcast TV are now unwittingly shifting to this content because it’s bright, simple and emotionally frictionless. They may not know it’s AI. They just know it “feels nice.”
Meanwhile, a telling contrast emerges when you look at how the industry’s most influential figures approach this phenomenon. Tech insiders are sending their children to no-tech schools, banning smartphones at home and delaying screen exposure for as long as possible.
Bill Gates and Steve Jobs famously restricted their kids’ tech use. Why? Because they understood years ago what the rest of us are only beginning to see: attention is the scarcest resource of the 21st century, and it’s being strip-mined.
This is the new class divide. The wealthy protect their children’s minds. The rest of us get Sora.
The irony is brutal. Those who built the attention economy are opting out of it; the rest of us, including our elders, are left to scroll through an increasingly addictive digital landscape.
We don’t need to ban this technology. But we must refuse to normalize its output or to treat it as human expression.
A golden retriever in aviators isn’t joy; it’s a counterfeit. And if we keep treating fakes as real, we won’t just lose the internet. We’ll forget how to recognize truth when we see it.
