If you haven’t yet taken Patrick’s advice to turn off your social media until after the election, then you might have stumbled across some posts from Trump and Harris that are a little different from your typical political ad.
The Trump campaign is apparently embracing the use of “AI slop” in their marketing. Check out this example from Trump’s X account.
It doesn’t take a specialist to see that this image is AI-generated. The Trump team decided that, instead of using pictures of actual immigrants, they would create a fake image of a scenario that frightens their base in order to make their point. For Trump and, I imagine, a slew of his most loyal voters, it doesn’t matter that this image isn’t real because it feels real.
The same could be said for this AI-generated image of Kamala Harris.
This picture never happened, of course. But that doesn’t matter. If you believe the narrative that Kamala Harris is a Communist, then it feels like it happened. Neither of these instances of AI slop needs any sort of proof of being real. They’re not trying to convince anyone based on facts and argumentation. They exist solely to exude vibes.
Vibes aren’t anything new to politics. Think about Obama’s famous Hope posters in 2008. And yet, those posters at least used a real photo of Obama. In contrast, both of these images used by the Trump campaign are entirely fake.
AI slop, if you’re unfamiliar, is basically poor-quality, spammy AI-generated art. Take a look at the Twitter account called Insane Facebook AI Slop and you’ll immediately get the idea. For whatever reason, AI slop has found a home with the far-right. It’s common now to see far-right accounts sharing AI slop, especially in the form of some knock-off Norman Rockwell fictional universe depicting families or other scenes they find idyllic. One example is this recently viral AI-generated image of a family.
It doesn’t matter that the wife has six fingers or that it’d be impossible to get six kids under the age of 4 to stay still like that. It’s got the right vibes for people who believe they hold to traditional values and who see this image as the embodiment of those values. Never mind that there is nothing traditional about a computer generating poorly composed fake images.
The simple point I’m trying to make here is that the Trump campaign sees that their base reacts well to the vibes of fake images generated by AI on the internet, so they are increasingly leaning into this tactic in order to exude the right vibes for their base.
But of course, this isn’t limited to Trump and the Right.
As it turns out, Kamala Harris’s campaign has been experimenting with brain rot ads.
If you’re unfamiliar with brain rot content, there are many different varieties of it. This kind features “a type of split-screen video content (primarily posted to TikTok) that shows a main piece of media in the top half and a secondary piece of media in the bottom half meant to catch the viewer's eye and occupy them, akin to a low attention span and stereotypically associated with ADHD” according to Know Your Meme.
The idea is that normal videos of TV shows or people talking aren’t stimulating enough for someone whose brain has become accustomed to the constant dopamine rush of internet content. So creators include something much more stimulating, like footage of someone playing Minecraft or virtual cars jumping off of insane ramps, to hold your attention while you more passively consume the top video, which is the “actual” content.
Basically, the Harris campaign understands that the attention span of one of its key demographics has been eroded to the point of virtual nonexistence, and that those voters can’t pay attention to Harris’s 10-second soundbite without something else playing to keep them from scrolling.
Of course, it’s not just about trying to hold people’s attention; it’s also about the vibes. This kind of content is native to TikTok, which, overall, skews young and progressive. AI slop, by contrast, does far better on Facebook, which skews older and conservative.
It should alarm us that this is the kind of content both presidential campaigns are experimenting with to stir up their bases to vote. They’re moving people to action with things that are false and degrading and that play to the parts of us that have been vaporized by too much time on the internet: our attention and our rationality. These things are the opposite of the true, the good, and the beautiful. These presidential campaigns assume that what will work with us the best are the things that are the worst for us—things that degrade and insult us.
As voters, we should scorn these things. We shouldn’t allow ourselves to fall prey to the vibes this kind of content makes us feel. We should make the candidates work harder for our vote than churning out some stupid AI slop or insulting brain rot. We can’t allow our minds to be co-opted into thinking something false and distracting is normal and good.
Resist the AI slop and the brain rot. Create content that is good, true, and beautiful. Support creators who do. And cultivate a life of the mind that is allergic to any attempt to manipulate you by playing to your basest impulses.
Ian is an author, writer, and marketer at Endeavor. Ian has written about faith and technology, deconstruction and reconstruction for The Gospel Coalition and Mere Orthodoxy. He regularly writes on his Substack, Back Again, and is the author of Walking Through Deconstruction: How To Be A Companion In A Crisis Of Faith (IVP 2025). Ian lives in Denton, Texas with his wife, Katie, and sons, Ezra and Alastair, and is a member at The Village Church Denton.