AI Slop Is Coming for Our City, Too
By Daisy Thomas, Candidate for Mayor, New Port Richey (daisyfornpr.com)
I work in AI policy. I also live in a small downtown on the Gulf Coast. That combination means I think about what technology actually does to democratic decision-making, not in theory, but in the kinds of community conversations where real tradeoffs get made about real places.
So when AI-generated images started showing up in local political conversations, I noticed. And I think more people should.
Florida actually has a law on this. Since 2024, political advertisements that use AI-generated content have been required to carry a disclosure. If an image, video, or audio clip was substantially created by artificial intelligence, candidates and campaigns are supposed to say so. It was one of the first laws of its kind in the country.
But the law covers political advertisements. It does not cover every image shared in a Facebook group, dropped into a community meeting presentation, or posted to a local page by someone who just thinks it looks nice. The disclosure requirement kicks in when there is a formal campaign ad. It has nothing to say about the broader culture of AI-generated imagery seeping into public conversations about what our cities could or should look like.
That gap is what this article is about.
If you spend enough time online, you start to notice something strange. Cities that don’t exist. Downtowns that look perfect but unfamiliar. Streets where every palm tree is identical and every café is full. The buildings are just a little too clean. The sidewalks just a little too lively. The lighting always looks like golden hour.
Welcome to the age of AI slop.
“AI slop” is the term people have started using for the flood of cheap, mass-produced images generated by artificial intelligence. These images are designed to look convincing at first glance, but they are not photographs and they are not real places. They are visual shortcuts.
You see it everywhere now. Fake historical photos. Fake wildlife shots. Fake “news” images. Perfect-looking towns that exist only inside a prompt box.
And now it’s starting to show up in local politics.
The reason is simple. AI images are fast, cheap, and emotionally persuasive. You can type a sentence into a generator and get a polished image of a place that looks hopeful, prosperous, and full of life. No photographer needed. No construction plans required. No actual project underway.
Just vibes.
But here’s the problem.
Cities are not vibes.
Cities are infrastructure plans, zoning decisions, drainage systems, business development, environmental protection, and budgets that have to balance. Cities are messy, complicated, and very real. The work of governing them happens in meeting rooms, engineering reports, and long public discussions about tradeoffs.
A glossy image can make it look simple. It never is.
That is why AI slop matters.
When AI images get used in public conversations about real communities, they blur the line between vision and fiction. Residents start reacting to pictures of places that do not exist instead of debating the actual decisions that shape their city. And democracy depends on that distinction staying clear.
Think about what responsible use actually looks like. Architects use visualizations. Urban planners use concept sketches. Designers create renderings to help people imagine possibilities. But those images are labeled as concepts. They are tied to real proposals, real budgets, and real locations. There is a plan behind them. There is accountability attached to them.
AI slop skips all of that. It jumps straight to the emotional response.
"Look at this beautiful future. Don't worry about the details."
Here is a simple test. When you see an image used to make a case about your community, ask: Is there an engineering study? A funding source? An environmental review? A timeline with actual milestones?
If the answer is no, you are probably looking at a prompt, not a plan.
Technology is not the enemy here. AI can be a powerful tool when it is used honestly and transparently. But as these tools become cheaper and faster, communities will have to get sharper about recognizing the difference between a vision and a shortcut.
One comes with hard work attached. The other comes from a keyboard.
The future of your city is too important to be decided by whoever generates the most convincing image. It belongs to the people who show up, ask hard questions, and insist on real answers.
That is still how this is supposed to work.
Originally posted on Medium, March 11, 2026.