Think about the last time you did your taxes.
If you used an accountant, they opened their tax software, ran your numbers through a calculator, pulled data straight off your W-2 into a templated form, and handed you back a finished return. You paid them for it. You thanked them. Nobody stood up in the lobby and shouted that it didn't count because they used software.
And yet, that is the exact line we've decided to draw around AI and content.
Today's advice
Bad content is bad content. The tool doesn't make it bad. The person using the tool does. And the bigger problem isn't AI at all. It's that we've spent twenty years training ourselves to accept slop from every direction, and now we're surprised when AI joins the pile.
The term "AI slop" is being thrown around hard right now, and most of the time it's being used to drive a wedge. On one side, the "natural" creators who supposedly do everything by hand. On the other, anyone who touched AI at any point in their process. It's a neat little us-versus-them, and it doesn't hold up the second you look at it.
The tool nobody complains about
Every profession has tools that used to be controversial and are now invisible.
Accountants use tax software. Architects use CAD. Designers use Photoshop. Musicians use production software that corrects pitch and timing. Writers use spellcheck and grammar checkers and autocomplete. Every one of these was loudly controversial when it first arrived. Every one of these got absorbed into the craft, and the argument moved on.
Right now we're in the loud part of that cycle with AI. It'll pass.
Where the real slop comes from
To be fair to the people using the term, they're pointing at something real. It's just not what they think it is.
AI didn't invent slop. We've been swimming in it for years.
TikTok killed the curated, professional look that Instagram spent a decade establishing. The lowest production value won, because the algorithm rewarded volume and speed over craft. Education systems have been redesigned around tests that a machine can grade, not around creativity, knowledge, or actual ability. Academic journals publish papers with no real peer review and no academic integrity behind them. News organisations replaced journalism with tweets and blog posts written by people who are not journalists, run by editors who barely edit, on websites that exist to serve the next click.
That is slop. And none of it required AI.
The real problem isn't that AI can produce slop. It's that we have spent two decades systematically rewarding the lowest common denominator across every field where standards used to mean something. AI is just the latest tool to enter that environment. Of course it's getting used to make more slop. The market for slop has never been bigger.
The standards problem
Now I'm going to head off the obvious response. I am not a media company. I am not running a newspaper. I am one person writing a newsletter. So I am not arguing that everyone needs an editorial board.
What I am arguing is that the standards have to live somewhere. If the platforms aren't enforcing them, and the institutions aren't enforcing them, and the audience isn't demanding them, then the only place left is you.
If you publish anything to your business audience, you are the editorial board. You are the standards. The decision about whether the work is good enough to put your name on lives with you. AI doesn't change that. It just makes the decision more important, because the volume of stuff being produced has gone up by orders of magnitude.
The test that matters
The question that matters when you look at any piece of content is: does it help the person reading it?
Not "was this written by a human?" Not "was this touched by AI?" Does it help.
If yes, the tool doesn't matter. If no, the tool doesn't matter either. It was bad before you asked the question.
Judging content by how it was made is like judging a meal by whether the chef used a stove or a grill. Who cares how it was made. The question is whether the meal is any good.
Why this matters
You're getting distracted by the wrong debate.
If you're a small business owner trying to figure out how AI fits into your work, the last thing you need is to be made to feel guilty about using a tool that can help you. That guilt is being manufactured, and it's being manufactured mostly by people who either haven't caught up yet, or who see their own positioning threatened.
But the bigger trap is the other side. If you decide the standards don't matter because the algorithm doesn't reward them, you become part of the slop pile. And once your business is in that pile, you compete on price and volume, not on quality. That race ends one way for a small business, and it isn't a good one.
The opportunity sitting in front of you right now is to be one of the few people in your space who still cares about the standard. AI gives you the leverage to produce more without dropping that standard. That combination, more output and the same care, is rarer than it sounds, and it's exactly what your customers can feel even when they can't articulate it.
In the first post of this series I talked about the five types of people when it comes to AI. This piece connects to it. The reason some people are so invested in the "AI slop" label is that admitting the tool is useful means admitting they need to learn something new. It's easier to build a wall. Don't build the wall. Don't lower your standard either.
Here's how to start
If you're a business owner, stop treating AI use as a moral question and start treating it as a craft question. The people using AI badly are producing bad work. The people using it well are producing better work than they used to. The tool is the same. The standard is the difference.
If you're making content yourself, remember that you're still the one with your name on it. AI didn't write your post. You wrote your post using AI. That distinction keeps the responsibility where it belongs. On you.
And if you're judging other people's content, apply the only test that has ever mattered. Did it help you? Did it teach you something? Did it make you think? If yes, who cares what tool they used. If no, who cares what tool they used.
Bad content is bad content. Good content is good content. The rest is noise.
If you've been holding back on AI because you didn't want to end up in the slop pile, hit reply and tell me what you've been hesitant to try. I'll tell you straight whether the concern is real or whether it's the noise getting in your way.
Best,
Jono


