At some point in the past year or two, you asked AI for advice.
And it was probably terrible.
Here’s why, and also how to fix it in under 5 minutes.
I’m going to show you what the problem is.
And how to bypass it.
Using a simple process to turn your AI ideas from generic to useful.
TL;DR:
Most online content is poor quality. AI tools are trained on that same material, so their default answers mirror it. If you lack the expertise to spot bad advice, you’ll get plausible-sounding guidance that misleads.
Solution: Constrain the AI with specific rules, then pressure-test its outputs using persona challenges before acting.
I’m using SEO as a practical example, testing both ChatGPT and Claude, but this works for any topic.
The problem: Telephone Game of the Uninformed
An often overlooked fact is that the likes of ChatGPT, Claude, Copilot, Perplexity and others are LLMs – Large Language Models.
Meaning?
They predict the most likely next words, based on patterns in the text they were trained on.
It’s not quite that simple.
But in a way it is.
You may have already come across this idea, but perhaps failed to appreciate its significance.
This matters.
Because essentially, all the advice is based on prediction, not knowledge.
And here’s the main point:
AI “answers” would be more accurate if they were based on the opinions of people who knew what they were talking about, but most people sharing SEO views appear to be talking nonsense.
So I asked both ChatGPT and Claude:
“I know this is a very abstract question, but considering all of the SEO original content that’s out there on the web, what percentage do you think might be written by people who genuinely know what they’re talking about, with track records that prove this? In terms of the number of pieces of content, not the number of words. I say original as so many people merely repeat other opinions. Maybe also express this as a ratio of knowledge/experienced basis content about SEO to inexperienced/baseless theories and views.”
ChatGPT estimated 10–15%, with a ratio of roughly 1:4. Claude landed in the same range.
Both agree that approximately 15% is knowledgeable, and that the ratio is approximately 1:4.
So the majority of content written about SEO is inexperienced, baseless and essentially incorrect.
Considering some of the advice that ChatGPT and Claude will use in the example below, it’s safe to assume that they’re not being overly selective when it comes to the quality and reliability of the content they’re using.
In which case, the most likely sources of “words” don’t know what they’re talking about.
And people are using this for advice?
So this article does two things:

– Explains why generic AI SEO answers lean towards bland and risky.

– Shows how to constrain and challenge the AI until its advice is worth acting on.

The problem with AI SEO advice (and why it matters)
LLMs predict plausible next words from public texts.
When the underlying SEO content contains a high ratio of recycled, untested opinions, their default answers favour “lowest‑common‑denominator” guidance: generic, safe‑sounding, rarely actionable.
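Real models are vastly more sophisticated, but the core dynamic can be shown with a toy frequency model. The tiny "corpus" below is invented for illustration: three of the four articles repeat the same recycled tip, so that tip becomes the most probable continuation. This is the telephone game in miniature.

```python
from collections import Counter

# Toy illustration of why a next-word predictor echoes the majority.
# Four invented "articles" give SEO advice; three repeat the same tip.
corpus = [
    "for seo you should buy links",
    "for seo you should buy links",
    "for seo you should buy links",
    "for seo you should measure results",
]

# Count which word most often follows "should" across the corpus.
continuations = Counter()
for text in corpus:
    words = text.split()
    for prev, nxt in zip(words, words[1:]):
        if prev == "should":
            continuations[nxt] += 1

print(continuations.most_common(1))  # the recycled tip wins, 3 to 1
```

A predictor built on this corpus "recommends" buying links not because it is good advice, but because it is the most common advice.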
This raises three practical problems:
– Low signal‑to‑noise: much “SEO advice” online isn’t evidence‑based or current, so the model’s baseline answers reflect that.
– Generalities beat specifics: without context or constraints, you get tips that “sound right” but don’t tell you what to do, in what order, or how to measure impact.
– Incentive mismatch: advice that carries risk (e.g., link building) still circulates widely; AI won’t reliably filter for quality and accuracy unless you force it to.
It’s a good idea to treat AI outputs as a starting point, not a plan.
The fix is to constrain the system: require reputable sources, limit recommendations to those that still hold in 2026, and then pressure-test the output with at least two personas (a cynical owner and a technical SEO) before accepting anything.
With that in mind, I asked ChatGPT for five simple SEO prompts and ran a quick experiment to see what comes back “out of the box,” then how much it improves when challenged.
I added the instruction that they needed to be practical and not too technical.
I then fed each of them through ChatGPT to see what it produced.
Then for each I gave a score out of four for chances of producing results, accuracy and relevance.
I know this is subjective, but I’ve been doing SEO for almost 30 years, so have a pretty good idea of what works, what makes no difference and what has the potential to cause damage.
What I tested: feeding garbage to your advisor.
My first prompt to ChatGPT:
“Give me five simple SEO prompts that actually work. They need to be practical and not too technical.”
Garbage in, garbage out.
I wasn’t going to show you the suggestions, because they made me embarrassed to have asked such a dumb thing.
Okay. I will. But it’s embarrassing.
So I changed it:
“I am a small business owner with limited technical skills. I would like to start doing some SEO work, with a view to getting more targeted visitors from Google. Give me a list of five things I can do to improve the amount of traffic from Google.”
This time the results were considerably better.
Summarised versions:
1 – Make sure your website is Google friendly.
2 – Optimise each page around a clear topic.
3 – Create useful content that answers questions.
4 – Improve your local visibility.
5 – Build credibility with links.
Making your website Google friendly

This isn’t terrible advice.
But it’s so generic that it says almost nothing.
“Check your site on a phone” is beyond superficial. And it misses the important point of what exactly to look for.
And yes, using PageSpeed Insights is good advice, but says nothing about how to prioritise or fix anything.
Score:
Chance of producing results: 1 (unlikely to move the needle)
Accuracy: 2 (technically correct but useless)
Relevance: 0 (too vague to act on)
Optimising around a clear topic
This is what the collective genius of AI could produce?
The “how to use” section is basic but correct.
The one genuinely good tip is “focus on writing for real people”.
Score:
Chance of producing results: 1 (unlikely to move the needle)
Accuracy: 2 (technically correct but useless)
Relevance: 1 (just stating the obvious)
Creating useful content

This one is actually reasonable.
It’s basic, but makes sense.
But again, if you think this will result in significant volumes of new traffic from Google, it may be wise to adjust your expectations.
Score:
Chance of producing results: 3 (pretty good and actionable)
Accuracy: 4 (exactly right)
Improving local visibility

This amused me more than the others.
It’s not terrible.
But it reminds me of the person going to a health specialist who walks away with the advice “be healthier”.
Score:
Chance of producing results: 4 (good and actionable but…)
Accuracy: 3 (technically correct but theoretical)
Relevance: 2 (stating the obvious)
Link building
Here’s the problem with this advice:
In 2026, you can still get “good SEO advice” that will either have zero impact or be actively harmful, even catastrophic.
Score:
Chance of producing results: 0 (don’t do it)
Accuracy: 1 (in theory but move on)
The problem with this approach
There’s a catch with using an AI tool for technical advice, when your skills in these areas are limited.
How do you separate the good advice from the bad?
The person with expertise would be able to do this, but wouldn’t be asking in the first place.
And the person who doesn’t know one end of a canonical from the other, won’t recognise the generic, bad or terrible advice.
In a binary world, this leaves two options: reject SEO advice from AI, or take a deep breath and jump head-first into it.
But binary choices rule out the thousands of better alternatives that fall between the two.
The solution: constrain & challenge
This is a simple example of how to get a far better answer to your question.
It isn’t exhaustive, researched, clever or technical.
And there are probably an infinite number of ways to get better results.
Here’s the exact prompt that changed everything (copy and paste this):
“I am a small business owner with limited technical skills.
I would like to start doing some SEO work, with a view to getting more targeted visitors from Google.
Give me a list of five things I can do to improve the amount of traffic from Google.
But follow the following rules:
1 – the advice has to be recognised and confirmed as relevant in 2026, safe according to Google, with zero risk of penalty or negative action by Google.
2 – any piece of advice has to have solid proof that it actually works. I have limited time and resources, and so have no wish to waste time on tasks that have a very low chance of producing the results I want.
3 – the list of advice needs to be reasonably detailed. Remember that I am not particularly technical. Where possible, cite specific sources with direct links to the source or sources of information.
4 – when you have finished, I want you to run your suggestions through two different personas: the cynical business owner, and after that, the technical SEO.
The cynical business owner will be looking for flaws.
The technical SEO will be looking for bad advice, dangerous advice or pointless advice.
5 – after receiving feedback from both personas, create the list again taking into account their advice. But stick with all of my original requirements.”
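If you find yourself reusing this pattern for different questions, the prompt can be assembled programmatically. Here is a minimal sketch; the function name is mine and the rule and persona wording is paraphrased from the prompt above. Paste the printed result into whichever chat tool you use.

```python
# Build a "constrain and challenge" prompt from reusable parts.
# Rules and personas paraphrase the example prompt in the article;
# swap in your own question, rules and personas as needed.

RULES = [
    "The advice has to be recognised as relevant in 2026 and safe "
    "according to Google, with zero risk of penalty.",
    "Every piece of advice needs solid proof that it actually works.",
    "Keep it reasonably detailed and non-technical, citing sources "
    "with direct links.",
]

PERSONAS = [
    ("The cynical business owner", "looks for flaws and wasted effort"),
    ("The technical SEO", "looks for bad, dangerous or pointless advice"),
]

def build_prompt(question, rules=RULES, personas=PERSONAS):
    """Assemble a constrained, persona-challenged prompt as one string."""
    lines = [question, "", "But follow these rules:"]
    lines += [f"{i} - {rule}" for i, rule in enumerate(rules, start=1)]
    lines.append(
        f"{len(rules) + 1} - When finished, run your suggestions through "
        "these personas, then rewrite the list using their feedback while "
        "keeping all of my original requirements:"
    )
    lines += [f"- {name}: {role}." for name, role in personas]
    return "\n".join(lines)

prompt = build_prompt(
    "I am a small business owner with limited technical skills. "
    "Give me five things I can do to get more targeted Google traffic."
)
print(prompt)
```

The point is not automation for its own sake: keeping the rules and personas in one place means every question you ask gets the same constraints and the same pressure-testing.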
The results surprised me.
Really surprised me.
You can read the whole thing here.
The initial five steps were reasonable, with sections on what to do, why it works, evidence and practical steps.
My directions were followed, and most of the advice was reasonable.
The persona feedback was where the magic started to happen.
And when the feedback was incorporated into the list of tips, I think the ideas presented went from being vaguely useful to notably solid.
Perfect? No. But for someone with little or no experience of doing SEO for themselves, this would make a good starting point.
It might not say much about the industry, but I often compare SEO content – books, courses and blog posts – to the SEO for Dummies book.
Although much of the content is outdated, superficial and over-simplified, it does cover the basics quite well.
I believe that most non-technical business owners would be in a better position to make use of SEO if they were to read the book.
And I believe that my slightly wordy and ill thought out prompt rivals what you would gain from reading the 500+ pages of SEO for Dummies.
Bear in mind that my prompt was very non-specific about the business, products, competition and challenges.
A little customisation for your own business would produce an even more useful plan.
And generally speaking, even doing a small amount of SEO is better than doing none.
The bottom line? Generic AI prompts give you generic advice. But if you force the AI to cite sources, stick to current best practices, and throw in challenges from at least two sceptical personas, you’ll get something that may be worth your time.
Unique ideas for your business
The Demystifier puts practical ideas into your hands. You won't find them elsewhere. Original, actionable and insanely effective.