- November 15, 2025
AI is reshaping organic search at a pace most of us have never seen before. New formats, new behaviours, new layers of visibility. Every week brings another acronym, another model, another suggestion that the entire industry is about to be rewritten. It is exciting, but it is also noisy. The temptation is to dive straight in, chase every tactic and try to stay ahead of the curve by reacting to everything.
The problem is that we do not have clear, reliable data. Not yet. We have observations. We have patterns. We have emerging behaviours. What we do not have is a stable, consistent set of performance signals that show exactly how AI Search systems work across categories, query patterns and websites.
So this post is not about hype. It is not a list of quick wins or speculative tactics. It is an attempt to cut through the noise and explain what we can actually see today, why the foundations still matter, and why getting carried away with AI-first strategies could put brands at real risk.
Let us start with the biggest truth most people are avoiding.
Right now, no one has a complete dataset on how AI Search decisions are made. Not Google. Not third parties. Not agencies. Not consultants. Everyone is working with fragments. The industry is relying on anecdotal evidence, screenshots, controlled tests and repeated monitoring, not a hard numerical model that tells us how visibility is scored behind the scenes.
This matters, because the absence of clear data often creates the illusion of certainty. People begin to speak with confidence about systems they cannot measure. That is how overreactions happen. That is how advice spreads without the evidence to support it.
This is why stepping back and looking at what we can genuinely observe is more important than ever.
Even without perfect data, there are patterns that keep repeating. They are not definitive, but they are consistent enough to form a view of how early-stage AI Search systems behave.
Here is what we can see today.
Most people assume LLMs are reading their entire website in the background. They are not. The majority of your content is not in any training dataset, and even if it were, it would be months or years out of date.
Instead, they look at what is already visible in established organic rankings, surfacing information from the same pool that users see on page one.
Wayback Machine captures and similar archives can appear in citations, but they cannot be relied upon. They are often years out of date, sometimes missing critical updates, and rarely represent the current state of a website.
From repeated testing across industries, the pattern is clear. AI Overviews and similar features rarely reach beyond the top group of results on page one. Sometimes it is the top three. Sometimes the top six. The exact cut-off is unclear, but the cluster effect is consistent.
These behaviours matter, because they point to one important conclusion.
AI Search visibility is currently downstream from organic visibility.
This is the part that most brands, agencies and consultants are not thinking about.
If you restructure, rewrite or heavily experiment with your key pages and that work causes your organic visibility to dip, you do not only lose ranking positions. You risk losing visibility in AI Search entirely.
If you drop off page one, the AI systems stop seeing you. And when they stop seeing you, they stop using you as a source.
You become invisible in two systems at the same time.
This is not fear. It is common sense. If the systems use search engine results as a primary source of truth, then falling out of those results makes you disappear. No amount of clever AI tactics can make up for that.
This is why fast, aggressive site restructures right now carry far more risk than they did a year ago.
We are currently in the noisiest period the search industry has ever known. Every week brings new playbooks for GEO, SXO, AEO, LLM optimisation, AI-first content strategies and prediction models for what the future of search might look like.
Some of the advice is helpful. A lot of it is experimental. A worrying amount is built on assumptions rather than evidence.
The real issue is that the majority of these tactics ignore volatility. They assume that a brand can implement sweeping changes without the risk of losing its grounding in traditional search results.
That is not how the current environment works.
The brands that lose their stable presence on page one are the same brands that lose their influence in AI-driven results. In a landscape where AI systems are effectively amplifiers of the top performers, volatility becomes a bigger enemy than ever before.
Because we do not have clean data, the only sensible move is to track what we can see.
That means focusing on:
• frequency of AI Overview appearances
• patterns across related query fans
• stability of outputs over time
• consistency across brand queries and non-brand queries
• which competitors appear and how often
• where newer or weaker sites get inserted into the mix
Measurement gives context. It shows what is stable and what is unstable. It shows where the risks lie. It shows which areas of visibility still behave like classic organic search and which areas are drifting into new patterns.
Most importantly, measurement keeps you honest. It prevents overreaction. It stops teams from making changes based on assumptions. It encourages decisions rooted in observed behaviour, not hype.
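As a rough illustration of what "track what we can see" might look like in practice, here is a minimal sketch that summarises AI Overview appearance frequency and the most-cited domains from a simple observation log. The CSV file, its column names and the 0/1 presence flag are assumptions made for the example, not the output of any particular tool; a rank tracker export or a manually maintained spreadsheet could feed the same kind of summary.

```python
# Minimal sketch: summarising AI Overview observations over time.
# Assumes a hypothetical CSV (serp_observations.csv) with one row per query
# check and the columns: date, query, ai_overview_present (0/1),
# cited_domains (semicolon-separated). All names are illustrative.
import csv
from collections import Counter, defaultdict

appearances = defaultdict(lambda: [0, 0])   # query -> [total checks, AI Overview hits]
citations = Counter()                        # domain -> citation count

with open("serp_observations.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        checks = appearances[row["query"]]
        checks[0] += 1
        if row["ai_overview_present"] == "1":
            checks[1] += 1
            for domain in filter(None, row["cited_domains"].split(";")):
                citations[domain.strip()] += 1

print("AI Overview frequency by query:")
for query, (total, hits) in sorted(appearances.items()):
    print(f"  {query}: {hits}/{total} checks ({hits / total:.0%})")

print("\nMost-cited domains:")
for domain, count in citations.most_common(10):
    print(f"  {domain}: {count}")
```

The point is not the script itself but the habit: a consistent log, checked over time, is what separates observed behaviour from anecdote.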
This is the part many people do not want to hear, but it is the truth.
The foundations of SEO still decide whether a website is visible in AI Search.
• Crawlability
• Readability
• Technical stability
• Content depth
• Clear intent coverage
• Digital PR that builds authority and relevance
These are not old ideas. They are the bedrock that AI systems lean on to decide what information is reliable. Without these foundations, nothing else matters.
Optimising for AI Search is not a replacement for traditional SEO. It sits on top of it. If the base is weak, the layer above collapses.
Here are the actions that genuinely help, without risking unnecessary volatility.
• Protect your key pages and keep them stable
• Avoid major restructures unless the risk is fully understood
• Strengthen internal linking within your topical clusters
• Improve content clarity and reduce ambiguity
• Support your strongest pages with more consistent Digital PR
• Track AI Overview appearances, but do not chase every fluctuation
• Build depth and trust signals into your content rather than chasing shortcuts
The brands that approach AI Search calmly, analytically and with stability will be the ones that dominate over the long term.
AI Search is evolving fast, but the fundamentals of visibility have not changed. Until we have reliable data, the safest and smartest approach is to keep organic strong, stable and technically sound. That is where AI systems look for guidance. That is where authority is still built. And that is where brands either hold their position or lose it entirely.
When the landscape moves this quickly, stability is not a weakness. It is the only advantage that lasts.