The future of search, practical actions, and a reality-check on “Is SEO dead?”
Over the past year I’ve found myself opening ChatGPT far more often than Google when a question is painfully specific: how to interpret a tricky rule in a niche board game, how to update my energy-bill login, why my smart thermostat keeps dropping Wi-Fi. Five minutes of trawling forum threads has become thirty seconds of asking the model – and, more often than not, getting a concise answer and a relevant link I can trust. That shift in my own search habit is exactly why businesses need to understand where large language models are taking us.
1. Why large language models matter for search
Google’s AI Overviews and Microsoft’s Bing Generative Search are no longer experiments – AI Overviews now appear on roughly one in eight Google queries, while Bing reshapes its results into magazine-style answers.
LLMs summarise, link and cite on the results page, meaning:
- fewer clicks for purely informational queries (CTR down 18 % in Semrush’s May study)
- higher surface visibility for authoritative sources the model trusts
- intensified competition for share-of-voice in “zero-click” SERPs
Is search dead?
No. Search is evolving into answer discovery. Traffic is still there – but it flows to brands that feed the model trustworthy signals.
2. Why distrust and opportunity coexist
Concern | Evidence | Opportunity |
---|---|---|
Loss of traffic | EU publishers have filed an antitrust complaint claiming AI Overviews “scrape & summarise” their content (New York Post) | Optimise content that earns citations in AI Overviews |
Algorithm opacity | How Google weaves LLM signals into its rankings remains undisclosed | Send strong E-E-A-T signals: author bios, structured data, references (WPExperts, Writesonic) |
Content flood | Cheap AI copy is everywhere | Depth, primary data and expertise stand out in LLM training sets |

3. How LLMs are changing ranking signals
- Context over keywords – LLMs interpret topic clusters; thin single-keyword pages lose visibility.
- Source confidence – models weight publisher history, author profiles and citation count.
- Structured data – schema markup gives LLMs a clean “hook” to quote (see the sketch after this list).
- Freshness – rapidly refreshed indexes mean new content can surface within weeks (blog.google).
- Multimodal answers – images and video snippets are increasingly embedded in AI results; Bing’s roadmap shows images and video appearing in AI cards (Bing Blogs).
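
To make that “hook” concrete, here is a minimal sketch in Python that emits FAQ structured data as JSON-LD. The page, question and answer are placeholders I have invented for illustration; in practice you would drop the output into a `<script type="application/ld+json">` tag and validate it with Google’s Rich Results Test.

```python
import json

# Placeholder FAQ content; swap in the real questions and answers from the page.
faq_schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "How do I update my energy-bill login?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "Sign in, open Account Settings, choose Login Details "
                        "and follow the reset steps.",
            },
        },
    ],
}

# Print the JSON-LD, ready to embed in a <script type="application/ld+json"> tag.
print(json.dumps(faq_schema, indent=2))
```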
4. Five actions to stay ahead now
Action | Why it helps |
---|---|
Map content clusters – group pages by intent and interlink | Gives the model clear topical authority |
Expand structured data – FAQ, How-To, Product schema | Increases chance of citation in AI snippets |
Strengthen author bios – real names, credentials, LinkedIn | Feeds E-E-A-T signals the model can verify |
Publish proprietary data – surveys, case studies, first-party numbers | LLMs favour unique facts they can’t get elsewhere |
Monitor zero-click terms – track impressions vs clicks in GSC (see the sketch after this table) | Spot where AI answers are stealing CTR, then add richer media or interactive tools on the page |

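As a starting point for that monitoring, the sketch below pulls query-level impressions and clicks from the Search Console API in Python. The property URL, key file, date range and thresholds are all assumptions to adapt; the only logic is flagging terms with plenty of impressions but almost no clicks, which is where AI answers are most likely soaking up the traffic.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Assumed placeholders: a verified Search Console property and a service-account
# key that has been granted access to it.
SITE_URL = "https://www.example.com/"
KEY_FILE = "service-account.json"

creds = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=["https://www.googleapis.com/auth/webmasters.readonly"]
)
service = build("searchconsole", "v1", credentials=creds)

# Pull query-level performance for one month (dates are placeholders).
response = service.searchanalytics().query(
    siteUrl=SITE_URL,
    body={
        "startDate": "2025-05-01",
        "endDate": "2025-05-31",
        "dimensions": ["query"],
        "rowLimit": 1000,
    },
).execute()

# Flag likely zero-click terms: high impressions, next to no clicks.
for row in response.get("rows", []):
    impressions, clicks = row["impressions"], row["clicks"]
    if impressions >= 200 and clicks / impressions < 0.01:
        print(f"{row['keys'][0]}: {impressions} impressions, {clicks} clicks")
```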
5. Looking ahead (with caveats)
- Volatile SERPs: Expect frequent format tweaks while Google refines AI Overviews.
- Attribution blur: AI answers can list multiple sources; brand recognition matters more than raw clicks.
- Regulatory pressure: Micropayments and citation requirements may shift again – plan flexible KPIs.
Industry thought leaders argue that brands must “optimise for the model, not the query” by building signals the LLM can confidently surface.
6. Quick-start checklist
Task | ✅ |
---|---|
Refresh info pages that now trigger AI Overviews with expert depth | |
Add/validate Article, FAQ & Author schema | |
Build an internal topic-graph hub (see the sketch after this checklist) | |
Publish first-party studies or survey data | |
Monitor impression-to-click ratios monthly | |
Produce original imagery/video for future multimodal cards | |

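For the topic-graph item, one rough way to find the gaps is to treat pages and internal links as a graph. The sketch below uses networkx with invented URLs and topic labels; in practice you would load a crawl export and look for pages that nothing else in their cluster links to.

```python
import networkx as nx

# Invented pages tagged by topic, plus the internal links between them;
# replace with a crawl export of the real site.
pages = {
    "/guides/heat-pumps": "heating",
    "/guides/smart-thermostats": "heating",
    "/blog/thermostat-wifi-fixes": "heating",
    "/guides/solar-panels": "solar",
}
internal_links = [
    ("/guides/heat-pumps", "/guides/smart-thermostats"),
    ("/guides/smart-thermostats", "/blog/thermostat-wifi-fixes"),
]

graph = nx.DiGraph()
graph.add_nodes_from(pages)
graph.add_edges_from(internal_links)

# For each topic cluster, list pages that no other page in the cluster links to:
# these are the interlinking gaps to close when building the hub.
for topic in sorted(set(pages.values())):
    cluster = {page for page, t in pages.items() if t == topic}
    orphans = [
        page for page in cluster
        if not any(src in cluster for src, _ in graph.in_edges(page))
    ]
    print(f"{topic}: orphan pages -> {orphans or 'none'}")
```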
7. Final thoughts
Although the way we search is changing – for better in its speed and relevance, for worse in its potential to hide how the sausage is made – one thing is certain: the answers we receive are becoming more refined. I have felt the shift first-hand. The same ChatGPT window that helps me decode an obscure board-game rule now surfaces the exact clause buried in my energy supplier’s terms, complete with a cited link. That efficiency is addictive and it keeps improving at a pace that is hard to match.
LLMs are not killing search. They are upgrading the surface where answers appear. For brands, the playing field now favours clarity, authority and structured knowledge. Put in the hard yards: publish real expertise, wrap it in clean schema, and keep it fresh. Do that and you will ride the wave rather than drown beneath it.
And if steering through the AI surf feels daunting, that’s why I set up Digital Evergreen: contract-free, plain-spoken help to keep your SEO strategy pointing in the right direction. Ready to chat? Drop me a line.