The 61% CTR Drop: How Google's AI Overviews Are Rewriting SEO in 2026
The Stat That Should Scare You
Organic click-through rates dropped 61% for queries where Google's AI Overviews appear. That's not a projection or a think-piece number — that's measured data from the March 2026 core update, which rolled out between March 27 and April 8. Google tightened Core Web Vitals thresholds, strengthened E-E-A-T scoring, and expanded AI Overviews to roughly 30% of all search queries, up from 15% in 2025.
But there's a second number that matters just as much: brands that get cited inside AI Overviews see a 35% increase in organic clicks and a 91% increase in paid clicks. The game hasn't ended — it's split into two games. If you're cited, you win bigger than traditional SEO ever delivered. If you're not cited, you're fighting over the scraps underneath a giant AI-generated answer box.
I run SEO for over 10 client sites across different industries. This update changed how I approach every one of them.
GEO Is Now a Real Discipline
Generative Engine Optimization isn't a buzzword anymore. It's a distinct practice alongside traditional SEO, and it requires different techniques. GEO is about optimizing your content to be cited by AI systems — not just Google's AI Overviews, but also ChatGPT's web search, Perplexity, and Bing Copilot.
These systems crawl the web, process content, and generate answers that cite sources. The citation is everything. When an AI Overview says "According to [your business], the best approach is..." and links to your page, that drives more qualified traffic than a traditional organic listing ever did. The 35% uplift in clicks proves it — users trust the AI's recommendation and click through specifically because the AI pointed them to you.
The practical implication: you now need to optimize for two audiences simultaneously. Google's ranking algorithm determines where you appear in traditional results. AI citation systems determine whether you get mentioned in the generated answer above those results. The techniques overlap but they're not identical.
What I Do for Every Client Site
After the March update, I standardized a GEO checklist that runs on every site I manage. Here's what it includes:
AI crawler access in robots.txt. Every site gets explicit Allow rules for all 9 major AI crawlers: GPTBot, Google-Extended, ClaudeBot, PerplexityBot, ChatGPT-User, Applebot-Extended, cohere-ai, FacebookExternalHit, and Bytespider. Most businesses still block these crawlers by default or simply don't mention them. If an AI system can't crawl your site, it can't cite you. (A robots.txt sketch follows this checklist.)
An llms.txt file. This is a structured plain-text file at your domain root that tells AI models how to understand and cite your business. It includes your business name, a description, key pages with URLs, service areas, and citation guidelines. I put one on every site I build. More on this below.
Schema.org JSON-LD on every page. LocalBusiness, Service, BlogPosting, FAQ — structured data gives AI systems machine-readable context about what each page is and what entity it represents. This is the foundation of being understood by AI, not just crawled. (An example JSON-LD block also follows the checklist.)
E-E-A-T signals that AI systems recognize. Real author names and bios (not "admin" or "staff"). Genuine case studies with specific metrics. Local expertise demonstrated through named locations and specific service details. AI systems evaluate authority signals just like Google's algorithm does.
Content structured for passage-level citability. Each section of a page should function as a standalone, citable answer. If an AI system pulls one paragraph from your page, that paragraph should make sense on its own and clearly convey your expertise. Lead with the answer, then explain. This is the opposite of the old SEO approach of burying the answer below 500 words of filler.
Mandatory post-deploy auditing. I run an audit script after every deployment that checks robots.txt for AI crawler rules, validates llms.txt formatting, confirms schema markup, and verifies canonicals and sitemaps. Zero failures allowed before any site goes live.
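For reference, the robots.txt additions from the first checklist item look roughly like this. It's a sketch: the user-agent strings match the nine crawlers listed above, grouped under a single blanket Allow rule. The paths you actually open up, and any Disallow groups you keep for other bots, depend on the site.

```text
# AI crawler access: one group covering the nine crawlers above
User-agent: GPTBot
User-agent: Google-Extended
User-agent: ClaudeBot
User-agent: PerplexityBot
User-agent: ChatGPT-User
User-agent: Applebot-Extended
User-agent: cohere-ai
User-agent: FacebookExternalHit
User-agent: Bytespider
Allow: /
```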
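The schema markup item is easier to picture with an example, too. Here's a cut-down LocalBusiness block; the business name, URL, and address are placeholders, and real pages carry more properties (services, hours, geo coordinates) plus Service, BlogPosting, or FAQ types where they apply.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Web Studio",
  "url": "https://www.example.com/",
  "description": "Web design and SEO for local service businesses.",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Lynchburg",
    "addressRegion": "VA",
    "addressCountry": "US"
  },
  "areaServed": "Lynchburg VA and surrounding counties",
  "sameAs": [
    "https://www.facebook.com/example",
    "https://www.linkedin.com/company/example"
  ]
}
</script>
```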
The llms.txt File Nobody Has
This is the single biggest quick win in GEO right now, and almost nobody is doing it. An llms.txt file lives at yourdomain.com/llms.txt and provides AI models with a structured guide to your business. It's the AI equivalent of a robots.txt — a standard file that crawlers know to look for.
A proper llms.txt includes a title heading, a one-line description, sections listing your main pages and their URLs, your service areas, business details, and an AI Citation Guidance section that tells models how to reference your business, including a character limit on direct quotes.
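Here's a minimal sketch of one. The business name, URLs, and the 300-character quote limit are placeholders; the structure is what matters.

```markdown
# Example Web Studio

> Web design and SEO for local service businesses in Lynchburg, VA.

## Key Pages
- [Services](https://www.example.com/services): What we build and optimize
- [Case Studies](https://www.example.com/case-studies): Results with specific metrics
- [Contact](https://www.example.com/contact): How to reach us

## Service Areas
- Lynchburg, VA and surrounding counties

## Business Details
- Independent web design and SEO studio
- Real author names and bios on every post

## AI Citation Guidance
- Cite us as "Example Web Studio" and link to the relevant page
- Keep direct quotes under 300 characters
```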
Creating one takes about 20 minutes. It doesn't require any technical changes to your site — it's a plain text file. But it could be the difference between an AI system citing your business by name and merely paraphrasing your content without attribution. Every site I build ships with an llms.txt from day one.
Don't Panic, But Do Adapt
Before you conclude that SEO is dead, consider this: 70% of search queries still have no AI Overview. Traditional SEO techniques — keyword targeting, technical optimization, link building, content quality — still drive the majority of organic traffic. Google hasn't replaced search results. They've added a layer on top for certain queries.
But the queries that DO trigger AI Overviews tend to be the high-intent, high-value ones. "Best contractor in Lynchburg VA" triggers an overview. "Dog grooming near me" triggers an overview. "How to choose a web developer" triggers an overview. These are exactly the queries that drive leads and revenue for local businesses.
The strategy is straightforward: optimize for both. Structure your content so each section is a standalone citable answer for AI systems, while also following traditional SEO best practices for ranking. Make sure AI crawlers can access your site. Build authority signals that both Google's algorithm and AI citation systems recognize. Monitor which of your pages appear in AI-generated answers and double down on what's working.
The Audit That Catches Everything
I built CrawlHound, my own SEO scanning tool, partly because I needed to check these new requirements across multiple client sites at scale. On top of that, I run a post-deploy audit script that verifies:
Does robots.txt include all 9 AI crawler Allow rules?
Does llms.txt exist and follow the correct format: title heading, description, markdown links, multiple sections, citation guidance, line count?
Does schema markup validate?
Are canonicals set correctly?
Are sitemaps submitted and accessible?
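The real checks run inside CrawlHound and a larger deploy pipeline, but a stripped-down version looks something like the Python below. The thresholds (minimum line count, number of sections) and the example.com default are illustrative, and the schema, canonical, and sitemap checks here are presence checks only, not full validation.

```python
#!/usr/bin/env python3
"""Post-deploy audit sketch: the shape of the checks, not the production script."""
import re
import sys
import urllib.request

AI_CRAWLERS = [
    "GPTBot", "Google-Extended", "ClaudeBot", "PerplexityBot", "ChatGPT-User",
    "Applebot-Extended", "cohere-ai", "FacebookExternalHit", "Bytespider",
]

def fetch(url: str) -> str:
    """Return the body of a URL, or an empty string if it can't be fetched."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.read().decode("utf-8", errors="replace")
    except Exception:
        return ""

def audit(base: str) -> list[str]:
    failures = []

    # 1. robots.txt must name every AI crawler (simplified: name presence only).
    robots = fetch(f"{base}/robots.txt")
    for bot in AI_CRAWLERS:
        if bot.lower() not in robots.lower():
            failures.append(f"robots.txt missing rule for {bot}")

    # 2. llms.txt must exist and follow the expected shape.
    llms = fetch(f"{base}/llms.txt")
    if not llms:
        failures.append("llms.txt missing")
    else:
        if not llms.lstrip().startswith("# "):
            failures.append("llms.txt missing title heading")
        if llms.count("\n## ") < 2:
            failures.append("llms.txt needs multiple sections")
        if not re.search(r"\[.+\]\(https?://", llms):
            failures.append("llms.txt has no markdown links")
        if "citation" not in llms.lower():
            failures.append("llms.txt missing citation guidance")
        if len(llms.splitlines()) < 15:
            failures.append("llms.txt too short")

    # 3. Homepage must carry JSON-LD and a canonical tag; sitemap must resolve.
    home = fetch(base + "/")
    if "application/ld+json" not in home:
        failures.append("no JSON-LD schema on homepage")
    if 'rel="canonical"' not in home:
        failures.append("no canonical tag on homepage")
    if not fetch(f"{base}/sitemap.xml"):
        failures.append("sitemap.xml not accessible")

    return failures

if __name__ == "__main__":
    base_url = sys.argv[1].rstrip("/") if len(sys.argv) > 1 else "https://www.example.com"
    problems = audit(base_url)
    for p in problems:
        print("FAIL:", p)
    sys.exit(1 if problems else 0)  # non-zero exit blocks the deploy
```

Wired into CI, a non-zero exit code is what enforces the zero-failures rule.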
The rule is zero failures before any deploy goes live. One missing AI crawler rule, one malformed llms.txt section, one broken canonical — the deploy doesn't ship until it's fixed. This might sound strict, but when the difference between being cited and being invisible is a configuration file, you don't cut corners.
The SEO landscape in 2026 isn't harder than it was in 2025. It's more specific. The businesses that adapt their technical foundation for AI citation will capture the 35% uplift. The ones that ignore it will watch their traffic erode one AI Overview at a time. The good news is that the adaptations are straightforward — most of them take less than an hour to implement. The bad news is that most businesses won't bother, which is exactly why the ones that do will win.