Query fan-out and grounding queries
AI engines don't search for the user's exact question. They decompose it into 3-7 sub-queries and search each one. Understanding this is the biggest mental-model shift between traditional SEO and GEO.
What query fan-out actually is
When a user asks ChatGPT "what's the best CRM for a small B2B SaaS team?", ChatGPT does not search for that exact phrase. It silently decomposes the question into multiple sub-queries and searches each one separately. This is called query fan-out, and it is the mechanism the rest of this lesson builds on.
For that example query, ChatGPT might fan out into something like:
- "best CRM small business 2026"
- "CRM B2B SaaS comparison"
- "affordable CRM team under 50 employees"
- "HubSpot vs Pipedrive vs Salesforce small business"
- "CRM features B2B SaaS sales pipeline"
Then it retrieves the top results for each sub-query, synthesizes across them, and produces an answer that cites the pages that appeared most consistently across the highest-relevance sub-queries. Bing exposes these sub-queries explicitly in its AI Performance dashboard — they're called grounding queries. Other engines do the same thing internally but don't show their work.
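To make the mechanic concrete, here is a minimal sketch in Python. The sub-queries are the ones above; the retrieval results and page names are hypothetical stand-ins, not any engine's actual pipeline, which would generate sub-queries with an LLM and retrieve from a live index.

```python
from collections import Counter

# Hypothetical fan-out for one user question. A real engine generates
# these sub-queries with an LLM; this list is hand-written.
sub_queries = [
    "best CRM small business 2026",
    "CRM B2B SaaS comparison",
    "affordable CRM team under 50 employees",
    "HubSpot vs Pipedrive vs Salesforce small business",
    "CRM features B2B SaaS sales pipeline",
]

# Stand-in for live retrieval: top results per sub-query.
top_results = {
    "best CRM small business 2026": ["pillar-page", "thin-page-a"],
    "CRM B2B SaaS comparison": ["pillar-page", "competitor-blog"],
    "affordable CRM team under 50 employees": ["thin-page-b"],
    "HubSpot vs Pipedrive vs Salesforce small business": ["pillar-page"],
    "CRM features B2B SaaS sales pipeline": ["pillar-page", "thin-page-c"],
}

# Pages surfacing across the most sub-queries are the likeliest
# citation candidates at the synthesis step.
appearances = Counter(
    page for q in sub_queries for page in top_results.get(q, [])
)
for page, count in appearances.most_common():
    print(f"{page}: retrieved for {count}/{len(sub_queries)} sub-queries")
```

In this toy data the comprehensive pillar page lands in four of the five retrieval pools while each thin page lands in one. That asymmetry is the whole strategic argument of the next section.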
Why this completely changes content strategy
The traditional SEO playbook says: pick one target keyword, optimize one page for it, win that ranking. GEO turns this inside out. Because the AI engine is searching for 3-7 sub-queries per user question, your page only needs to win one or two of them to get cited. The pages that get cited most are the ones that show up across multiple sub-queries simultaneously.
This means a comprehensive page that covers the topic from five angles will be cited more often than five separate thin pages that each cover one angle. The thin-page-per-keyword model that worked in SEO actively underperforms in GEO.
The Princeton GEO research framework
Princeton's GEO research (KDD 2024) tested specific content interventions against AI citation rates across multiple engines. The headline findings every operator should know:
- Adding statistics: +22% citation lift
- Adding direct quotations: +37% citation lift
- Adding inline source citations: +30-40% citation lift
- Question-formatted H2 headings with 120-180 word answer blocks: ~40% citation lift
- Fluency optimization (clear prose, varied sentence length): +15-20% citation lift
These effects compound: a page that applies all five typically sees a 2-3x increase in citation rate versus its unoptimized baseline. The research was the empirical foundation for the BLUF format and the citation content formula taught in Lesson 2.1.
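To see why stacking five lifts lands in the low single-digit multiples, here is the back-of-envelope arithmetic. Using midpoints for the ranged figures is my assumption, and multiplying lifts treats them as independent, which they are not, so the product is an upper bound.

```python
# Midpoint lift per intervention, from the list above. Ranged
# figures (+30-40%, ~40%, +15-20%) are reduced to midpoints.
lifts = {
    "statistics": 0.22,
    "direct quotations": 0.37,
    "inline source citations": 0.35,
    "question H2s + answer blocks": 0.40,
    "fluency optimization": 0.175,
}

# Upper bound: multiply as if the lifts were independent.
multiplier = 1.0
for name, lift in lifts.items():
    multiplier *= 1 + lift
    print(f"after {name}: {multiplier:.2f}x baseline")

# Ends near 3.7x. The interventions overlap (a quoted statistic
# with an inline source hits three of them at once), so observed
# gains settle lower, in the 2-3x range.
```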
Reading Bing's grounding queries
Bing's AI Performance dashboard (set up in Lesson 4.4) shows the actual grounding queries the AI used to retrieve content that ended up in answers. This is the only first-party view of fan-out behavior available anywhere. Three patterns to look for, with a triage sketch after the list:
1. Sub-queries you appear for but didn't write content targeting
This is the gold. If you're getting cited for the grounding query "remote SaaS team workflow tools" but you never wrote a page about that topic, your existing content is being retrieved for it incidentally. Build a dedicated page on that topic and you'll likely dominate it.
2. Sub-queries where you appear but get cited rarely
These are pages that the AI engine considers relevant enough to retrieve but not authoritative enough to cite. Three usual causes: thin content, missing schema, or stale dateModified. Refresh and add structure.
3. Sub-queries where you never appear
These are pure content gaps. Either your topical coverage doesn't extend that direction, or the AI engine doesn't yet associate your domain with that sub-topic. Build new pages.
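Once you pull these rows out of the dashboard, the triage can be mechanical. A minimal sketch, assuming a simplified row shape: the field names here are my invention, not Bing's actual export schema, and the `targeted` flag has to come from your own content inventory (do you have a page written for this sub-query?).

```python
# Hypothetical grounding-query rows; Bing's real export schema differs.
rows = [
    {"grounding_query": "remote SaaS team workflow tools",
     "retrieved": 12, "cited": 5, "targeted": False},
    {"grounding_query": "CRM B2B SaaS comparison",
     "retrieved": 9, "cited": 0, "targeted": True},
    {"grounding_query": "CRM migration checklist",
     "retrieved": 0, "cited": 0, "targeted": False},
]

def triage(row):
    # Pattern 1: cited without a page targeting the sub-query.
    if row["cited"] > 0 and not row["targeted"]:
        return "pattern 1: incidental win, build a dedicated page"
    # Pattern 2: retrieved but rarely or never cited.
    if row["retrieved"] > 0 and row["cited"] / row["retrieved"] < 0.1:
        return "pattern 2: retrieved but not cited, refresh and add structure"
    # Pattern 3: never retrieved at all.
    if row["retrieved"] == 0:
        return "pattern 3: content gap, build new pages"
    return "healthy"

for row in rows:
    print(f'{row["grounding_query"]}: {triage(row)}')
```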
Writing for fan-out
Once you internalize fan-out, content writing changes in specific ways:
- Cover the topic from multiple angles on one page. A "best CRM for small B2B SaaS" page should cover features, pricing, integration, team size considerations, comparison against alternatives, and migration. Each angle is a sub-query the AI might fan out to.
- Use varied phrasings of the core concept. Sub-queries use different language than user queries. "CRM" appears in some, "customer relationship management software" in others, "sales pipeline tool" in yet others. Include all naturally throughout your page.
- Add related-question sections explicitly. A "People also ask" style section toward the bottom of pillar pages catches sub-queries you'd otherwise miss.
- Build topic clusters, not isolated pages. A 12-page cluster on CRM that links internally creates a topical authority signal that helps every page in the cluster get retrieved across more sub-queries.
The fan-out content audit
A 90-minute audit you can run for any pillar topic, with a small tracking sketch after the steps:
- Pick one pillar topic you want to dominate (e.g. "AI search visibility for SaaS").
- List 15-25 likely sub-queries a user question would fan out to. Write them as you imagine they'd appear in Bing's grounding query report.
- For each sub-query, search it in ChatGPT, Perplexity, and Google AI Overviews. Note which pages get cited.
- Build a coverage matrix. Sub-queries where you appear: green. Sub-queries where you appear but don't get cited: yellow. Sub-queries where you don't appear at all: red.
- Address the red rows first. Either fold the missing angles into existing pages, or build new ones.
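If you'd rather keep the matrix in code than in a spreadsheet, a sketch like this works; the sub-queries and observations below are placeholder data, not real audit results.

```python
# One tuple per sub-query from step 2: (query, appears, cited),
# where appears/cited are what you observed in step 3.
audit = [
    ("AI search visibility metrics SaaS", True, True),
    ("how to get cited by ChatGPT", True, False),
    ("GEO vs SEO for SaaS", False, False),
    # ...15-25 tuples in a real audit
]

def status(appears, cited):
    if appears and cited:
        return "green"
    if appears:
        return "yellow"   # retrieved but not cited
    return "red"          # pure content gap

matrix = [(query, status(appears, cited)) for query, appears, cited in audit]
order = {"red": 0, "yellow": 1, "green": 2}
for query, color in sorted(matrix, key=lambda row: order[row[1]]):
    print(f"[{color:>6}] {query}")
# Red rows sort to the top: address those first.
```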
Implementation: this week
- Day 1. Pick your single most important pillar topic: the one that, if you dominated AI citations for it, would meaningfully change your business.
- Day 2. Run the fan-out audit — 90 minutes. Produce the red/yellow/green matrix.
- Days 3-5. Build a new page, or expand an existing one, to cover the three highest-priority red sub-queries. Use the BLUF format, the claim + statistic + source pattern, and question-formatted H2s.
- Day 7. Re-run the AI citation test for those sub-queries. Some will already show movement; others take 2-6 weeks.
What comes next
Lesson 3.4 covers Wikipedia as the highest-leverage off-page signal. Wikipedia is uniquely powerful for fan-out because AI engines treat it as a trusted seed source — being mentioned on the right Wikipedia article puts you into the citation pool for dozens of sub-queries simultaneously.