Introduction

Position one does not protect you when AI Overviews appear. Independent studies report a large click loss on some queries when an AI panel takes the top of the page. You need a measurement plan that shows where you lose clicks, where you gain visibility inside answers, and how those changes affect pipeline. In this guide you learn a practical framework for AI SEO analytics. You set up tracking for assistant referrals in GA4, you measure citations and mentions, you monitor AI Overviews inclusion by keyword, and you connect the dots to revenue with simple models. This matters because leaders now ask for proof. You must show how visibility in assistants turns into branded search lift, engaged sessions, and opportunities. You also need an EU first setup that respects consent and still gives you useful data. You get templates, regex packs, and schema examples you can use today.

What is AI SEO analytics

AI SEO analytics measures how AI search experiences change your visibility and traffic. It covers four areas.

  1. Visibility inside AI answers and overviews
  2. Assistant referral traffic and quality
  3. Citations and mentions share of voice
  4. Business impact from those exposures

You use it to answer three questions. Are we included in answers for our topics? Do we get visits from assistants, and how do those visits perform? Does exposure raise branded demand and revenue?

You will link to the related hubs as you read. When you want the deep dive on overviews, see AI Overviews Analytics. When you want packaging tactics for citations, see Answer Engine Optimization. For referral setup, see AI Search Traffic Analytics. For compliance, see EU AI SEO Compliance.

The AISO AI SEO analytics framework

The framework has six parts. Each part ships with a clear outcome and a way to measure progress.

  1. Channel taxonomy in GA4 for AI referrals
  2. LLM watchlist for citations and mentions share of voice
  3. AI Overviews coverage monitor by keyword and topic
  4. Attribution model that joins exposure to pipeline
  5. Entity and schema upgrade for source readiness
  6. EU first data governance with Consent Mode v2

You can implement the whole set in ninety days. You can also phase it by impact and dependency.

Track assistant referrals in GA4

You want a clean view of sessions and conversions from assistants like ChatGPT, Gemini, Perplexity and Copilot. Create a custom channel group in GA4 and use a shared regex bundle for source and medium.

Source and medium patterns

Add a rule set like this in your channel group. Adapt to your naming plan.

Source contains chat.openai OR gpt OR oai
Source contains perplexity.ai
Source contains gemini.google OR gemini
Source contains copilot.microsoft
Source contains bing.com AND contains copilot
Source contains duckduckgo.ai OR ddg-ai
Source contains poe.com
Medium matches_regex ^(referral|social|share|assistant)$
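
If you maintain the same rules outside GA4, for example to label rows in an export, you can mirror them in a small script. This is a minimal sketch with illustrative patterns; the function name and the exact pattern list are assumptions, so adapt them to your own regex pack.

import re

# Illustrative patterns that mirror the channel group rules above. Adapt to your plan.
AI_SOURCE_PATTERNS = [
    r"chat\.openai|chatgpt\.com|oai",
    r"perplexity\.ai",
    r"gemini\.google|gemini",
    r"copilot\.microsoft|bing\.com.*copilot",
    r"duckduckgo\.ai|ddg-ai",
    r"poe\.com",
]
AI_MEDIUM = re.compile(r"^(referral|social|share|assistant)$")

def classify_channel(source: str, medium: str) -> str:
    """Label a source and medium pair as the AI channel or not."""
    source_hit = any(re.search(p, source, re.IGNORECASE) for p in AI_SOURCE_PATTERNS)
    medium_hit = bool(AI_MEDIUM.match(medium.lower()))
    return "AI Assistants" if source_hit and medium_hit else "Other"

# classify_channel("chatgpt.com", "referral") -> "AI Assistants"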

If your team shares links from assistants, use UTM standards so GA4 groups the traffic. Keep it short and human readable.

utm_source=chatgpt  utm_medium=assistant  utm_campaign=topic-keyword
utm_source=perplexity  utm_medium=assistant  utm_campaign=topic-keyword
utm_source=gemini  utm_medium=assistant  utm_campaign=topic-keyword
utm_source=copilot  utm_medium=assistant  utm_campaign=topic-keyword
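
A small helper keeps the tags consistent when people build share links by hand. A minimal sketch under the naming plan above; the function name and defaults are assumptions.

from urllib.parse import urlencode

def utm_link(url: str, source: str, campaign: str, medium: str = "assistant") -> str:
    """Append the UTM parameters from the naming plan above to a landing page URL."""
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    separator = "&" if "?" in url else "?"
    return f"{url}{separator}{params}"

# utm_link("https://example.com/guide", "chatgpt", "ai-seo-analytics")
# -> "https://example.com/guide?utm_source=chatgpt&utm_medium=assistant&utm_campaign=ai-seo-analytics"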

Build a Looker Studio view that tracks sessions, engaged sessions, conversion rate, and assisted conversions for the AI channel. Segment by assistant. Watch the mix by country and device. You will link this dashboard from AI Search Traffic Analytics.

Measure citations, mentions and share of voice

A citation is a linked source in an answer. A mention is an unlinked brand reference. You want both. Assistants change answers often, so you need a repeatable method.

Prompt battery and logging

Create a list of target prompts. Cover your products, core jobs to be done, and comparison prompts. Run them on a fixed weekly schedule across ChatGPT, Gemini, Perplexity, and Google with AI Overviews on. Log three things for each run.

  1. Did our brand appear?
  2. Was it linked or unlinked?
  3. Which page did the answer pull from?

Store the log in a sheet or a small database. Compute three scores.

  1. Citation rate across prompts
  2. Mention rate across prompts
  3. Page level inclusion count

This gives you a simple share of voice metric you can trend over time. You will link the method from AI Citation Tracking. For background on the value of citations and mentions, see the Conductor overview at https://www.conductor.com/academy/ai-search/.
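
If the log lives in a CSV, the three scores reduce to a few lines of aggregation. A minimal sketch that assumes yes or no values in appeared and linked columns and a cited_page column; rename the fields to match your own log.

import csv

def score_battery(path: str) -> dict:
    """Compute citation rate, mention rate, and page level inclusion from a weekly log."""
    with open(path, newline="", encoding="utf-8") as handle:
        rows = list(csv.DictReader(handle))
    total = len(rows) or 1  # avoid division by zero on an empty log
    citations = sum(1 for r in rows if r["appeared"] == "yes" and r["linked"] == "yes")
    mentions = sum(1 for r in rows if r["appeared"] == "yes" and r["linked"] == "no")
    pages = {}
    for r in rows:
        page = (r.get("cited_page") or "").strip()
        if page:
            pages[page] = pages.get(page, 0) + 1
    return {
        "citation_rate": citations / total,
        "mention_rate": mentions / total,
        "page_inclusion": pages,
    }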

Monitor AI Overviews and AI mode coverage

Use a rank tracking list that mirrors the prompt battery. For each keyword, capture one of three states: no overview, overview present without your brand, or overview present with your brand. Track weekly. You now see where you lose clicks and where you must push for inclusion.
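
A tiny data model keeps the weekly states consistent between trackers and dashboards. A sketch only; the class names and labels are assumptions that mirror the three states above.

from dataclasses import dataclass
from enum import Enum

class OverviewState(Enum):
    NO_OVERVIEW = "no_overview"
    WITHOUT_BRAND = "overview_without_brand"
    WITH_BRAND = "overview_with_brand"

@dataclass
class CoverageRow:
    keyword: str
    week: str                # ISO week label, for example "2025-W14"
    state: OverviewState
    cited_url: str = ""      # filled only when the state is WITH_BRAND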

Win back plan

When you see an overview without your brand, do three things.

  1. Add a direct answer at the top of the target page
  2. Add FAQ, HowTo and Speakable schema with tight language
  3. Link supporting sources and show examples that prove expertise

Google and Bing want a page they can quote with confidence. Give them a clean answer, strong structure and clear signals. For evidence on click loss, see the Ahrefs study at https://ahrefs.com/blog/ai-overviews-reduce-clicks/. For ongoing prevalence data, see the Semrush study at https://www.semrush.com/blog/semrush-ai-overviews-study/. Deep tactics live in AI Overviews Analytics.

Structure content for answers and citations

You want pages that answer the question in one screen and also support depth. Use this pattern.

  1. Write a one paragraph BLUF that answers the query
  2. Add a numbered list of steps or a short table
  3. Add a worked example with data or screenshots
  4. Add a concise FAQ block on that page when it helps
  5. Support with internal links to deep pages and sources

Use schema to make the structure clear. Focus on FAQPage, HowTo and Speakable. Keep the text on the page identical to the structured data. For a clear view of answer engine optimization, read the Ahrefs guide at https://ahrefs.com/blog/answer-engine-optimization/. For markup guidance, see the Google docs at https://developers.google.com/search/docs/appearance/structured-data. A deeper build lives in Answer Engine Optimization.
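
Keeping the structured data in sync with the visible copy is easier when both render from the same source of record. A minimal sketch that emits FAQPage JSON-LD from a Python dict; the question and answer strings are placeholders and must match the on page text exactly.

import json

# Placeholder question and answer. The strings must match the visible page copy.
faq_items = [
    {
        "question": "What is AI SEO analytics?",
        "answer": "AI SEO analytics measures how AI search experiences change your visibility and traffic.",
    },
]

faq_schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": item["question"],
            "acceptedAnswer": {"@type": "Answer", "text": item["answer"]},
        }
        for item in faq_items
    ],
}

# Render the result into a script tag of type application/ld+json on the page.
print(json.dumps(faq_schema, indent=2))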

Build your entity graph

Assistants reward clear entities. Map your products, problems, audiences, and locations. Use a simple table that lists the entity, the preferred label, known aliases, and the page that defines it. Link entities together with short definition pages. Use internal links to show relationships. This improves citation readiness and reduces ambiguity in answers.
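
The register can live in a sheet, and a plain structure in your repo works as well. A sketch with hypothetical rows and placeholder URLs; swap in your own entities and defining pages.

# Entity register: one row per entity, linked to the page that defines it.
# The rows and URLs below are placeholders.
entities = [
    {
        "entity": "AI Overviews",
        "preferred_label": "AI Overviews",
        "aliases": ["AIO", "Google AI Overviews"],
        "defining_page": "/ai-overviews-analytics/",
    },
    {
        "entity": "Answer engine optimization",
        "preferred_label": "Answer Engine Optimization",
        "aliases": ["AEO"],
        "defining_page": "/answer-engine-optimization/",
    },
]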

Detect AI crawlers and clean your analytics

Server logs reveal bot traffic that pollutes GA4. Create a list of known agents. Tag them in your logs. Filter them from reports. You can keep a list in code and update it monthly.

# simple example pattern hints
ai_agents = [
  "ChatGPT-User",     # OpenAI, fetches pages for live ChatGPT browsing
  "GPTBot",           # OpenAI, training crawler
  "Google-Extended",  # Google, AI training control token
  "CCBot",            # Common Crawl
  "PerplexityBot",    # Perplexity
  "Claude-Web",       # Anthropic
  "anthropic-ai",     # Anthropic
  "FacebookBot"       # Meta
]

Push logs to BigQuery. Use a scheduled query that flags these agents and removes them from session tables. Watch for spikes with zero engagement and investigate. The deep guide will live in AI Crawler Analytics.
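
Before the BigQuery step, you can sanity check a raw access log against the same list. A rough sketch that does simple substring matching on each line; the function name and the log format assumptions are illustrative.

def flag_ai_hits(log_path: str, agents: list) -> list:
    """Return raw log lines whose user agent field contains a known AI agent string."""
    flagged = []
    with open(log_path, encoding="utf-8", errors="ignore") as handle:
        for line in handle:
            if any(agent.lower() in line.lower() for agent in agents):
                flagged.append(line.rstrip())
    return flagged

# hits = flag_ai_hits("access.log", ai_agents)
# print(len(hits), "lines from known AI agents")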

Attribution that ties AI exposure to revenue

You want a simple plan that your team can maintain. Use three layers.

  1. Branded search lift after exposure
  2. Assisted conversions in GA4 for the AI channel
  3. CRM stitching by landing page and content theme

Branded lift

Track weekly branded clicks in Search Console for topics with rising inclusion in answers. Compare the trend to a control set with no change. If you see a step change that aligns with new citations, record that as exposure value. Do not overfit. Use a short window and move on.
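
The lift check can stay deliberately simple. A sketch that compares the average of the most recent weeks against the preceding baseline for both the exposed set and the control set; the window lengths and any threshold you apply are assumptions to tune.

def weekly_lift(clicks: list, baseline_weeks: int = 4, window_weeks: int = 4) -> float:
    """Relative change of the most recent window against the preceding baseline weeks."""
    baseline = clicks[-(baseline_weeks + window_weeks):-window_weeks]
    window = clicks[-window_weeks:]
    base_avg = sum(baseline) / len(baseline)
    if base_avg == 0:
        return 0.0
    return (sum(window) / len(window) - base_avg) / base_avg

def exposure_lift(exposed_clicks: list, control_clicks: list) -> float:
    """Lift of the exposed topic set net of the control set."""
    return weekly_lift(exposed_clicks) - weekly_lift(control_clicks)

# Assumes at least baseline_weeks + window_weeks data points per series.
# A positive result that coincides with new citations is worth recording.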

Assisted conversions

In GA4, build a report that shows conversions with the AI channel in the path. Use lookback windows that make sense for your sales cycle. Pair this with session quality metrics like engaged sessions per user and time to conversion.

CRM stitching

Tag content by theme. Send the theme to your CRM on form submit. Create a simple model that attributes influence to themes that gained AI exposure in the same period. Document rules and keep them stable. For a strong frame on the problem of attribution, see Search Engine Land at https://searchengineland.com/seo-attribution-why-its-broken-what-you-can-do-456776. A deeper view lives in SEO Attribution for AI.
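
The join itself is simple once the theme travels with the lead. A minimal sketch that assumes a CRM export with content_theme and created_date columns plus a set of themes that gained AI exposure in the same window; the field names are assumptions.

import csv
from datetime import date

def influenced_opportunities(crm_csv: str, exposed_themes: set,
                             start: date, end: date) -> list:
    """Opportunities created in the window whose content theme gained AI exposure."""
    hits = []
    with open(crm_csv, newline="", encoding="utf-8") as handle:
        for row in csv.DictReader(handle):
            created = date.fromisoformat(row["created_date"])
            if start <= created <= end and row["content_theme"] in exposed_themes:
                hits.append(row)
    return hits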

EU first measurement and governance

If you operate in the EU or sell to EU users, respect consent and still get useful signals.

  1. Enable Consent Mode v2 and test in a private window
  2. Use region based settings in your tag platform
  3. Reduce data collection to what you need
  4. Focus on aggregated reports and first party storage
  5. Document your assistant tracking logic and share it with legal

This protects your brand and keeps your analytics stable. More details live in EU AI SEO Compliance.

Dashboard specification you can copy

A single Looker Studio view keeps your team aligned. Build these pages.

  1. AI channel overview with sessions, engaged sessions, conversions, assisted conversions
  2. Assistant split by source with trend and country
  3. AI Overviews coverage by keyword with brand inclusion state
  4. Citations and mentions share of voice by prompt and by assistant
  5. Landing page performance for pages that appear in answers
  6. Branded search lift view with control comparison
  7. CRM influence view by content theme

Keep one time period control and one annotation track for key releases.

Worked examples you can reuse

B2B software

A security vendor wants to appear in assistant answers for the query how to choose a password manager for a team. The team writes a BLUF page with a short checklist and a table that compares criteria. They add FAQ and Speakable. They publish a named case study and link it. They join the prompt to the weekly battery. Two weeks later they see mentions in ChatGPT and a citation in Perplexity. Branded clicks rise in Search Console for three weeks. Sales reports an uptick in demos from content tagged with the same theme.

Ecommerce

A sports retailer targets what size running shoe should I buy. They add a size chart and a three step fit process at the top of the guide. They add HowTo schema. Perplexity starts linking to the guide. The AI channel in GA4 shows high engaged sessions and a strong add to cart rate.

Publisher

A travel site targets best time to visit Lisbon for food. They add a BLUF answer, a month by month table, and a local tips section. The page earns a citation in AI Overviews. The team tracks hotel partner clicks from that page. Revenue aligns with the inclusion window.

You can build more examples for your niche in AI SEO Case Studies.

Implementation roadmap in ninety days

Days 1 to 30

  1. Create GA4 custom channel group for AI referrals and publish your regex pack
  2. Build the Looker Studio overview page and share it
  3. Define the prompt battery and start weekly logging
  4. Tag pages by theme and set up CRM fields
  5. Add BLUF sections and schema to five priority pages

Days 31 to 60

  1. Add AI Overviews coverage tracking
  2. Expand schema and entity pages for your core topics
  3. Push server logs to BigQuery and flag bots
  4. Ship country splits and device splits in dashboards
  5. Document EU consent flows and test with legal

Days 61 to 90

  1. Review trends and select three win back targets
  2. Run content upgrades and request expert quotes
  3. Publish a short study page that reports your share of voice trend
  4. Align sales and content on a quarterly test plan
  5. Lock your definitions and keep them stable for the next quarter

Tools and references

Use vendor neutral tools where you can and mix them with your stack.

  1. GA4 for events and channels
  2. Google Search Console for branded lift
  3. Looker Studio for dashboards
  4. BigQuery for bot filtering and joins
  5. Ahrefs or Semrush for keyword lists and overview detection
  6. A simple notebook or script for the prompt battery logging
  7. Your CMS for schema and internal links

External reading that helps you brief your team.

  1. The Ahrefs answer engine guide at https://ahrefs.com/blog/answer-engine-optimization/
  2. The Ahrefs click loss study at https://ahrefs.com/blog/ai-overviews-reduce-clicks/
  3. The Semrush prevalence study at https://www.semrush.com/blog/semrush-ai-overviews-study/
  4. The Search Engine Land attribution article at https://searchengineland.com/seo-attribution-why-its-broken-what-you-can-do-456776

How AISO Hub can help

You can move faster with a partner that ships the assets with you. We keep this practical. We do the heavy setup and leave you with a process you can run.

  1. AISO Audit: a one time audit of AI visibility, analytics, schema, and content structure. You get a plan with fixes and a dashboard base.
  2. AISO Foundation: setup of GA4 channel taxonomy, Looker Studio, prompt battery logging, and consent mode. You get templates and training.
  3. AISO Optimize: content and entity upgrades for your priority topics with schema and internal links. We target inclusion in answers with clear pages.
  4. AISO Monitor: ongoing tracking of overviews, citations, and assistant referrals with alerts and monthly reports. You get stable metrics and clear next steps.

Talk to us when you want a quote and a plan that fits your team size.

Conclusion

AI search changes how people find your brand. You need to know where you appear in answers, how often assistants send you visitors, and whether that exposure grows revenue. You have a framework that covers referrals, citations, coverage, attribution, entities, and governance. You have a dashboard spec, a prompt battery, and a ninety day plan. Start with the GA4 channel group and weekly logging. Add schema and BLUF to your top pages. Track coverage and pick three win back targets. Share trends with your team. Keep definitions stable for a quarter. This work protects clicks, finds new demand, and proves impact in a way your leaders trust.