Module 087 Intermediate 14 min read

Google Search Console Mastery

Setup, URL prefix vs Domain property, the Performance report from Pages to Search Appearance, indexing and URL Inspection, sitemaps, Core Web Vitals, manual actions, AIO/AI Mode reporting, and Search Console Insights.

By SEO Mastery Editorial

Google Search Console is the only first-party data source you have on how Google sees your site. Every other rank tracker, crawl tool, and AI visibility platform is a model of reality; GSC is reality. If you are not living inside it, you are guessing.

TL;DR

  • Use a Domain property, not URL prefix, unless you have a deliberate reason to scope. Domain properties unify www, non-www, http, https, and every subdomain into one report.
  • The Performance report’s eight dimensions are not interchangeable. Pages, Queries, Countries, Devices, Search Appearance, Dates, Filters, and Search Type each unlock a different diagnostic. Learn what each is for or you will misread your data.
  • GSC’s AI Mode and AI Overviews data lives in Search Appearance as of the September 2025 update. Filter by AI Overviews and AI Mode to see the impressions and clicks your content earns inside generative results.

The mental model

Search Console is like a flight recorder on your website. It does not fly the plane and it does not predict turbulence; it captures, with millisecond precision, what actually happened the last time you took off. When something goes wrong, you do not call a pilot — you read the recorder.

The flight-recorder analogy matters because it shapes how you should use the tool. You do not optimize inside GSC. You optimize outside it, then read GSC to confirm the change moved the dial. Every report is a measurement of the past, capped at 16 months, with sampling and privacy thresholds that quietly hide low-volume data. Treat it as evidence, not as a steering wheel.

The other half of the mental model: GSC is the single source of truth that sits between you and Google’s index. When URL Inspection says a page is indexed, it is indexed. When the Indexing report says Crawled - currently not indexed, no third-party tool’s opinion overrides that. Other tools model the index. GSC is the index, from your site’s perspective.

Deep dive: the 2026 reality

Search Console in 2026 looks different from the 2022 version. Five things have changed and you need to know all five.

1. Property types: Domain vs URL prefix. A Domain property verified via DNS TXT covers every subdomain and every protocol. A URL prefix property covers exactly the scheme + host + path you specify. URL prefix is useful when you need to scope reporting to a specific subdirectory (/blog/, /de/) or when DNS verification is blocked. For everything else, Domain is the right answer.

2. The Performance report’s dimensions.

  • Queries: diagnose intent mismatches and discover accidental rankings.
  • Pages: spot demoted URLs, prioritize updates, find decay.
  • Countries: international splits and hreflang sanity checks.
  • Devices: mobile vs desktop CTR gaps, AMP residual data.
  • Search Appearance: filter to AI Overviews, AI Mode, Videos, Reviews, FAQ, Discussion forum, Merchant listing.
  • Dates: YoY comparisons, anomaly detection.
  • Search Type: Web, Image, Video, News — never aggregate these.
  • Filters: negative-match queries (e.g., exclude branded).

3. AI Overviews and AI Mode data. Since the September 2025 GSC update, Search Appearance includes AI Overviews (the generative summary inside classic Google) and AI Mode (the dedicated AI Mode tab launched May 2025). Both report impressions and clicks but with their own counting rules: an impression for AI Overviews is logged when your page is cited in the source carousel; an impression for AI Mode is logged when your page is cited in any answer card the user expanded. Click data on both is sparse because most AI answers do not require a click.

4. The Indexing report’s six states. Submitted and indexed, Crawled - currently not indexed, Discovered - currently not indexed, Duplicate without user-selected canonical, Page with redirect, and Excluded by 'noindex' tag. The dangerous ones for production sites are the first two non-indexed states. Crawled - currently not indexed means Google fetched the page and decided it was not worth keeping. Discovered - currently not indexed means Google saw the URL in your sitemap or via internal links but has not yet allocated crawl to it. The fix is different in each case.
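The fix-is-different point is worth encoding. A triage sketch: the state names mirror the Indexing report, and the actions paraphrase the fixes discussed here plus standard remediations for the remaining states:

```python
# Maps Indexing report states to a recommended first action.
TRIAGE = {
    "Crawled - currently not indexed":
        "Quality problem: improve or consolidate the content, or noindex it deliberately.",
    "Discovered - currently not indexed":
        "Crawl-allocation problem: strengthen internal links and prune thin pages.",
    "Duplicate without user-selected canonical":
        "Set an explicit rel=canonical so Google stops choosing for you.",
    "Page with redirect":
        "Usually fine; verify the redirect target is the indexed canonical.",
    "Excluded by 'noindex' tag":
        "Intentional? If not, remove the noindex directive.",
}

def triage(state: str) -> str:
    return TRIAGE.get(state, "Indexed or unknown state: no action.")
```

Running your exported Indexing report rows through a table like this turns a wall of states into a prioritized work queue.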

5. Page Experience and Core Web Vitals. The Page Experience report was retired in late 2023, but Core Web Vitals (LCP, INP, CLS) remain a separate report and still feed ranking signals. INP replaced FID in March 2024 as the responsiveness metric. The 75th percentile threshold is what you optimize against, not the median.
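Because the report judges the 75th percentile, a URL group can have a healthy median and still fail. A sketch of that check against Google's published "good" cutoffs (LCP ≤ 2500 ms, INP ≤ 200 ms, CLS ≤ 0.1), using synthetic field samples:

```python
import statistics

THRESHOLDS = {"LCP": 2500, "INP": 200, "CLS": 0.1}  # LCP/INP in ms

def p75(samples):
    # quantiles with n=4 returns [p25, p50, p75]; inclusive method
    # interpolates within the observed range
    return statistics.quantiles(samples, n=4, method="inclusive")[2]

def passes(metric: str, samples) -> bool:
    return p75(samples) <= THRESHOLDS[metric]

inp_samples = [100, 150, 200, 250]        # illustrative field data, in ms
print(p75(inp_samples))                   # 212.5
print(passes("INP", inp_samples))         # False
print(statistics.median(inp_samples))     # 175.0 -- median alone looks fine
```

The last two lines are the whole argument: a 175 ms median INP still fails, because the slow tail is what the 75th percentile (and the report) sees.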

A few constraints to keep in mind:

  • 16-month data window. Anything older is gone. Export to BigQuery via the Bulk Data Export if you need history beyond that.
  • Anonymized queries. Queries with very low volume are bucketed as (other). Expect 30 to 60 percent of total impressions to live in this bucket on long-tail-heavy sites.
  • Click and impression delays. Today’s data is incomplete for ~48 hours. Do not call wins on day-one data.
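You can measure your own (other) bucket: run one Search Analytics request with no dimensions for the true total, one grouped by query, and diff them. A sketch of the arithmetic on rows shaped like the API's response:

```python
def anonymized_share(total_impressions: int, query_rows: list[dict]) -> float:
    """Fraction of impressions hidden in the anonymized (other) bucket.

    total_impressions: from a request with no dimensions (the true total).
    query_rows: rows from the same date range grouped by query.
    """
    visible = sum(r["impressions"] for r in query_rows)
    return (total_impressions - visible) / total_impressions

rows = [{"impressions": 400}, {"impressions": 200}]  # illustrative rows
print(anonymized_share(1000, rows))  # 0.4 -> 40% of impressions are anonymized
```

If that share sits above the 30 to 60 percent range described above, treat query-level conclusions on that property with extra suspicion.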

Visualizing it

flowchart TD
  A[Verify Domain property via DNS TXT] --> B[Submit XML sitemap]
  B --> C[Google discovers and crawls URLs]
  C --> D{Indexing decision}
  D -->|Indexed| E[Eligible for SERP]
  D -->|Crawled not indexed| F[Quality signal too low]
  D -->|Discovered not indexed| G[Crawl budget shortage]
  E --> H[Performance report logs impressions and clicks]
  H --> I[Filter by Search Appearance to isolate AI Overviews / AI Mode]
  F --> J[Improve content or noindex]
  G --> K[Strengthen internal links and reduce thin pages]

Bad vs. expert

The bad approach

Most teams verify a single URL prefix property for https://www.example.com/, then read the Performance report aggregated, with no filters, on the default 3-month range. They watch total clicks tick up or down and call it a report.

Property:  https://www.example.com/   (URL prefix)
View:      Performance > Search results
Filters:   none
Range:     Last 3 months

This misses everything that actually moves a business. You cannot see https://example.com/ traffic if a deploy accidentally drops www. You cannot see if your Reviews rich result lost eligibility. You cannot tell whether AI Mode is sending qualified clicks. You see one number going up or down, attribute it to “Google update” or “we shipped good content,” and move on.

The expert approach

Verify a Domain property via DNS, then build a saved set of filters that match the questions you actually ask of the data.

# DNS TXT record for Domain property verification
host: "@"
type: "TXT"
value: "google-site-verification=AbC123_yourTokenHere"
ttl: 300

Then use the Search Analytics API (or Looker Studio’s GSC connector) to extract data weekly into a long-term store. The API returns up to 25,000 rows per request and pages beyond that via startRow, dwarfs the UI’s 1,000-row export, and respects all filters.

# Pull last 7 days of query-level data for pages surfaced in AI Overviews
from googleapiclient.discovery import build

# creds: OAuth or service-account credentials carrying the
# webmasters.readonly scope (setup omitted here)
service = build("searchconsole", "v1", credentials=creds)

request = {
    "startDate": "2026-04-30",
    "endDate": "2026-05-06",
    # searchAppearance cannot be grouped with other dimensions,
    # so filter by it instead to keep the query/page/device breakdown
    "dimensions": ["query", "page", "device"],
    "dimensionFilterGroups": [{
        "filters": [{
            "dimension": "searchAppearance",
            "operator": "equals",
            # run a searchAppearance-only query first to confirm the exact
            # appearance tokens your property reports
            "expression": "AI_OVERVIEW",
        }]
    }],
    "rowLimit": 25000,
    "dataState": "final",
}

response = service.searchanalytics().query(
    siteUrl="sc-domain:example.com",
    body=request
).execute()

This works because (a) the Domain property aggregates every host, (b) dataState: final excludes the unstable last 48 hours, and (c) the searchAppearance breakdown isolates AI Overviews and AI Mode per query and page, so you can see exactly which content is being cited inside generative answers.
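One row-limit caveat: each request caps at 25,000 rows, so large properties need pagination via the API's startRow parameter. A sketch with the fetch function injected (assume it wraps service.searchanalytics().query(...).execute() as above) so the loop stays testable:

```python
def all_rows(fetch, base_request: dict, page_size: int = 25000):
    """Yield every row for a request, paging with startRow.

    fetch: callable taking a request body and returning the API's
    response dict (an assumed wrapper around the client call).
    """
    start = 0
    while True:
        req = {**base_request, "rowLimit": page_size, "startRow": start}
        rows = fetch(req).get("rows", [])
        yield from rows
        if len(rows) < page_size:  # short page means we've drained the data
            break
        start += page_size
```

A page shorter than page_size signals the last page, so the loop never issues a wasted empty request on exact multiples only when data runs out mid-page; the trade-off is one extra request when the total is an exact multiple of page_size.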

Do this today

  1. Open Google Search Console > Add property > Domain and add your root domain. Copy the TXT record value and paste it into your DNS provider as a TXT record on @ with TTL 300. Click Verify. If verification fails, wait ten minutes and retry — DNS propagation lags.
  2. In Settings > Users and permissions, add a backup owner. If your only verified owner leaves, you lose ownership rights.
  3. In Sitemaps, submit https://yourdomain.com/sitemap.xml. Watch the Status column flip to Success within 24 hours.
  4. Go to Indexing > Pages. Sort Why pages aren't indexed by count. Investigate the top three reasons. For Crawled - currently not indexed, sample 20 URLs and click URL Inspection to see what Google saw last; if those pages are thin, rewrite or noindex.
  5. Open Performance > Search results > Search Appearance and filter by AI Overviews and AI Mode separately. Note your top 20 cited pages — these are your generative-search winners.
  6. In Experience > Core Web Vitals, click Open report for both Mobile and Desktop. Any URL group flagged Poor is a priority. Cross-reference with PageSpeed Insights field data using CrUX.
  7. Connect GSC to Looker Studio via the official Search Console connector. Build one dashboard with two pages, one per data source. Site Impression and URL Impression are not interchangeable: Site rolls up by query, URL rolls up by page.
  8. In Settings > Bulk data export, link a BigQuery project. Daily exports are uncapped and overcome the 16-month UI limit. The schema includes searchAppearance so AI Overviews and AI Mode are queryable in SQL.
  9. Bookmark the URL Inspection tool and use it any time you push a critical page. Click Test live URL to confirm rendered HTML, then Request Indexing. Do not abuse this — Google rate-limits aggressive requests.
  10. Set up email notifications under Settings > Preferences > Email. Manual actions and Core Web Vitals regressions are silent killers if you are not paged.

