Google Search Console API: From API Calls to Actionable SEO Intelligence
The Google Search Console API gives you programmatic access to the same data you see in the GSC dashboard — clicks, impressions, CTR, average position, and indexing status. For developers, this means you can build automated monitoring, regression detection, and content intelligence instead of manually checking a dashboard.
But the API has real limitations that Google's documentation glosses over. This guide covers what the API actually gives you, how to set it up with TypeScript, and how to turn raw data into decisions — including when the API isn't enough and what to use instead.
What the API Gives You#
The GSC API exposes two services:
Search Analytics API#
Query performance data for your site across Google Search. Every row represents a combination of dimensions (query, page, country, device, date) with metrics (clicks, impressions, CTR, position).
interface SearchAnalyticsRow {
keys: string[]; // dimension values
clicks: number; // total clicks
impressions: number; // total impressions
ctr: number; // click-through rate (0.0–1.0)
position: number; // average position (1 = top)
}
What you can query:
- Performance by search query ("what keywords bring traffic?")
- Performance by page ("which pages are performing?")
- Performance by country, device, search type
- Date ranges up to 16 months of historical data
- Combined dimensions (query + page, page + country, etc.)
What you can't query:
- Core Web Vitals data (separate API)
- Coverage/indexing report data (UI only)
- Sitemap status (UI only)
- Manual actions (UI only)
- Links report (UI only)
URL Inspection API#
Inspect individual URLs to check their indexing status, crawl details, and mobile usability.
interface URLInspectionResult {
inspectionResult: {
indexStatusResult: {
verdict: 'PASS' | 'PARTIAL' | 'FAIL' | 'NEUTRAL';
coverageState: string; // "Submitted and indexed", etc.
robotsTxtState: 'ALLOWED' | 'DISALLOWED';
indexingState: 'INDEXING_ALLOWED' | 'BLOCKED_BY_META_TAG' | 'BLOCKED_BY_HTTP_HEADER';
lastCrawlTime: string;
pageFetchState: 'SUCCESSFUL' | 'SOFT_404' | 'BLOCKED_ROBOTS_TXT';
googleCanonical: string;
userCanonical: string;
};
mobileUsabilityResult: {
verdict: 'PASS' | 'FAIL';
issues: Array<{ issueType: string; severity: string; message: string }>;
};
};
}
URL Inspection is the only API that tells you whether a specific page is indexed and why. Use it for automated monitoring of critical pages. If a page's verdict changes from PASS to FAIL, you know something broke — and the response tells you what.
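That verdict comparison is pure logic once you persist each day's results. A minimal sketch, assuming you store yesterday's verdicts in a map keyed by URL (the inspection calls themselves are covered later in this guide):

```typescript
type Verdict = 'PASS' | 'PARTIAL' | 'FAIL' | 'NEUTRAL';

// Compare yesterday's verdicts against today's and surface regressions.
function verdictRegressions(
  previous: Map<string, Verdict>,
  current: Map<string, Verdict>
): Array<{ url: string; from: Verdict; to: Verdict }> {
  const regressions: Array<{ url: string; from: Verdict; to: Verdict }> = [];
  for (const [url, verdict] of current) {
    const before = previous.get(url);
    // Alert only when a previously passing URL stops passing
    if (before === 'PASS' && verdict !== 'PASS') {
      regressions.push({ url, from: before, to: verdict });
    }
  }
  return regressions;
}
```

Feed the output straight into whatever alerting channel you already use; a nonempty array is the alert condition.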
API Limitations You Need to Know#
Before building anything, understand the constraints:
API Limits at a Glance

| | Search Analytics API | URL Inspection API |
|---|---|---|
| Requests per day | 500 | 2,000 |
| Rows per request | up to 25,000 | 1 URL |
| Data freshness | 2-3 day lag | current status |
| History | 16 months | n/a |
The 2-3 day data lag is the biggest limitation. If you deploy a site change and want to see its impact on search performance, you're waiting at least 48 hours. There's no way around this — Google processes search data in batches.
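Because of the lag, every query window should be anchored a few days in the past rather than at today. A small helper makes this explicit (the 3-day figure is an assumption; adjust it to what you observe for your property):

```typescript
// Assumed lag between a search event and its appearance in the API
const DATA_LAG_DAYS = 3;

// Compute the freshest queryable window of `days` length, as YYYY-MM-DD strings
function latestWindow(days: number, today: Date = new Date()): { startDate: string; endDate: string } {
  const end = new Date(today);
  end.setDate(end.getDate() - DATA_LAG_DAYS);
  const start = new Date(end);
  start.setDate(start.getDate() - (days - 1));
  const fmt = (d: Date) => d.toISOString().split('T')[0];
  return { startDate: fmt(start), endDate: fmt(end) };
}
```

The use cases later in this guide compute their windows inline; a shared helper like this keeps the lag assumption in one place.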
500 requests per day sounds limiting, but each request returns up to 25,000 rows. If you're strategic about your queries, 500 is plenty for most sites. Sites with 100,000+ pages may need to paginate across days.
URL Inspection at 2,000 per day means you can't inspect every page on a large site daily. Prioritize: monitor your top 100 pages by traffic and any recently published content.
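One way to implement that prioritization is a queue builder that puts recently published pages first, then fills the remaining quota with top pages by clicks. A sketch, where `PageInfo` and the 14-day "recent" cutoff are assumptions you would adapt to your own data:

```typescript
interface PageInfo {
  url: string;
  clicks: number;       // clicks over some recent window, e.g. from Search Analytics
  publishedAt: Date;
}

// Build an inspection queue that fits within the daily quota
function inspectionQueue(pages: PageInfo[], quota: number, now: Date = new Date()): string[] {
  const RECENT_DAYS = 14; // assumption: "recently published" means the last two weeks
  const msPerDay = 86_400_000;
  const isRecent = (p: PageInfo) => (now.getTime() - p.publishedAt.getTime()) / msPerDay <= RECENT_DAYS;

  const recent = pages.filter(isRecent);
  const rest = pages.filter(p => !isRecent(p)).sort((a, b) => b.clicks - a.clicks);

  // Recent pages first (indexing problems hurt them most), then by traffic
  return [...recent, ...rest].slice(0, quota).map(p => p.url);
}
```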
OAuth2 Setup with TypeScript#
The GSC API requires OAuth2 authentication. Here's the complete setup:
Setting Up GSC API Access

Before the code below will run, you need credentials from Google Cloud: create a project in the Google Cloud Console, enable the Search Console API for it, create an OAuth 2.0 client ID (web application type), and register your redirect URI. The resulting client ID and secret go into the environment variables used below.
The Auth Code#
// lib/gsc-client.ts
import { google } from 'googleapis';
const oauth2Client = new google.auth.OAuth2(
process.env.GOOGLE_CLIENT_ID,
process.env.GOOGLE_CLIENT_SECRET,
process.env.GOOGLE_REDIRECT_URI // e.g., http://localhost:3000/callback
);
// Step 1: Generate the auth URL (send user here)
export function getAuthUrl(): string {
return oauth2Client.generateAuthUrl({
access_type: 'offline', // gets a refresh token
scope: ['https://www.googleapis.com/auth/webmasters.readonly'],
prompt: 'consent' // force consent to always get refresh token
});
}
// Step 2: Exchange the auth code for tokens (in your callback handler)
export async function handleCallback(code: string) {
const { tokens } = await oauth2Client.getToken(code);
oauth2Client.setCredentials(tokens);
// Store tokens.refresh_token securely (database, encrypted env var, etc.)
// You'll use this to authenticate future requests without user interaction
return tokens;
}
// Step 3: Create an authenticated client for API calls
export function getSearchConsole(refreshToken: string) {
oauth2Client.setCredentials({ refresh_token: refreshToken });
return google.searchconsole({ version: 'v1', auth: oauth2Client });
}
The refresh token provides long-lived access to the user's GSC data. Store it encrypted in your database, not in plain text environment variables. If compromised, revoke it immediately at myaccount.google.com/permissions.
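For the at-rest encryption, Node's built-in crypto module is enough. A minimal sketch using AES-256-GCM; in practice the 32-byte key would come from a secrets manager, never from source code:

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from 'node:crypto';

// Encrypt a refresh token for storage. Packs iv + auth tag + ciphertext
// into one base64 string so decryption needs only the key.
function encryptToken(token: string, key: Buffer): string {
  const iv = randomBytes(12);
  const cipher = createCipheriv('aes-256-gcm', key, iv);
  const ciphertext = Buffer.concat([cipher.update(token, 'utf8'), cipher.final()]);
  return Buffer.concat([iv, cipher.getAuthTag(), ciphertext]).toString('base64');
}

// Reverse of encryptToken: unpack iv (12 bytes), auth tag (16 bytes), ciphertext
function decryptToken(payload: string, key: Buffer): string {
  const buf = Buffer.from(payload, 'base64');
  const iv = buf.subarray(0, 12);
  const tag = buf.subarray(12, 28);
  const ciphertext = buf.subarray(28);
  const decipher = createDecipheriv('aes-256-gcm', key, iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(ciphertext), decipher.final()]).toString('utf8');
}
```

GCM is authenticated, so a tampered or wrong-key payload throws on decryption instead of silently returning garbage.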
Practical Use Cases#
1. Weekly Performance Report#
Pull a summary of your site's search performance for the past 7 days vs. the previous 7 days:
import { getSearchConsole } from './lib/gsc-client';
async function weeklyReport(siteUrl: string, refreshToken: string) {
const searchconsole = getSearchConsole(refreshToken);
// Account for the 3-day data lag
const endDate = new Date();
endDate.setDate(endDate.getDate() - 3);
const startDate = new Date(endDate);
startDate.setDate(startDate.getDate() - 7);
const prevEndDate = new Date(startDate);
prevEndDate.setDate(prevEndDate.getDate() - 1);
const prevStartDate = new Date(prevEndDate);
prevStartDate.setDate(prevStartDate.getDate() - 7);
const [current, previous] = await Promise.all([
searchconsole.searchanalytics.query({
siteUrl,
requestBody: {
startDate: formatDate(startDate),
endDate: formatDate(endDate),
dimensions: ['page'],
rowLimit: 100,
type: 'web'
}
}),
searchconsole.searchanalytics.query({
siteUrl,
requestBody: {
startDate: formatDate(prevStartDate),
endDate: formatDate(prevEndDate),
dimensions: ['page'],
rowLimit: 100,
type: 'web'
}
})
]);
// Compare and identify changes
const currentPages = new Map(
current.data.rows?.map(r => [r.keys![0], r]) ?? []
);
const previousPages = new Map(
previous.data.rows?.map(r => [r.keys![0], r]) ?? []
);
const regressions = [];
for (const [page, row] of currentPages) {
const prev = previousPages.get(page);
if (prev && row.clicks! < prev.clicks! * 0.8) {
regressions.push({
page,
currentClicks: row.clicks,
previousClicks: prev.clicks,
dropPercent: Math.round((1 - row.clicks! / prev.clicks!) * 100)
});
}
}
return { regressions, totalPages: currentPages.size };
}
function formatDate(d: Date): string {
return d.toISOString().split('T')[0];
}
2. Ranking Regression Detection#
Detect when important pages drop in rankings — the search equivalent of an error rate spike:
async function detectRankingDrops(
siteUrl: string,
refreshToken: string,
threshold: number = 3 // positions
) {
const searchconsole = getSearchConsole(refreshToken);
// Compare last 7 days to previous 7 days, by query + page
const [recent, baseline] = await Promise.all([
queryPeriod(searchconsole, siteUrl, 3, 10), // days 3-10 ago
queryPeriod(searchconsole, siteUrl, 10, 17) // days 10-17 ago
]);
const drops = [];
const recentMap = new Map(
recent.data.rows?.map(r => [`${r.keys![0]}|${r.keys![1]}`, r]) ?? []
);
for (const row of baseline.data.rows ?? []) {
const key = `${row.keys![0]}|${row.keys![1]}`;
const current = recentMap.get(key);
if (current && current.position! - row.position! > threshold) {
drops.push({
query: row.keys![0],
page: row.keys![1],
oldPosition: Math.round(row.position! * 10) / 10,
newPosition: Math.round(current.position! * 10) / 10,
drop: Math.round((current.position! - row.position!) * 10) / 10,
impressions: current.impressions
});
}
}
// Sort by impressions (highest impact first)
return drops.sort((a, b) => b.impressions! - a.impressions!);
}
async function queryPeriod(
client: any,
siteUrl: string,
daysAgoStart: number,
daysAgoEnd: number
) {
const end = new Date();
end.setDate(end.getDate() - daysAgoStart);
const start = new Date();
start.setDate(start.getDate() - daysAgoEnd);
return client.searchanalytics.query({
siteUrl,
requestBody: {
startDate: formatDate(start),
endDate: formatDate(end),
dimensions: ['query', 'page'],
rowLimit: 5000,
type: 'web'
}
});
}
3. Content Opportunity Finder#
Identify keywords where you're ranking at the bottom of page 1 or on page 2 (positions 8-20) — the closest opportunities to drive new traffic:
async function findContentOpportunities(
siteUrl: string,
refreshToken: string
) {
const searchconsole = getSearchConsole(refreshToken);
const response = await searchconsole.searchanalytics.query({
siteUrl,
requestBody: {
startDate: formatDate(daysAgo(30)),
endDate: formatDate(daysAgo(3)),
dimensions: ['query', 'page'],
rowLimit: 10000,
dimensionFilterGroups: [{
filters: [{
dimension: 'query',
operator: 'notContains',
expression: 'brand-name' // exclude branded queries
}]
}],
type: 'web'
}
});
const opportunities = (response.data.rows ?? [])
.filter(row =>
row.position! >= 8 && // page 1 bottom or page 2
row.position! <= 20 &&
row.impressions! >= 100 // enough volume to matter
)
.map(row => ({
query: row.keys![0],
page: row.keys![1],
position: Math.round(row.position! * 10) / 10,
impressions: row.impressions,
clicks: row.clicks,
potentialClicks: Math.round(row.impressions! * 0.05) // conservative est.
}))
.sort((a, b) => b.impressions! - a.impressions!);
return opportunities.slice(0, 20);
}
function daysAgo(n: number): Date {
const d = new Date();
d.setDate(d.getDate() - n);
return d;
}
4. Indexing Status Monitor#
Check the indexing status of your most important pages:
async function monitorIndexing(
siteUrl: string,
urls: string[],
refreshToken: string
) {
const searchconsole = getSearchConsole(refreshToken);
const results = [];
// URL Inspection is 1 URL per request, rate limited at 2,000/day
for (const url of urls) {
try {
const result = await searchconsole.urlInspection.index.inspect({
requestBody: {
inspectionUrl: url,
siteUrl
}
});
const status = result.data.inspectionResult?.indexStatusResult;
results.push({
url,
indexed: status?.verdict === 'PASS',
coverageState: status?.coverageState,
lastCrawl: status?.lastCrawlTime,
canonical: status?.googleCanonical
});
} catch (error) {
results.push({ url, error: (error as Error).message });
}
// Respect rate limits — small delay between requests
await new Promise(resolve => setTimeout(resolve, 200));
}
return results;
}
From Raw Data to Actionable Intelligence#
Raw API data tells you what happened. Intelligence tells you what to do about it.
The gap between data and action is where most developers get stuck. You can pull 25,000 rows of search analytics, but turning that into "update the meta description on /blog/nextjs-tutorial because CTR is 40% below average for its position range" requires interpretation logic.
This is where the API-only approach starts to break down. You need:
- Benchmarks — Is a 2.1% CTR good or bad for position 4.5? (It's below average — position 4-5 typically sees 5-7% CTR.)
- Historical context — Is this week's performance normal or a regression?
- Cross-referencing — A page losing clicks AND dropping in position is different from a page losing clicks at the same position (seasonal vs. ranking issue).
- Site-specific patterns — Blog posts behave differently from product pages. Generic benchmarks can mislead.
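The benchmark check, at least, is straightforward to encode. A sketch of the interpretation step; the CTR values here are rough assumptions for illustration (the 6% figure for positions up to 5 matches the example above, but real curves vary by site and query type):

```typescript
// Assumed position → expected-CTR benchmarks, coarsest-possible version
const CTR_BENCHMARKS: Array<{ maxPosition: number; expectedCtr: number }> = [
  { maxPosition: 1, expectedCtr: 0.28 },
  { maxPosition: 3, expectedCtr: 0.12 },
  { maxPosition: 5, expectedCtr: 0.06 },
  { maxPosition: 10, expectedCtr: 0.02 },
];

// Flag a page whose CTR is more than 40% below the benchmark for its position
function underperformsCtr(position: number, ctr: number): boolean {
  const bench = CTR_BENCHMARKS.find(b => position <= b.maxPosition);
  if (!bench) return false; // past page 1, CTR is too noisy to flag
  return ctr < bench.expectedCtr * 0.6;
}
```

Run this over the rows from the Search Analytics queries above and the pages it flags are your meta-title and meta-description candidates.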
Rampify's MCP server handles the OAuth flow, rate limiting, data normalization, and interpretation layer. Instead of building a custom GSC client, you ask your AI "what should I write next based on search data?" and get an answer with specific page paths and recommendations. See the MCP setup guide for details.
Building a GSC Monitoring Pipeline#
If you want to build your own monitoring on the raw API, here's the architecture:
// cron job — runs daily at 6 AM
// 1. Pull Search Analytics for the past 7 days
// 2. Compare against previous 7 days
// 3. Detect regressions (position drops, click drops)
// 4. Check indexing status of top 100 pages
// 5. Alert on issues (Slack, email, etc.)
// Cost: ~110 API calls per day
// - 2 Search Analytics queries (current + previous period)
// - 100 URL Inspection calls (top pages)
// - 8 queries for dimensional breakdowns (device, country, etc.)
This works, but it requires maintaining the cron job, the OAuth tokens, the comparison logic, and the alerting pipeline. For a team building an SEO tool, this is core product work. For a developer who wants search visibility into their project, it's overhead.
The alternative is to use a service that handles the pipeline and exposes the results through an interface you already use — like your AI coding tool.
What the API Doesn't Cover#
Several important GSC features have no API equivalent:
- Page Indexing report — The dashboard view of indexed vs. excluded URLs. No API.
- Core Web Vitals — Available through the Chrome UX Report API (separate service), not the GSC API.
- Manual actions — Must be checked in the UI.
- Sitemaps report — Submission status, error counts — UI only.
- Links report — Internal and external links — UI only.
- Rich results status — Which pages have valid structured data — UI only.
This means even with full API access, you still need to check the GSC dashboard for some things. Rampify supplements the API data with crawl-based analysis to fill some of these gaps — checking structured data validity, internal link structure, and technical SEO issues that the API doesn't expose.
Try Spec-Driven Development with Rampify
Scan your site for SEO issues, pull GSC data into your editor, and create structured specs — all from your AI coding tools. No dashboard tab required.
Related Reading
Google Indexing: The Developer's Guide to Search Console
The complete guide to Google indexing — data model, failure modes, and monitoring strategy for developers.
Connecting GSC to AI Coding Tools via MCP
Set up the Rampify MCP server and bring Google Search Console data directly into Claude Code, Cursor, and Windsurf.
What is Spec-Driven Development?
The complete guide to spec-driven development: define specs, AI implements, verify with a scan.