
Submit URL to Google: Indexing API, IndexNow, and What Actually Works

11 min read · Rampify Team

You published a new page. You want it in Google's index. The question is: how do you tell Google it exists, and how fast will it actually get indexed?

There are three mechanisms for notifying search engines about new or updated URLs: sitemaps (passive), the Google Indexing API (active, limited), and IndexNow (active, broad). Each has trade-offs in speed, reliability, scope, and implementation complexity.

This guide covers all three with honest assessments. No "just submit your sitemap and wait" hand-waving. If you need a page indexed fast, you need to understand what actually works and what's marketing.

The Three Submission Methods#

URL Submission Methods at a Glance

| | Passive: XML Sitemap | Active: Indexing API + IndexNow |
| --- | --- | --- |
| Speed to crawl | Hours to weeks | Seconds to hours |
| Supported engines | All | Varies |
| Rate limit | 50,000 URLs/file | 200/day (API), unlimited (IndexNow) |
| Setup complexity | Low | Medium |
| Best for | Foundation layer | Time-sensitive content |
| Bottom line | Reliable but slow | Fast but constrained |

Sitemaps: The Foundation#

Every site should have a sitemap. It's the baseline that makes everything else work better.

How Sitemaps Work#

An XML sitemap lists your URLs with optional metadata (last modified date, change frequency, priority). Search engines discover your sitemap through robots.txt or direct submission in their webmaster tools.

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/blog/new-post</loc>
    <lastmod>2026-04-05</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
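
Once the sitemap exists, reference it in robots.txt so engines can find it without manual submission (example.com stands in for your own domain):

```text
# robots.txt at the site root
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```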

Dynamic Sitemaps in Next.js#

If you're using Next.js App Router, generate your sitemap dynamically:

// app/sitemap.ts
import { MetadataRoute } from 'next';
 
export default async function sitemap(): Promise<MetadataRoute.Sitemap> {
  // Fetch all published blog posts from your CMS or database
  const posts = await getPublishedPosts();
 
  const blogUrls = posts.map(post => ({
    url: `https://example.com/blog/${post.slug}`,
    lastModified: new Date(post.updatedAt),
    changeFrequency: 'monthly' as const,
    priority: 0.7
  }));
 
  const staticPages = [
    { url: 'https://example.com', lastModified: new Date(), priority: 1.0 },
    { url: 'https://example.com/pricing', lastModified: new Date(), priority: 0.9 },
    { url: 'https://example.com/docs', lastModified: new Date(), priority: 0.8 },
  ];
 
  return [...staticPages, ...blogUrls];
}

lastmod Is the Most Important Field

Google uses lastmod to prioritize which URLs to re-crawl. If you update a page but don't update lastmod, Google may not re-crawl it for weeks. changefreq and priority are largely ignored by Google — focus on accurate lastmod dates.

Sitemap Limitations#

Sitemaps are a hint, not a directive. Google is not obligated to crawl URLs in your sitemap, and having a URL in your sitemap doesn't guarantee indexing. Sitemaps help Google discover URLs more efficiently, but the decision to crawl and index is still based on Google's assessment of quality and crawl budget.

Time to crawl after sitemap update: Minutes to weeks, depending on your site's crawl frequency. High-authority sites see sitemap changes processed within hours. New or low-authority sites may wait days.

Google Indexing API: Fast but Limited#

The Google Indexing API was designed for job posting and livestream content. Google explicitly states this in their documentation. However, many developers use it for general content — with mixed results.

What the Indexing API Does#

You send a URL to Google with a notification type (URL_UPDATED or URL_DELETED), and Google adds it to a priority crawl queue.

import { google } from 'googleapis';
 
const auth = new google.auth.GoogleAuth({
  keyFile: './service-account.json',
  scopes: ['https://www.googleapis.com/auth/indexing']
});
 
const indexing = google.indexing({ version: 'v3', auth });
 
async function notifyGoogle(url: string) {
  const response = await indexing.urlNotifications.publish({
    requestBody: {
      url,
      type: 'URL_UPDATED'
    }
  });
 
  return response.data;
  // { urlNotificationMetadata: { url, latestUpdate: { type, notifyTime } } }
}

Setup Requirements#

Google Indexing API Setup

1
Create a service account
In Google Cloud Console, create a service account under IAM & Admin > Service Accounts. Download the JSON key file. This is different from OAuth2 — no user interaction required.
2
Enable the Indexing API
Under APIs & Services > Library, search for 'Indexing API' and enable it for your project.
3
Add the service account to GSC
Copy the service account email (ends in @...iam.gserviceaccount.com). In Search Console, go to Settings > Users and permissions, and add it as an Owner. Yes, it must be Owner — Contributor isn't sufficient.
4
Submit URL notifications
Use the API to send URL_UPDATED notifications. Google places these URLs in a priority crawl queue, typically processing them within minutes to hours.

The Honest Assessment#

The Indexing API Isn't Magic

Google's documentation states the Indexing API is for pages with JobPosting or BroadcastEvent schema. Using it for other content types works (Google will crawl the URL), but there's no guarantee of faster indexing than sitemaps for non-job-posting content. Some SEOs report significantly faster indexing; others see no difference. Your mileage will vary based on your site's authority and Google's current prioritization.

Rate limits: 200 URL notifications per day. Batch requests can submit up to 100 URLs per call, but the daily limit still applies.
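
Because the API rejects notifications past the quota, it's worth tracking your daily spend in-process and letting overflow URLs fall back to the sitemap. A minimal sketch — `QuotaGuard` and its method names are illustrative assumptions, not part of any Google client library:

```typescript
// Guard the Indexing API's 200-notifications/day quota before calling
// notifyGoogle(). In-memory only; use persistent storage if your process
// restarts during the day.
class QuotaGuard {
  private count = 0;
  private day = new Date().toDateString();

  constructor(private readonly limit: number = 200) {}

  // Returns true and spends one unit if quota remains for today.
  tryConsume(): boolean {
    const today = new Date().toDateString();
    if (today !== this.day) {
      // New calendar day: reset the counter.
      this.day = today;
      this.count = 0;
    }
    if (this.count >= this.limit) return false;
    this.count += 1;
    return true;
  }

  remaining(): number {
    return Math.max(0, this.limit - this.count);
  }
}

// Usage: only notify while quota remains; queue the rest for tomorrow
// or rely on the sitemap.
// const guard = new QuotaGuard();
// if (guard.tryConsume()) await notifyGoogle(url);
```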

Speed: For job postings, indexing typically happens within minutes. For other content types, it ranges from hours to days — sometimes no faster than a sitemap update.

Risk: Using the API for non-supported content types doesn't violate any TOS, but Google could tighten enforcement at any time. Don't build a critical workflow that depends on this working for blog posts.

IndexNow: The Open Alternative#

IndexNow is an open protocol supported by Bing, Yandex, Seznam.cz, Naver, and Yep. Google does not participate.

How IndexNow Works#

You generate a key, host it as a text file on your site, and send HTTP POST requests to notify engines about URL changes.

// Generate a key: 8-128 characters (a-z, A-Z, 0-9, dashes); a UUID works
const INDEXNOW_KEY = 'a1b2c3d4e5f6g7h8';
 
// Host the key file at: https://example.com/a1b2c3d4e5f6g7h8.txt
// The file should contain just the key: a1b2c3d4e5f6g7h8
 
async function submitToIndexNow(urls: string[]) {
  const response = await fetch('https://api.indexnow.org/indexnow', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      host: 'example.com',
      key: INDEXNOW_KEY,
      keyLocation: `https://example.com/${INDEXNOW_KEY}.txt`,
      urlList: urls // up to 10,000 URLs per request
    })
  });
 
  return response.status; // 200 = OK, 202 = accepted (key validation pending)
}
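
If you ever submit more than 10,000 URLs at once (a full-site migration, say), split the list into protocol-sized batches first. `chunk()` is a plain helper, not part of any IndexNow SDK:

```typescript
// Split a large URL list into batches that fit IndexNow's 10,000-URL
// per-request cap.
function chunk<T>(items: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Usage with submitToIndexNow() from above:
// for (const batch of chunk(allUrls, 10000)) {
//   await submitToIndexNow(batch);
// }
```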

IndexNow in Next.js#

Create an API route that fires IndexNow on publish:

// app/api/indexnow/route.ts
import { NextRequest, NextResponse } from 'next/server';
 
const INDEXNOW_KEY = process.env.INDEXNOW_KEY!;
const SITE_HOST = 'example.com';
 
export async function POST(request: NextRequest) {
  const { urls } = await request.json();
 
  if (!urls?.length) {
    return NextResponse.json({ error: 'No URLs provided' }, { status: 400 });
  }
 
  const response = await fetch('https://api.indexnow.org/indexnow', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      host: SITE_HOST,
      key: INDEXNOW_KEY,
      keyLocation: `https://${SITE_HOST}/${INDEXNOW_KEY}.txt`,
      urlList: urls
    })
  });
 
  return NextResponse.json({
    status: response.status,
    submitted: urls.length,
    message: response.status === 200 ? 'Accepted' : 'Queued'
  });
}

IndexNow Advantages#

  • No rate limits for practical purposes (10,000 URLs per request)
  • Multi-engine — One submission notifies all participating engines
  • Free — No API costs, no OAuth setup
  • Simple — Just an HTTP POST with a key file for verification

IndexNow Limitations#

  • Google doesn't support it. This is the biggest limitation. If Google is your primary traffic source, IndexNow doesn't help with Google indexing.
  • Bing's market share is ~3% in most markets. IndexNow is valuable for Bing SEO, but the traffic impact is smaller than from Google.
  • No feedback — IndexNow returns 200 (accepted) but doesn't tell you if the URL was actually crawled or indexed.

URL Inspection: The Manual Override#

For individual URLs, the URL Inspection tool in Google Search Console remains the most reliable way to request indexing from Google.

  1. Open Search Console for your property
  2. Enter the URL in the inspection bar at the top
  3. Wait for the live test to complete
  4. Click "Request Indexing"

Speed: Typically processed within 24-48 hours. Sometimes faster for high-authority sites.

Limit: Approximately 10 requests per day per property. This is a manual tool, not a programmatic one (though the URL Inspection API exists for checking status — it does not support requesting indexing).

URL Inspection API Cannot Request Indexing

The URL Inspection API lets you check a URL's indexing status, but it does not have an endpoint for requesting indexing. The "Request Indexing" action is only available through the GSC web interface. This is a common misconception.
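
For programmatic status checks (not indexing requests), you can call the inspect endpoint directly. A hedged sketch, assuming you already hold an OAuth2 access token with a Search Console scope — token plumbing is out of scope here:

```typescript
// Query the URL Inspection API (searchconsole v1) for a URL's current
// index status. siteUrl must exactly match your verified GSC property.
const INSPECT_ENDPOINT =
  'https://searchconsole.googleapis.com/v1/urlInspection/index:inspect';

// Build the JSON body the API expects.
function buildInspectBody(siteUrl: string, inspectionUrl: string): string {
  return JSON.stringify({ inspectionUrl, siteUrl });
}

async function inspectUrl(
  accessToken: string,
  siteUrl: string,
  url: string
): Promise<string> {
  const res = await fetch(INSPECT_ENDPOINT, {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${accessToken}`,
      'Content-Type': 'application/json',
    },
    body: buildInspectBody(siteUrl, url),
  });
  const data: any = await res.json();
  // coverageState is a human-readable string such as
  // "Submitted and indexed" or "Crawled - currently not indexed".
  return data?.inspectionResult?.indexStatusResult?.coverageState ?? 'unknown';
}
```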

Honest Speed Comparison#

Based on real-world testing across sites of varying authority:

| Method | Time to Crawl | Time to Index | Google? | Bing? |
| --- | --- | --- | --- | --- |
| Sitemap (new site) | 3-14 days | 7-21 days | Yes | Yes |
| Sitemap (established site) | 1-24 hours | 1-7 days | Yes | Yes |
| URL Inspection | 1-48 hours | 1-7 days | Yes | No |
| Google Indexing API (job posting) | 5-60 minutes | 1-24 hours | Yes | No |
| Google Indexing API (other content) | 1-48 hours | 1-7 days | Yes | No |
| IndexNow | 5 seconds-1 hour | 1-24 hours | No | Yes |

The takeaway: There's no instant indexing for Google. The fastest reliable path is a combination of an up-to-date sitemap, URL Inspection for priority pages, and patience. For Bing and other engines, IndexNow is dramatically faster.

The Optimal Submission Strategy#

For most sites, the right approach is layered:

Recommended Submission Stack

1
Dynamic sitemap as the foundation
Generate your sitemap programmatically with accurate lastmod dates. Submit it once to GSC and reference it in robots.txt. This handles the 95% case — most pages get indexed through normal crawling guided by sitemaps.
2
IndexNow for Bing and other engines
Fire IndexNow on every publish and significant update. It's free, fast, and covers all non-Google engines with a single API call. Add it to your CMS publish hook or CI/CD pipeline.
3
URL Inspection for priority Google pages
For your most important new content — a launch page, a critical blog post — manually request indexing through URL Inspection. This adds the URL to Google's priority queue.
4
Google Indexing API for job postings
If you publish job listings with JobPosting schema, use the Indexing API. This is the one case where Google reliably fast-tracks indexing through the API.
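
The layered stack above can be wired into a single publish hook. A minimal sketch — `PublishEvent`, `planSubmissions`, and the `priority` flag are assumptions of this example, not features of any framework; the IndexNow step would call `submitToIndexNow()` from earlier:

```typescript
// Plan the submission actions for a freshly published URL.
type PublishEvent = { url: string; priority: boolean };

function planSubmissions(event: PublishEvent): string[] {
  const actions = [
    'sitemap: regenerate (or rely on dynamic generation) with a fresh lastmod',
    `indexnow: POST ${event.url} to api.indexnow.org`,
  ];
  if (event.priority) {
    // Google has no programmatic request-indexing endpoint, so priority
    // pages get a reminder to use URL Inspection manually in GSC.
    actions.push(`gsc: manually request indexing for ${event.url}`);
  }
  return actions;
}
```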

Automating Submission with Rampify#

Instead of building and maintaining separate integrations for sitemaps, IndexNow, and indexing monitoring, Rampify handles the pipeline:

# In your AI coding tool (Claude Code, Cursor, Windsurf)
 
"I just published a new blog post at /blog/google-indexing-guide. 
 Make sure it's set up for indexing."
 
# AI calls get_page_seo() to verify:
# - Meta tags are present and optimized
# - Canonical URL is correct
# - Page is not accidentally noindexed
# - JSON-LD schema is valid
# - Internal links exist from other pages
# - Page is in the sitemap
 
# AI calls get_issues() to check for problems:
# - Missing meta description
# - Duplicate title tag
# - Broken internal links
# - Missing alt text on images
 
# If issues are found, AI creates a spec with fixes

The key insight: submission is only half the problem. The other half is making sure the page is actually worth indexing before you submit it. A technically perfect submission of a page with missing meta tags, no internal links, and thin content is still going to result in "Crawled - currently not indexed."

Rampify's approach combines submission awareness with content quality checks, so you catch problems before Google does.

For more on monitoring indexing status programmatically, see our GSC API guide. For fixing pages that were crawled but not indexed, see the dedicated troubleshooting guide.

Try Spec-Driven Development with Rampify

Scan your site for SEO issues, pull GSC data into your editor, and create structured specs — all from your AI coding tools. No dashboard tab required.

Get Started Free