# Weekly SEO Audit
Automatically crawl your site every week, detect SEO issues, and get a health report delivered without lifting a finger. This routine uses Rampify's `crawl_site`, `get_issues`, and `get_page_seo` tools to run a complete audit on a schedule.
- **Trigger:** Schedule (weekly)
- **Tools used:** `crawl_site`, `get_issues`, `get_page_seo`
- **Run time:** ~2-5 minutes, depending on site size
## The Prompt
Copy this prompt into your routine configuration:
```
Run a weekly SEO audit for my site using Rampify:

1. Crawl the site to discover all pages and check for changes since last week
2. Get all current SEO issues grouped by severity (critical, warning, info)
3. For any page with critical issues, get detailed page SEO data
4. Summarize the findings:
   - Total pages crawled
   - New issues since last run (if detectable)
   - Critical issues that need immediate attention
   - Health score trend
   - Top 3 action items to fix this week

Keep the report concise. Focus on what changed and what needs attention.
```
## Setup
### CLI
```shell
claude routine create \
  --name "Weekly SEO Audit" \
  --trigger schedule \
  --schedule "0 9 * * 1" \
  --connector rampify
```
Then paste the prompt above when prompted for the routine description.
### Web or Desktop App
- Open Claude Code and go to Routines
- Click New routine
- Set the name to "Weekly SEO Audit"
- Paste the prompt above into the description field
- Select Schedule as the trigger
- Set the cron to `0 9 * * 1` (every Monday at 9am)
- Under Connectors, add Rampify
- Click Create
`0 9 * * 1` runs every Monday at 9:00 AM UTC. Adjust the hour and day to fit your workflow. For example, `0 14 * * 5` runs every Friday at 2:00 PM UTC.
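If you think in local time, one way to sanity-check a UTC cron hour is to convert it with GNU `date`. The date and timezone below are illustrative, not part of the routine:

```shell
# Convert the Monday 09:00 UTC run time ("0 9 * * 1") to a local timezone.
# 2026-04-13 is an arbitrary Monday; America/New_York is an example zone.
TZ="America/New_York" date -d "2026-04-13 09:00 UTC" +"%A %H:%M %Z"
# → Monday 05:00 EDT
```

Note that cron schedules do not follow daylight-saving changes in your local zone, so the local run time can shift by an hour across the year.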
## Expected Output
A typical run produces a report like:
```markdown
## Weekly SEO Audit — April 13, 2026

**Site:** example.com
**Pages crawled:** 24
**Health score:** 78/100 (up from 74 last week)

### Critical Issues (2)

- /blog/old-post: Missing meta description
- /pricing: Title tag exceeds 60 characters (67 chars)

### Warnings (5)

- 3 pages missing schema markup
- 2 pages with images missing alt text

### Action Items

1. Add meta description to /blog/old-post
2. Shorten pricing page title to under 60 characters
3. Add schema markup to /about, /contact, /features
```
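If you save each run's report to a file, a little shell is enough to pull out the critical-issue count for week-over-week tracking. The filename and report fragment below are hypothetical, written in the sample report's format:

```shell
# Write a fragment in the sample report's format, then count the bullet
# lines in its "Critical Issues" section. Filename is hypothetical.
report="seo-audit-sample.md"
printf '%s\n' \
  '### Critical Issues (2)' \
  '- /blog/old-post: Missing meta description' \
  '- /pricing: Title tag exceeds 60 characters (67 chars)' \
  '### Warnings (5)' \
  '- 3 pages missing schema markup' > "$report"

# Print only the Critical Issues section, then count its "- " bullets.
crit=$(sed -n '/^### Critical Issues/,/^### Warnings/p' "$report" | grep -c '^- ')
echo "critical issues: $crit"   # → critical issues: 2
```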
## How It Works
- `crawl_site` discovers all pages on your domain via the sitemap or navigation links. It checks HTTP status, title tags, meta descriptions, schema markup, and canonical URLs for each page.
- `get_issues` analyzes the crawl results and groups problems by severity. Critical issues (broken pages, missing titles) are separated from warnings (missing schema, long descriptions) and informational items.
- `get_page_seo` fetches detailed data for specific pages that have critical issues, including keyword audit status, current meta tags, and Google Search Console performance if connected.
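For a feel of what one of these checks does, the 60-character title guideline from the sample report is easy to reproduce in plain shell. The HTML below is made up, and this is a manual spot-check, not how Rampify implements it:

```shell
# Flag a <title> longer than 60 characters, mirroring the report's
# "/pricing: Title tag exceeds 60 characters" critical issue.
# The sample page HTML is invented for illustration.
html='<html><head><title>Pricing - Plans, Features, Comparisons, and Enterprise Options | Example</title></head></html>'

# Extract the title text (GNU grep with Perl regex lookarounds).
title=$(printf '%s' "$html" | grep -oP '(?<=<title>).*(?=</title>)')
len=${#title}

if [ "$len" -gt 60 ]; then
  echo "WARN: title is $len chars (limit 60)"
else
  echo "OK: title is $len chars"
fi
```

For this sample input the script prints a warning, since the title runs to 72 characters.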
## Variations
**Daily quick check** — Change the schedule to `0 9 * * *` and simplify the prompt to only report critical issues. Good for high-traffic sites where uptime and basic SEO matter daily.
**Post-publish verification** — Use the API trigger instead of a schedule. Call the routine after your CMS publishes new content to verify the new pages pass SEO checks before they get indexed.