Screaming Frog Skill — Technical SEO Crawl Analysis

Local SEO strategy focuses on GBP optimization, citations, reviews, and local content. But every local business also has a website, and technical issues on that website can undermine local rankings. Broken links, redirect chains, missing schema markup, duplicate content, slow page speed — these technical problems create friction for both users and search engines. A local business with a perfect GBP profile but a technically broken website is leaving ranking potential on the table.

The screaming-frog-tool skill connects Claude to Screaming Frog’s crawl data. When the local-seo-audit or local-schema strategy skills need technical site health data, dispatch routes to Screaming Frog.

What This Skill Does

The screaming-frog-tool skill enables Claude to analyze Screaming Frog crawl data: broken links and 404s, redirect chains and loops, missing or duplicate title tags and meta descriptions, schema markup validation, page speed indicators, canonical issues, and site structure analysis.

Prompt: "Run a technical audit on our website using Screaming Frog crawl data.
Focus on issues that could be hurting our local rankings."

Skills fired: dispatch → local-seo-audit + screaming-frog-tool
Data pulled: Full crawl analysis, schema validation, link health

Output:
Technical SEO Audit — [domain]
Pages crawled: 47

Critical issues (fix immediately):
  1. 6 broken internal links (404 responses)
     /services/emergency-repair → 404
     /about/team-member-jane → 404
     [4 more listed]
     Impact: Wasted crawl budget, poor user experience

  2. LocalBusiness schema missing on 3 location pages
     /locations/scottsdale — no schema
     /locations/tempe — no schema
     /locations/mesa — no schema
     Impact: Missing rich result eligibility, reduced local relevance signal

  3. Redirect chain on homepage
     http://domain.com → https://domain.com → https://www.domain.com
     Impact: Diluted link equity, slower page load

Warnings (fix when possible):
  4. 8 pages with duplicate meta descriptions
  5. 12 images missing alt text
  6. 3 pages with thin content (<200 words)
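
The triage above can be sketched against a Screaming Frog CSV export. This is a minimal, illustrative example: the column names ("Address", "Status Code") match Screaming Frog's default "Internal: All" export headers, but the sample rows and the `broken_pages` helper are assumptions for demonstration, not part of the skill itself.

```python
import csv
import io

# Illustrative stand-in for a Screaming Frog "Internal: All" CSV export.
SAMPLE_EXPORT = """\
Address,Status Code,Indexability
https://domain.com/,200,Indexable
https://domain.com/services/emergency-repair,404,Non-Indexable
https://domain.com/about/team-member-jane,404,Non-Indexable
https://domain.com/locations/tempe,200,Indexable
"""

def broken_pages(csv_text: str) -> list[str]:
    """Return URLs whose crawl status was a 4xx client error."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row["Address"] for row in reader
            if row["Status Code"].startswith("4")]

print(broken_pages(SAMPLE_EXPORT))
```

The same filter extended to 3xx status codes is the starting point for the redirect-chain analysis described later in this document.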

Schema Markup Validation for Local Businesses

Schema markup is where technical SEO and local SEO intersect most directly. LocalBusiness schema tells Google structured information about a business: name, address, phone, hours, service area, accepted payments. The local-schema strategy skill knows what schema to implement. Screaming Frog validates whether it’s actually implemented correctly.
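
For reference, this is roughly the JSON-LD shape being validated. The property names (`openingHoursSpecification`, `areaServed`, `geo`) are real schema.org vocabulary; the business details are placeholders, not a real listing.

```python
import json

# Sketch of a complete LocalBusiness JSON-LD block. All values below are
# placeholder data; property names follow the schema.org vocabulary.
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Plumbing Co",
    "telephone": "+1-480-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main Street",  # should match GBP exactly
        "addressLocality": "Scottsdale",
        "addressRegion": "AZ",
        "postalCode": "85251",
    },
    "openingHoursSpecification": [{
        "@type": "OpeningHoursSpecification",
        "dayOfWeek": ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday"],
        "opens": "08:00",
        "closes": "17:00",
    }],
    "areaServed": {"@type": "City", "name": "Scottsdale"},
    "geo": {"@type": "GeoCoordinates",
            "latitude": 33.4942, "longitude": -111.9261},
}

print(json.dumps(local_business, indent=2))
```

Missing `openingHoursSpecification` and `areaServed` are exactly the gaps flagged in the validation output below.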

Prompt: "Validate the LocalBusiness schema on all our location pages.
Is the markup complete, correct, and consistent?"

Skills fired: dispatch → local-schema + screaming-frog-tool

Output:
Schema Validation — Location Pages

/locations/scottsdale:
  Schema type: LocalBusiness ✓
  Name: matches GBP ✓
  Address: matches GBP ✓
  Phone: matches GBP ✓
  Hours: MISSING — add openingHoursSpecification
  Service area: MISSING — add areaServed
  Geo coordinates: present ✓

/locations/tempe:
  Schema type: MISSING — no LocalBusiness schema found
  Action: Add complete LocalBusiness markup

/locations/mesa:
  Schema type: LocalBusiness ✓
  Name: matches GBP ✓
  Address: MISMATCH — "123 Main St" vs GBP "123 Main Street"
  Action: Update to match GBP exactly
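
The consistency check behind that output can be sketched as a field-by-field comparison. The function name, field names, and sample records here are illustrative assumptions; the comparison is exact on purpose, since the guidance above is to match the GBP listing verbatim.

```python
def nap_issues(schema: dict, gbp: dict) -> list[str]:
    """Flag missing or mismatched name/address/phone fields.

    Exact string comparison, because "123 Main St" vs "123 Main Street"
    should be surfaced even though the two are semantically equivalent.
    """
    issues = []
    for field in ("name", "address", "phone"):
        if field not in schema:
            issues.append(f"{field}: MISSING")
        elif schema[field] != gbp[field]:
            issues.append(
                f'{field}: mismatch "{schema[field]}" vs GBP "{gbp[field]}"')
    return issues

gbp = {"name": "Example Co", "address": "123 Main Street",
       "phone": "480-555-0100"}
schema = {"name": "Example Co", "address": "123 Main St",
          "phone": "480-555-0100"}
print(nap_issues(schema, gbp))  # flags the "Main St" vs "Main Street" case
```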

Broken links waste crawl budget and create dead-end user experiences. Redirect chains dilute link equity and slow page loads. Both are common on local business websites that have been around for years — pages get deleted, URLs change, and nobody updates the internal links.

The screaming-frog-tool skill identifies every broken link and redirect chain on the site, prioritized by the importance of the page each appears on and the number of internal links pointing at the broken target. A broken link on the homepage matters more than one on a buried blog post.

Redirect analysis traces the full chain. A single redirect (old URL → new URL) is fine. A chain (old → intermediate → new) wastes crawl budget and adds latency. A loop (A → B → A) is a critical error. Screaming Frog catches all three patterns and the skill surfaces them with fix recommendations.
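
The three patterns can be sketched as a walk over a redirect map (source URL to target URL). The map and function below are illustrative assumptions, not the skill's internals, but the classification logic mirrors the description above.

```python
def classify(url: str, redirects: dict[str, str]) -> str:
    """Follow redirects from `url`: 'ok', 'single', 'chain', or 'loop'."""
    seen, hops = {url}, 0
    while url in redirects:
        url = redirects[url]
        hops += 1
        if url in seen:          # revisited a URL: redirect loop
            return "loop"
        seen.add(url)
    if hops == 0:
        return "ok"
    return "single" if hops == 1 else "chain"

redirects = {
    "http://domain.com/": "https://domain.com/",
    "https://domain.com/": "https://www.domain.com/",  # 2 hops: a chain
    "/a": "/b", "/b": "/a",                            # loop
    "/old-page": "/new-page",                          # single hop: fine
}
print(classify("http://domain.com/", redirects))  # chain
print(classify("/old-page", redirects))           # single
print(classify("/a", redirects))                  # loop
```

The fix for a chain is to point the first URL directly at the final destination, collapsing it back to a single hop.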

Site Structure Analysis for Local Content

For multi-location businesses, site structure directly affects local rankings. Each location page needs to be crawlable, internally linked, and structurally distinct. Screaming Frog’s crawl data reveals structural issues: orphaned location pages with no internal links, location pages buried too deep in the site hierarchy, duplicate content across location pages that differ only in the city name.

Prompt: "Analyze our site structure for local SEO issues.
Are all location pages properly linked and accessible?"

Skills fired: dispatch → local-landing-pages + screaming-frog-tool

Output:
Site Structure Analysis — Location Pages

Location pages found: 8
  Properly linked from main nav: 5 of 8
  Orphaned (no internal links): 2 (/locations/gilbert, /locations/chandler)
  Buried (3+ clicks from homepage): 1 (/locations/fountain-hills)

Content similarity:
  /locations/scottsdale vs /locations/tempe: 82% similar
  Action: Differentiate content — add neighborhood-specific information,
  local landmarks, unique service details per location

Internal linking:
  Location pages linking to each other: 0 of 8
  Action: Add "Nearby Locations" section to each page
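
The orphan and click-depth checks above reduce to a breadth-first walk over the internal link graph. The graph below is an illustrative stand-in for Screaming Frog's "All Inlinks" export; page paths are placeholders.

```python
from collections import deque

# Illustrative internal-link graph: page -> pages it links to.
links = {
    "/": ["/services", "/locations"],
    "/locations": ["/locations/scottsdale", "/locations/tempe"],
    "/services": [],
    "/locations/scottsdale": [],
    "/locations/tempe": [],
}
# A page the crawler found via the sitemap but no internal link reaches.
all_pages = set(links) | {"/locations/gilbert"}

def click_depths(start: str) -> dict[str, int]:
    """BFS from the homepage: clicks needed to reach each page."""
    depths, queue = {start: 0}, deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

depths = click_depths("/")
orphans = sorted(all_pages - depths.keys())
print(orphans)                     # pages unreachable by internal links
print(depths["/locations/tempe"])  # clicks from homepage
```

Pages absent from `depths` are orphans; pages with a depth of 3 or more are the "buried" cases flagged in the output above.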

When to Use Screaming Frog vs LocalSEOData

LocalSEOData covers local search metrics — GBP data, rankings, citations, reviews. It does not crawl websites or analyze technical SEO.

Screaming Frog covers technical website health — broken links, redirects, schema validation, site structure, page-level technical issues. It does not analyze GBP profiles or local search visibility.

Zero overlap. LocalSEOData handles the local search layer. Screaming Frog handles the website technical layer. A complete local SEO audit uses both: LocalSEOData for the local search assessment, Screaming Frog for the technical foundation assessment.

Dispatch routes to Screaming Frog when your prompt references technical issues, crawl data, broken links, redirects, schema validation, or site structure. All other local SEO queries route to LocalSEOData by default.
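
Conceptually, that routing is a keyword match over the prompt. This is a hypothetical sketch only: the keyword list and function names are assumptions, not dispatch's real internals.

```python
# Hypothetical routing sketch; keywords and names are assumptions.
TECHNICAL_KEYWORDS = {"crawl", "broken link", "redirect",
                      "schema", "site structure", "technical"}

def route(prompt: str) -> str:
    """Route technical-SEO prompts to Screaming Frog, the rest to LocalSEOData."""
    text = prompt.lower()
    if any(keyword in text for keyword in TECHNICAL_KEYWORDS):
        return "screaming-frog-tool"
    return "localseodata"

print(route("Validate the schema on our location pages"))  # screaming-frog-tool
print(route("How many reviews did we get this month?"))    # localseodata
```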

Connecting Screaming Frog to LocalSEOSkills

  1. Screaming Frog SEO Spider installed with license
  2. Run a crawl and export the data (or use Screaming Frog’s API if available)
  3. In Claude Code: Settings → MCP Servers → Add Server
    • Name: Screaming Frog
    • Configuration per your crawl data export format
  4. Restart Claude Code
  5. Verify the connection with a test prompt (see Get Started below)

Get Started

"Analyze the Screaming Frog crawl data for [domain].
Give me the top 10 technical issues affecting local SEO,
prioritized by impact."

If you see a prioritized list of technical issues from crawl data, the integration is working.

Learn More

To learn what this skill can do for your local SEO workflow, see the skill overview.