# claude-seo-skills: seo-gsc-cannibalization

Clone the full repository:

```bash
git clone https://github.com/lionkiii/claude-seo-skills
```

Or install only this skill into `~/.claude/skills`:

```bash
T=$(mktemp -d) && git clone --depth=1 https://github.com/lionkiii/claude-seo-skills "$T" && mkdir -p ~/.claude/skills && cp -r "$T/skills/seo-gsc-cannibalization" ~/.claude/skills/lionkiii-claude-seo-skills-seo-gsc-cannibalization && rm -rf "$T"
```

`skills/seo-gsc-cannibalization/SKILL.md`

# GSC Cannibalization — Keyword Cannibalization Detection

@skills/seo/references/mcp-degradation.md
@skills/seo/references/gsc-api-reference.md
Finds queries where multiple pages compete for the same ranking, causing them to cannibalize each other's traffic. Identifies the "winning" page and "losing" pages for each cannibalized keyword.
## MCP Check
Before calling any GSC tool, verify the MCP is connected:
- Use ToolSearch with query `+google-search-console`
- If tools returned — note the actual tool name prefix, proceed to Inputs
- If no tools returned — display the GSC MCP error template from `references/mcp-degradation.md` and stop:
## Google Search Console MCP Not Available

The `/seo gsc cannibalization` command requires the GSC MCP, which is not currently connected.

**What you can do:**
- Use `/seo technical <url>` for crawlability and indexability analysis (no live data)
- Use `/seo audit <url>` for a full static SEO audit

**To connect GSC MCP:**
- Install and configure a Google Search Console MCP server (see README for setup)
- Add it to `~/.claude/mcp.json` at user scope (NOT project scope)
- Verify GSC property access before running commands (domain vs URL prefix format)
- See `references/gsc-api-reference.md` for property format details
## Inputs
- `site`: the GSC property URL. Accept both formats:
  - Domain property: `sc-domain:example.com`
  - URL prefix: `https://example.com` or `https://www.example.com`
- If the user provides a bare domain (no prefix), call `list_sites` to identify the correct property format registered in GSC (see the sketch below).
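A minimal sketch of the bare-domain matching step, assuming `list_sites` returns a list of property URL strings (the actual return shape depends on the connected MCP server, and `resolve_property` is a hypothetical helper name, not part of the skill):

```python
def resolve_property(user_input: str, registered_sites: list[str]) -> str | None:
    """Match a bare domain against the properties returned by list_sites."""
    domain = user_input.strip().lower().rstrip("/")
    # Candidate property formats GSC may have registered for this domain
    candidates = [
        f"sc-domain:{domain}",
        f"https://{domain}/",
        f"https://www.{domain}/",
    ]
    for candidate in candidates:
        if candidate in registered_sites:
            return candidate
    return None


print(resolve_property("example.com", ["sc-domain:example.com"]))
# -> "sc-domain:example.com"
```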
## Date Calculation
Use Bash to calculate dates (GSC data has a ~3-day delay):

```bash
# BSD/macOS date syntax; on GNU/Linux the equivalent is: date -d "3 days ago" +%Y-%m-%d
endDate=$(date -v-3d +%Y-%m-%d)
startDate=$(date -v-31d +%Y-%m-%d)
echo "endDate: $endDate | startDate: $startDate"
```
## Execution
**Step 1 — Pull query+page data:** Call `query_search_analytics` with the following parameters (a request sketch follows the list):

- `siteUrl`: the site property
- `startDate`: calculated startDate
- `endDate`: calculated endDate
- `dimensions`: `["query", "page"]`
- `rowLimit`: 1000
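For illustration, the same parameters as a plain request dict. The exact tool-call syntax depends on the connected GSC MCP server and the tool name prefix found in the MCP Check; the field names below mirror the Search Analytics API and are an assumption about the MCP tool's input shape:

```python
from datetime import date, timedelta

# Same window as the Date Calculation step: from 31 days ago to 3 days ago.
end_date = (date.today() - timedelta(days=3)).isoformat()
start_date = (date.today() - timedelta(days=31)).isoformat()

# Hypothetical request body for query_search_analytics.
request = {
    "siteUrl": "sc-domain:example.com",  # example property; use the resolved one
    "startDate": start_date,
    "endDate": end_date,
    "dimensions": ["query", "page"],
    "rowLimit": 1000,
}
```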
**Post-processing — detect cannibalization:** Group the results by `query`. For each query:

- Count distinct `page` values
- If a query has 2 or more distinct pages → it is cannibalized
- Keep only cannibalized queries for the report

For each cannibalized query, identify:

- "Winning" page: the page with the most `clicks` for that query
- "Losing" pages: all other pages competing for the same query
Sort cannibalized queries by total impressions descending (highest-traffic cannibalization issues first).
CTR display rule: API returns CTR as decimal — multiply by 100 for display.
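A minimal Python sketch of the post-processing above, assuming each returned row carries `keys` (in dimension order: query, page), `clicks`, `impressions`, `ctr`, and `position`; those field names follow the Search Analytics API response and are an assumption about what the MCP tool returns:

```python
from collections import defaultdict


def detect_cannibalization(rows: list[dict]) -> list[dict]:
    """Group rows by query and keep queries where 2+ distinct pages compete."""
    by_query: dict[str, list[dict]] = defaultdict(list)
    for row in rows:
        query, page = row["keys"]  # dimensions were ["query", "page"]
        by_query[query].append({**row, "page": page})

    report = []
    for query, pages in by_query.items():
        if len({p["page"] for p in pages}) < 2:
            continue  # not cannibalized
        pages = sorted(pages, key=lambda p: p["clicks"], reverse=True)
        report.append({
            "query": query,
            "total_impressions": sum(p["impressions"] for p in pages),
            "winning": pages[0],   # most clicks wins
            "losing": pages[1:],   # all other competing pages
            "estimated_lost_clicks": sum(p["clicks"] for p in pages[1:]),
        })

    # Highest-traffic cannibalization issues first
    report.sort(key=lambda r: r["total_impressions"], reverse=True)
    return report
```

The `estimated_lost_clicks` value feeds the Summary table in the Output Format below.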
## Output Format
## GSC Cannibalization Report: [site property]

**Period:** [startDate] to [endDate] (28 days)
**Cannibalized queries found:** [count]

### Cannibalized Keywords

#### Query: "[query]" ([total impressions] impressions)

| Role | Page URL | Clicks | Impressions | CTR | Position |
|------|----------|--------|-------------|-----|----------|
| WINNING | [url] | [n] | [n] | [X.XX%] | [X.X] |
| losing | [url] | [n] | [n] | [X.XX%] | [X.X] |

**Recommendation:** [recommendation based on situation]

[Repeat for each cannibalized query...]

### Summary

| Metric | Value |
|--------|-------|
| Total cannibalized queries | [count] |
| Total pages involved | [count] |
| Estimated lost clicks | [sum of losing page clicks] |
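For the table rows, a small formatting sketch that applies the CTR display rule and one-decimal positions; `format_row` is a hypothetical helper, not part of the skill:

```python
def format_row(role: str, page: dict) -> str:
    """Render one markdown table row, converting CTR from decimal to percent."""
    ctr_pct = page["ctr"] * 100  # API returns CTR as a decimal
    return (
        f"| {role} | {page['page']} | {page['clicks']} | {page['impressions']} "
        f"| {ctr_pct:.2f}% | {page['position']:.1f} |"
    )


print(format_row("WINNING", {
    "page": "https://example.com/a", "clicks": 12,
    "impressions": 340, "ctr": 0.0353, "position": 4.2,
}))
# -> | WINNING | https://example.com/a | 12 | 340 | 3.53% | 4.2 |
```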
Recommendation logic (a short sketch follows this list):

- If the winning page has a much better position (lower position number): consolidate losing pages into the winning page via redirect or canonical
- If positions are close (within 3): add canonical signals pointing from losing to winning page; consider merging content
- If the pages cover distinct subtopics: differentiate content more clearly or use internal links to signal hierarchy
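A hedged sketch of that recommendation logic, interpreting "much better position" as a gap larger than 3 positions (the same threshold as the "within 3" rule); the distinct-subtopics case cannot be inferred from position data alone, so it is passed in as a flag:

```python
def recommend(winning_pos: float, best_losing_pos: float,
              distinct_subtopics: bool = False) -> str:
    """Map the position gap (plus a manual subtopic judgment) to a recommendation."""
    if distinct_subtopics:
        return ("Differentiate content more clearly or use internal links "
                "to signal hierarchy.")
    gap = best_losing_pos - winning_pos
    if gap > 3:
        # Winning page ranks much better (lower position number)
        return ("Consolidate losing pages into the winning page "
                "via redirect or canonical.")
    return ("Add canonical signals pointing from losing to winning page; "
            "consider merging content.")
```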
If no cannibalization found, report: "No keyword cannibalization detected. Each query appears on only one page in the top 1000 results."