Technical SEO Fix Pipeline
Systematically implement technical SEO fixes from an audit report, verify each fix, and measure ranking impact
npx gtm-skills add drill/technical-seo-fix-pipeline

What this drill teaches
This drill takes the prioritized issue list from technical-seo-crawl-audit and systematically implements fixes, verifies each fix worked, and requests re-crawling from Google. It processes issues in Impact Score order — highest-value fixes first.
Input
- Audit report JSON from technical-seo-crawl-audit (with issues sorted by Impact Score)
- Access to the site's codebase or CMS for making changes
- Google Search Console access for submitting URL re-indexing requests
Steps
1. Group fixes by implementation type
Categorize each issue from the audit report into fix batches:
- Robots/Crawl directives: robots.txt changes, meta robots tags, canonical fixes → can often be fixed in a single deployment
- Content/Metadata: missing titles, descriptions, H1s, thin content → batch by page template or section
- Infrastructure: redirect chains, broken links, HTTPS issues → requires server/hosting config
- Performance: image optimization, render-blocking resources, unused JS/CSS → requires build pipeline changes
- Structured data: missing or invalid JSON-LD → batch by page type/template
Process batches in this order: Robots/Crawl > Infrastructure > Content/Metadata > Structured Data > Performance. This order maximizes indexation fixes first (no point optimizing pages Google cannot see).
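As a minimal sketch, the grouping-and-ordering logic above could look like this in Python. The category keys and the impact_score field are assumed names for illustration, not a fixed schema from the audit report:

```python
from collections import defaultdict

# Priority order from the step above: indexation-affecting fixes first.
BATCH_ORDER = ["robots_crawl", "infrastructure", "content_metadata",
               "structured_data", "performance"]

def order_batches(issues):
    """Group issues by fix category, order batches by BATCH_ORDER,
    and sort issues within each batch by descending Impact Score."""
    batches = defaultdict(list)
    for issue in issues:
        batches[issue["category"]].append(issue)
    return [
        (cat, sorted(batches[cat], key=lambda i: i["impact_score"], reverse=True))
        for cat in BATCH_ORDER
        if cat in batches
    ]
```

This keeps the ordering rule in one place (BATCH_ORDER), so reprioritizing later means editing a single list rather than scattered comparisons.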
2. Implement robots.txt and crawl directive fixes
Using robots-txt-management:
- If robots.txt is blocking important pages, update the rules to allow them
- If robots.txt is missing a sitemap reference, add it
- Deploy the updated robots.txt
- Verify: re-fetch robots.txt and confirm the changes are live
- Test each previously-blocked URL with the can_fetch("Googlebot", url) check
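The can_fetch verification can be done with Python's stdlib urllib.robotparser. A sketch, with an illustrative robots.txt and example URLs:

```python
# Verify a deployed robots.txt no longer blocks Googlebot from important pages.
from urllib.robotparser import RobotFileParser

# In practice, fetch the live file; this inline copy is an illustrative example.
robots_txt = """User-agent: *
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Re-check each previously-blocked URL after deployment
for url in ["https://example.com/pricing", "https://example.com/admin/login"]:
    allowed = parser.can_fetch("Googlebot", url)
    print(url, "->", "allowed" if allowed else "blocked")
```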
Using sitemap-generation:
- If the sitemap is missing pages, regenerate it with all indexable URLs
- Remove any 404 or redirect URLs from the sitemap
- Deploy the updated sitemap
- Submit to GSC using google-search-console-api
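The filtering step of sitemap regeneration can be sketched with the stdlib XML module: keep only indexable 200-status URLs and drop 404s and redirects. The status map and URLs below are illustrative:

```python
import xml.etree.ElementTree as ET

def build_sitemap(url_statuses):
    """Serialize only 200-status URLs into sitemap XML."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url, status in sorted(url_statuses.items()):
        if status != 200:  # excludes 3xx redirects and 4xx errors
            continue
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode", xml_declaration=True)

sitemap = build_sitemap({
    "https://example.com/": 200,
    "https://example.com/old-page": 301,  # redirect: excluded
    "https://example.com/missing": 404,   # broken: excluded
    "https://example.com/pricing": 200,
})
```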
3. Fix redirect chains and broken links
For each redirect chain (3+ hops):
- Identify the final destination URL
- Update the original redirect to point directly to the final destination (single hop)
- Update any internal links that point to the redirect source to point directly to the destination
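Collapsing redirect chains to a single hop is a small graph walk. A sketch, assuming the chains are represented as a source-to-target mapping (the dict shape is an assumption, not a required format):

```python
def collapse_redirects(redirects):
    """Rewrite every redirect to point straight at its final destination.

    redirects: dict mapping source URL -> immediate redirect target.
    Returns a new dict where each source points to the end of its chain.
    """
    def final_destination(url):
        seen = set()
        # Follow the chain; the seen set guards against redirect loops.
        while url in redirects and url not in seen:
            seen.add(url)
            url = redirects[url]
        return url

    return {src: final_destination(dst) for src, dst in redirects.items()}
```

For a chain a → b → c → d, this produces a → d, b → d, and c → d, so every hop becomes a single redirect.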
For each broken internal link (pointing to a 4xx page):
- If the target page should exist: create or restore it
- If the target page was moved: update the link to point to the new URL
- If the target page is genuinely gone: remove the link or replace with a relevant alternative
4. Fix canonical and indexation issues
For each canonical mismatch:
- Determine the correct canonical URL (the version that should rank)
- Update the <link rel="canonical"> tag on the page to self-reference correctly
- If duplicate pages exist, set their canonical to the primary version
For each non-indexable page that should be indexed:
- If blocked by a noindex meta tag: remove the noindex directive
- If blocked by a canonical pointing elsewhere: fix the canonical (see above)
- If blocked by an X-Robots-Tag header: update the server configuration
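Detecting a canonical mismatch can be done with the stdlib HTML parser: extract the <link rel="canonical"> href and compare it to the page's own URL. A sketch; the three return labels are hypothetical names for this example:

```python
from html.parser import HTMLParser

class CanonicalExtractor(HTMLParser):
    """Pull the href of the first <link rel="canonical"> tag, if any."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical" and self.canonical is None:
            self.canonical = a.get("href")

def check_canonical(html, page_url):
    """Classify a page as 'self', 'mismatch', or 'missing' canonical."""
    parser = CanonicalExtractor()
    parser.feed(html)
    if parser.canonical is None:
        return "missing"
    return "self" if parser.canonical == page_url else "mismatch"
```

A real check would also normalize trailing slashes and scheme before comparing; exact string equality is the simplification here.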
5. Fix content and metadata issues
For missing or duplicate page titles:
- Generate unique, keyword-rich titles under 60 characters
- Include the primary target keyword near the beginning
- Format: "{Primary Keyword} — {Brand}" or "{Primary Keyword}: {Value Proposition}"
For missing meta descriptions:
- Generate descriptions under 155 characters
- Include the target keyword and a clear value proposition or call-to-action
- Each description must be unique across the site
For missing or duplicate H1 tags:
- Each page gets exactly one H1 that includes the primary keyword
- H1 should closely match the page title but can be slightly different
For thin content (< 300 words):
- Expand the page with substantive content: add sections, examples, data
- If the page has no ranking potential, consider consolidating it into a related page and redirecting
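The metadata rules above (60-character titles, 155-character descriptions, exactly one H1, 300-word minimum) translate directly into a validation pass. A sketch, assuming a simple per-page dict; the field names are illustrative:

```python
def audit_metadata(page):
    """Flag title/description/H1/content issues against the limits above."""
    issues = []

    title = page.get("title", "")
    if not title:
        issues.append("missing title")
    elif len(title) > 60:
        issues.append("title over 60 chars")

    description = page.get("description", "")
    if not description:
        issues.append("missing description")
    elif len(description) > 155:
        issues.append("description over 155 chars")

    h1s = page.get("h1s", [])
    if len(h1s) != 1:
        issues.append(f"expected exactly one H1, found {len(h1s)}")

    if page.get("word_count", 0) < 300:
        issues.append("thin content (< 300 words)")

    return issues
```

Running this over every crawled page gives a per-URL issue list that feeds directly into the fix log in step 8.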
6. Implement structured data
Using structured-data-validation:
For each page type missing structured data:
- Generate the appropriate JSON-LD markup (Article, FAQPage, BreadcrumbList, SoftwareApplication)
- Inject the <script type="application/ld+json"> tag into the page's <head>
- Validate the markup using the structured data testing tool
- Verify no validation errors before deploying
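Generating the JSON-LD script tag is straightforward with the stdlib json module. A minimal Article example; the exact fields a site needs depend on its page types, so treat this property set as illustrative:

```python
import json

def article_jsonld(headline, author, date_published, url):
    """Build a minimal Article JSON-LD payload wrapped in its script tag."""
    data = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {"@type": "Person", "name": author},
        "datePublished": date_published,
        "mainEntityOfPage": url,
    }
    return '<script type="application/ld+json">' + json.dumps(data) + "</script>"
```

Serializing with json.dumps (rather than hand-writing the markup) guarantees the payload is syntactically valid JSON before it ever reaches the validation step.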
7. Fix performance issues
For each page with poor Core Web Vitals:
- Review the specific Lighthouse audit recommendations from the audit report
- Common fixes by category:
- LCP: compress images, preload hero image, reduce server response time, remove render-blocking CSS/JS
- CLS: set explicit width/height on images and embeds, avoid dynamically injecting content above the fold
- INP: reduce JavaScript execution time, break up long tasks, defer non-critical JS
- Implement fixes in the build pipeline or page templates
- Re-test with pagespeed-insights-api after deployment
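The re-test call goes to the PageSpeed Insights API v5 runPagespeed endpoint. A sketch of building the request URL (the actual HTTP fetch and API key are left out; parameter names match the public v5 API):

```python
from urllib.parse import urlencode

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url, api_key, strategy="mobile",
                    categories=("performance",)):
    """Build a PageSpeed Insights v5 request URL for the given page."""
    params = [("url", page_url), ("key", api_key), ("strategy", strategy)]
    # The API accepts repeated category parameters.
    params += [("category", c) for c in categories]
    return PSI_ENDPOINT + "?" + urlencode(params)
```

Comparing the lighthouseResult scores from this call before and after deployment is what feeds step 9's impact measurement.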
8. Verify fixes and request re-indexing
After each batch of fixes is deployed:
- Re-run the relevant checks from technical-seo-crawl-audit on the affected URLs to confirm the issues are resolved
- Using google-search-console-api, submit each fixed URL for re-indexing:
POST https://indexing.googleapis.com/v3/urlNotifications:publish
{"url": "https://example.com/fixed-page", "type": "URL_UPDATED"}
- Log each fix: URL, issue type, what was changed, verification result, re-index request timestamp
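The publish call above can be built with the stdlib. A sketch that constructs the POST request for Google's Indexing API; obtaining the OAuth access token is a separate step not shown here:

```python
import json
from urllib.request import Request

INDEXING_ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

def reindex_request(url, access_token):
    """Build a URL_UPDATED notification request for one fixed URL."""
    body = json.dumps({"url": url, "type": "URL_UPDATED"}).encode()
    return Request(
        INDEXING_ENDPOINT,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {access_token}",
        },
        method="POST",
    )
```

Sending it is then a urllib.request.urlopen call per URL; logging the URL and timestamp at that point produces the re-indexing request log listed under Output.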
9. Measure fix impact
After 2-4 weeks, pull fresh GSC data:
- Compare indexation status: how many previously non-indexed pages are now indexed?
- Compare search analytics: clicks, impressions, CTR, average position for the fixed pages
- Re-run PageSpeed Insights on performance-fixed pages: compare scores before vs. after
- Document the impact per fix category in the audit trail
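The before/after comparison reduces to a per-URL metric diff. A sketch, assuming each GSC pull is a dict of URL to metrics (field names are illustrative; note that a lower average position is better, so the position delta is computed as before minus after):

```python
def impact_report(before, after):
    """Diff GSC metrics per URL; positive deltas mean improvement."""
    report = {}
    for url, b in before.items():
        a = after.get(url, {})
        report[url] = {
            "clicks_delta": a.get("clicks", 0) - b.get("clicks", 0),
            "impressions_delta": a.get("impressions", 0) - b.get("impressions", 0),
            # Position: lower is better, so improvement = before - after.
            "position_delta": round(b.get("position", 0) - a.get("position", 0), 1),
        }
    return report
```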
Output
- Fix log: every change made, which URL, what was changed, verification status
- Re-indexing request log: every URL submitted, timestamp
- Before/after comparison: metrics for each fixed page
- Remaining issues: any issues that could not be fixed (e.g., require human intervention or product changes)
Triggers
- Run after each technical-seo-crawl-audit completes
- At Smoke level: one-time execution, human reviews each fix before deployment
- At Baseline level: weekly execution, agent implements low/medium fixes autonomously, flags high/critical for human review
- At Scalable level: continuous execution, agent implements all fixes autonomously with revert capability