Technical SEO Audits
Overview
Technical SEO Audits are systematic evaluations of a website's technical foundation to identify and resolve issues that affect search engine crawling, indexing, ranking, and overall site performance.
What is a Technical SEO Audit?
A Technical SEO Audit is a comprehensive analysis that examines the technical aspects of a website that impact its search engine visibility. Unlike content or link audits, technical audits focus on infrastructure, code quality, site architecture, and performance factors that enable or hinder search engines from discovering, understanding, and ranking your content.
Why Technical SEO Audits Matter
- Uncover Hidden Problems: Identify issues invisible to casual observation
- Improve Search Rankings: Technical issues often cap ranking potential
- Maximize Crawl Efficiency: Ensure search engines can access all important content
- Enhance User Experience: Technical health directly affects UX
- Prevent Revenue Loss: Technical problems can cost thousands in lost conversions
- Maintain Competitive Edge: Stay ahead of competitors' technical implementations
- Support Business Growth: Scale without technical debt
Frequency of Audits
When to Conduct Audits
Comprehensive Audits:
- Before major site launches or migrations
- Quarterly for large, complex sites
- Annually for smaller sites
- After significant traffic drops
- Before major marketing campaigns
Ongoing Monitoring:
- Weekly: Search Console review
- Monthly: Performance metrics check
- Continuous: Automated monitoring alerts (a minimal sketch follows this list)
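The continuous tier can start as a small script. A minimal sketch, assuming Node 18+ (global `fetch`); the URL list and the `sendAlert` hook are placeholders to wire into email or Slack:

```js
// Minimal continuous monitor: alert when key URLs stop returning 200.
const URLS = ['https://example.com/', 'https://example.com/sitemap.xml'];

async function checkUrls() {
  for (const url of URLS) {
    try {
      // redirect: 'manual' also surfaces unexpected redirects as alerts
      const res = await fetch(url, {redirect: 'manual'});
      if (res.status !== 200) sendAlert(`${url} returned ${res.status}`);
    } catch (err) {
      sendAlert(`${url} unreachable: ${err.message}`);
    }
  }
}

function sendAlert(message) {
  // Placeholder: replace with an email or Slack webhook integration
  console.warn('[SEO ALERT]', message);
}

setInterval(checkUrls, 15 * 60 * 1000); // every 15 minutes
```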
Types of Technical SEO Audits
1. Initial Audit (New Site or First Audit)
Comprehensive baseline assessment covering all technical aspects.
Scope:
- Complete site crawl
- All technical elements
- Historical data review
- Competitor benchmarking
- Detailed documentation
Timeline: 2-4 weeks for thorough analysis
2. Maintenance Audit (Regular Check-ups)
Focused reviews of key metrics and common issues.
Scope:
- Critical issues check
- Performance metrics
- New errors since last audit
- Quick wins identification
Timeline: 1-2 days per audit
3. Migration Audit (Pre/Post Migration)
Specialized audit performed around major site changes such as migrations, redesigns, or platform moves.
Pre-Migration:
- Document current state
- Identify redirect mapping
- Backup current data
- Set success metrics
Post-Migration:
- Verify redirects (see the sketch after this list)
- Check indexing
- Monitor traffic
- Fix immediate issues
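Redirect verification in particular lends itself to automation. A minimal sketch, assuming Node 18+ and a redirect map exported from the pre-migration crawl (the URLs here are illustrative):

```js
// Verify that each legacy URL 301-redirects to its mapped destination
const redirectMap = {
  'https://example.com/old-page': 'https://example.com/new-page',
  // ...the rest of the mapping from the pre-migration crawl
};

async function verifyRedirects(map) {
  const failures = [];
  for (const [oldUrl, expected] of Object.entries(map)) {
    const res = await fetch(oldUrl, {redirect: 'manual'});
    const location = res.headers.get('location');
    // Resolve relative Location headers against the requested URL
    if (res.status !== 301 || !location ||
        new URL(location, oldUrl).href !== expected) {
      failures.push({oldUrl, status: res.status, location});
    }
  }
  return failures; // an empty array means every redirect checked out
}

verifyRedirects(redirectMap).then(f => console.log(f.length, 'failures', f));
```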
4. Specialized Audits
Focus on specific technical areas.
Examples:
- JavaScript SEO audit
- International SEO audit
- Mobile-specific audit
- Speed optimization audit
- Security audit
Audit Process Framework
Phase 1: Preparation (1-2 Days)
1.1 Gather Access and Credentials
Access Checklist:
- [ ] Google Search Console (Owner or Full access)
- [ ] Google Analytics (Admin or Edit access)
- [ ] CMS backend access
- [ ] FTP/SFTP credentials
- [ ] Server log access
- [ ] CDN dashboard
- [ ] Hosting control panel
- [ ] Development/staging environment
1.2 Gather Historical Data
```js
// Example: fetch Search Console data. Assumes a service account that has
// been granted access to the property; credentials are picked up from the
// GOOGLE_APPLICATION_CREDENTIALS environment variable.
const {google} = require('googleapis');

const auth = new google.auth.GoogleAuth({
  scopes: ['https://www.googleapis.com/auth/webmasters.readonly'],
});

async function getSearchConsoleData(siteUrl, startDate, endDate) {
  const searchconsole = google.searchconsole({version: 'v1', auth});
  const response = await searchconsole.searchanalytics.query({
    siteUrl: siteUrl,
    requestBody: {
      startDate: startDate,
      endDate: endDate,
      dimensions: ['page', 'query'],
      rowLimit: 25000, // API maximum per request
    },
  });
  return response.data;
}
```
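Usage is a single call once the service account is authorized on the property (the dates below are illustrative):

```js
// Pull three months of page/query data for the property
getSearchConsoleData('https://example.com/', '2024-01-01', '2024-03-31')
  .then(data => console.log(`${data.rows?.length ?? 0} rows returned`));
```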
1.3 Set Audit Objectives
Define what you're trying to achieve:
- Improve organic traffic by X%
- Reduce crawl errors by Y%
- Improve Core Web Vitals scores
- Prepare for site migration
- Fix indexing issues
Phase 2: Discovery (3-5 Days)
2.1 Site Crawl
```bash
# Screaming Frog CLI example. The Linux binary is screamingfrogseospider
# (on Windows, ScreamingFrogSEOSpiderCli.exe). Crawl limits such as depth
# are set in a saved configuration file rather than a CLI flag; the config
# filename below is illustrative.
screamingfrogseospider --crawl https://example.com \
  --headless \
  --output-folder ./audit-results \
  --export-tabs "Internal:All,External:All,Response Codes:All" \
  --config ./audit.seospiderconfig
```
Crawl Configuration:
- Set appropriate crawl limits
- Configure user agent
- Enable JavaScript rendering (if needed)
- Set up custom extractions
- Configure API connections
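It also pays to confirm what robots.txt actually blocks before crawling. A minimal sketch, assuming Node 18+; it handles only the global `User-agent: *` group with simple prefix rules, where production parsers handle much more:

```js
// Test whether a path is disallowed for all user agents by robots.txt
async function isDisallowed(origin, path) {
  const res = await fetch(`${origin}/robots.txt`);
  if (!res.ok) return false; // no robots.txt means nothing is blocked

  let inGlobalGroup = false;
  const disallows = [];
  for (const raw of (await res.text()).split('\n')) {
    const line = raw.split('#')[0].trim(); // strip comments
    const [field, ...rest] = line.split(':');
    const value = rest.join(':').trim();
    if (/^user-agent$/i.test(field)) inGlobalGroup = (value === '*');
    else if (inGlobalGroup && /^disallow$/i.test(field) && value) disallows.push(value);
  }
  // Simple prefix matching; wildcards (* and $) are not handled here
  return disallows.some(rule => path.startsWith(rule));
}

isDisallowed('https://example.com', '/private/page').then(console.log);
```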
2.2 Performance Analysis
```js
// Lighthouse audit driven through Puppeteer's Chrome instance.
// (Lighthouse v10+ is ESM-only; use `import` there instead of `require`.)
const lighthouse = require('lighthouse');
const puppeteer = require('puppeteer');

async function auditPerformance(url) {
  const browser = await puppeteer.launch();
  const {lhr} = await lighthouse(url, {
    port: new URL(browser.wsEndpoint()).port,
    output: 'json',
    onlyCategories: ['performance', 'seo', 'accessibility'],
  });
  await browser.close();

  return {
    // Category scores are 0-1; scale to 0-100 for reporting
    performance: lhr.categories.performance.score * 100,
    seo: lhr.categories.seo.score * 100,
    accessibility: lhr.categories.accessibility.score * 100,
    metrics: {
      // Raw metric values in milliseconds (CLS is unitless)
      fcp: lhr.audits['first-contentful-paint'].numericValue,
      lcp: lhr.audits['largest-contentful-paint'].numericValue,
      cls: lhr.audits['cumulative-layout-shift'].numericValue,
      tti: lhr.audits['interactive'].numericValue,
    },
  };
}
```
2.3 Log File Analysis
```python
import pandas as pd
from collections import Counter

def analyze_log_files(log_file_path):
    """Analyze server logs for crawl patterns.

    Assumes the raw access log has been pre-parsed into a space-delimited
    file with the columns: url, user_agent, status_code, response_time.
    """
    logs = pd.read_csv(log_file_path, sep=' ')

    # Filter for search engine bots
    bots = logs[logs['user_agent'].str.contains('Googlebot|Bingbot', na=False)]

    # Crawl frequency per URL
    crawl_frequency = bots['url'].value_counts()

    # Status code distribution
    status_codes = Counter(bots['status_code'])

    # Response time analysis
    avg_response_time = bots['response_time'].mean()

    return {
        'total_bot_requests': len(bots),
        'unique_urls_crawled': bots['url'].nunique(),
        'status_codes': dict(status_codes),
        'avg_response_time': avg_response_time,
        'top_crawled_pages': crawl_frequency.head(20),
    }
```
2.4 Search Console Analysis
```js
// The Search Console API does not expose the Coverage report in bulk;
// the URL Inspection API checks one URL at a time instead.
async function inspectUrl(siteUrl, pageUrl) {
  const searchconsole = google.searchconsole({version: 'v1', auth}); // auth as in 1.2
  const response = await searchconsole.urlInspection.index.inspect({
    requestBody: {
      inspectionUrl: pageUrl,
      siteUrl: siteUrl,
    },
  });
  return response.data;
}
```
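The URL Inspection API is quota-limited per property, so it suits spot checks on priority URLs rather than full-site sweeps:

```js
// Spot-check indexing status for a handful of priority URLs
async function spotCheck() {
  const priorityUrls = ['https://example.com/', 'https://example.com/pricing'];
  for (const url of priorityUrls) {
    const data = await inspectUrl('https://example.com/', url);
    console.log(url, data.inspectionResult?.indexStatusResult?.coverageState);
  }
}
```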
Phase 3: Analysis (3-5 Days)
3.1 Categorize Issues by Severity
Critical (P0) - Fix Immediately:
- Site not indexable
- Major security vulnerabilities
- Complete site inaccessibility
- Widespread 5XX errors
- Penalty risks
High (P1) - Fix Within 1-2 Weeks:
- Duplicate content issues
- Missing canonical tags
- Significant crawl errors
- Poor Core Web Vitals
- Mobile usability failures
Medium (P2) - Fix Within 1 Month:
- Missing meta descriptions
- Image optimization
- Redirect chains
- Minor structured data errors
- Non-critical performance issues
Low (P3) - Ongoing Optimization:
- Content improvements
- Internal linking refinements
- Advanced schema markup
- Minor UX enhancements
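One way to make this taxonomy operational is to log each finding as a structured record (the fields here are illustrative); raw counts can then be normalized before feeding the scoring functions below:

```js
// Illustrative issue record for the audit inventory
const SEVERITY = {P0: 0, P1: 1, P2: 2, P3: 3};

const exampleIssue = {
  id: 'TSA-042',
  title: 'Canonical tags missing on product pages',
  severity: SEVERITY.P1,
  pageCount: 1200,          // affected URLs (raw count)
  avgMonthlySearches: 8500, // traffic potential (raw count)
  avgPosition: 14,          // current average SERP position
  competitionScore: 30,     // 0-100
  developmentHours: 4,      // fix complexity, capped at 10 for scoring
  owner: 'engineering',
  status: 'open',
};
```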
3.2 Impact Assessment
```js
// Calculate a rough impact score. Inputs are assumed to be pre-normalized:
// affectedPages and trafficPotential to 0-100, currentRanking as an average
// SERP position (1-100), competitionLevel 0-100, fixComplexity 0-10.
function calculateImpact(issue) {
  const factors = {
    affectedPages: issue.pageCount,
    trafficPotential: issue.avgMonthlySearches,
    currentRanking: issue.avgPosition,
    competitionLevel: issue.competitionScore,
    fixComplexity: issue.developmentHours,
  };

  // Weighted impact score: reach and demand dominate, ranking headroom next
  const impactScore =
    (factors.affectedPages * 0.3) +
    (factors.trafficPotential * 0.3) +
    ((100 - factors.currentRanking) * 0.2) +
    (factors.competitionLevel * 0.1) +
    ((10 - factors.fixComplexity) * 0.1);

  return impactScore;
}
```
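A worked example with pre-normalized inputs (values are illustrative):

```js
// Reach and demand dominate, so a cheap, wide-reaching fix scores well
const score = calculateImpact({
  pageCount: 60,           // normalized to 0-100
  avgMonthlySearches: 45,  // normalized to 0-100
  avgPosition: 14,
  competitionScore: 30,
  fixComplexity: 4,
});
// (60*0.3) + (45*0.3) + (86*0.2) + (30*0.1) + (6*0.1) = 52.3
console.log(score.toFixed(1)); // "52.3"
```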
3.3 Root Cause Analysis
For each major issue, determine:
- What is the technical cause?
- When did it start?
- What's the business impact?
- What systems are affected?
- What's the fix complexity?
Phase 4: Recommendations (2-3 Days)
4.1 Create Action Plan
```markdown
## Issue: Slow Page Load Times
### Current State
- Average LCP: 4.2s (Target: <2.5s)
- 73% of pages fail Core Web Vitals
- Affecting 12,500 URLs
### Root Causes
1. Unoptimized images (avg 2.5MB per page)
2. Render-blocking CSS (3 external stylesheets)
3. No CDN implementation
4. Server response time: 1.2s
### Recommended Solutions
1. **Image Optimization** (Priority: High, Effort: Medium)
- Implement WebP format
- Add lazy loading
- Compress existing images
- Expected improvement: -40% LCP
2. **CSS Optimization** (Priority: High, Effort: Low)
- Inline critical CSS
- Defer non-critical CSS
- Expected improvement: -25% LCP
3. **CDN Implementation** (Priority: High, Effort: Medium)
- Deploy Cloudflare or similar
- Expected improvement: -20% LCP
4. **Server Optimization** (Priority: Medium, Effort: High)
- Upgrade hosting plan
- Implement caching
- Expected improvement: -15% LCP
### Implementation Plan
- Week 1: CSS optimization (Quick win)
- Week 2-3: Image optimization
- Week 4: CDN setup
- Month 2: Server upgrade evaluation
### Success Metrics
- LCP < 2.5s for 75% of page loads
- 90% of pages pass Core Web Vitals
- 15% increase in organic sessions
- 10% reduction in bounce rate
```
4.2 Prioritization Matrix
```js
// Prioritize issues using effort vs. impact
function prioritizeIssues(issues) {
  return issues
    .map(issue => ({
      ...issue,
      priority: calculatePriority(issue.impact, issue.effort),
    }))
    .sort((a, b) => b.priority - a.priority);
}

function calculatePriority(impact, effort) {
  // High impact, low effort = highest priority
  // impact: 1-10, effort: 1-10 (10 = most effort)
  return (impact * 2) - effort;
}
```
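For example, a high-impact, low-effort fix outranks a slightly less impactful but expensive one:

```js
const ranked = prioritizeIssues([
  {name: 'Fix robots.txt block', impact: 9, effort: 2},   // priority 16
  {name: 'Rebuild image pipeline', impact: 8, effort: 7}, // priority 9
]);
console.log(ranked.map(i => i.name));
// ['Fix robots.txt block', 'Rebuild image pipeline']
```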
Phase 5: Reporting (1-2 Days)
5.1 Executive Summary
```markdown
# Technical SEO Audit: Executive Summary
## Key Findings
- **Overall Health Score**: 72/100 (Industry Average: 65)
- **Critical Issues**: 3
- **High Priority Issues**: 12
- **Estimated Traffic Impact**: +35% within 6 months
## Top 3 Issues
1. **Core Web Vitals Failure** - Affecting 73% of pages
2. **Duplicate Content** - 2,450 URLs with duplicates
3. **Broken Internal Links** - 850 404 errors
## Recommended Actions
1. Immediate: Fix critical indexing blocks
2. Week 1-2: Optimize Core Web Vitals
3. Month 1: Resolve duplicate content
4. Ongoing: Performance monitoring
## Expected Outcomes
- 90% improvement in Core Web Vitals compliance
- 35% increase in indexed pages
- 25% reduction in crawl errors
- 15-20% increase in organic traffic
```
5.2 Detailed Report Structure
```markdown
# Technical SEO Audit Report
## Table of Contents
1. Executive Summary
2. Methodology
3. Current State Analysis
4. Issue Inventory
5. Prioritized Recommendations
6. Implementation Roadmap
7. Monitoring & Maintenance Plan
8. Appendices
## 1. Executive Summary
[High-level overview]
## 2. Methodology
- Tools used
- Scope of audit
- Data collection period
- Limitations
## 3. Current State Analysis
### 3.1 Crawlability & Indexability
- Indexed pages: X
- Crawl errors: Y
- Blocked resources: Z
### 3.2 Site Architecture
- URL structure assessment
- Internal linking analysis
- Site depth evaluation
### 3.3 Performance
- Core Web Vitals scores
- Page speed metrics
- Mobile performance
### 3.4 Technical Infrastructure
- Server configuration
- Security implementation
- Mobile optimization
## 4. Issue Inventory
[Detailed list of all issues]
## 5. Prioritized Recommendations
[Action items by priority]
## 6. Implementation Roadmap
[Timeline and resource allocation]
## 7. Monitoring & Maintenance Plan
[Ongoing tasks and KPIs]
## 8. Appendices
- Raw data exports
- Screenshots
- Code examples
```
Phase 6: Implementation (Ongoing)
6.1 Quick Wins (Week 1)
```html
<!-- Example: client-side fallback for a missing meta description.
     This is a stopgap; generating descriptions server-side is the robust
     fix, since not all crawlers execute JavaScript. -->
<script>
  // Add a meta description only if the document doesn't already have one
  if (!document.head.querySelector('meta[name="description"]')) {
    const meta = document.createElement('meta');
    meta.name = 'description';
    // generateDescription() is a site-specific helper that summarizes the page
    meta.content = generateDescription(document.body.textContent);
    document.head.appendChild(meta);
  }
</script>
```
6.2 Sprint Planning
```markdown
## Sprint 1 (2 Weeks): Critical Fixes
- [ ] Fix robots.txt blocking issues
- [ ] Resolve server errors (5XX)
- [ ] Implement canonical tags
- [ ] Fix mobile usability errors
## Sprint 2 (2 Weeks): High Priority
- [ ] Optimize Core Web Vitals
- [ ] Fix redirect chains
- [ ] Implement structured data
- [ ] Resolve duplicate content
## Sprint 3 (2 Weeks): Performance
- [ ] Image optimization
- [ ] JavaScript optimization
- [ ] CDN implementation
- [ ] Caching strategy
```
Phase 7: Validation (Ongoing)
7.1 Automated Testing
```js
// Continuous monitoring with Lighthouse (run on a schedule or in CI)
const fs = require('fs');
const lighthouse = require('lighthouse');
const chromeLauncher = require('chrome-launcher');

async function validateFixes(urls) {
  // Lighthouse needs a running Chrome instance to drive
  const chrome = await chromeLauncher.launch({chromeFlags: ['--headless']});
  const results = [];
  for (const url of urls) {
    const result = await lighthouse(url, {
      port: chrome.port,
      onlyCategories: ['performance', 'seo'],
    });
    results.push({
      url,
      performance: result.lhr.categories.performance.score,
      seo: result.lhr.categories.seo.score,
      timestamp: new Date(),
    });
  }
  await chrome.kill();

  // Save results
  fs.writeFileSync('validation-results.json', JSON.stringify(results, null, 2));

  // Check whether improvements meet targets (scores are 0-1)
  const belowThreshold = results.filter(r => r.performance < 0.9 || r.seo < 0.9);
  if (belowThreshold.length > 0) {
    console.warn('Some URLs below threshold:', belowThreshold);
  }
  return results;
}
```
7.2 Monitoring Dashboard
```js
// Skeleton monitoring dashboard. The fetch* and sendAlerts methods are
// placeholders to implement against the Search Console and analytics APIs.
class SEODashboard {
  constructor() {
    this.metrics = {
      crawlErrors: 0,
      indexedPages: 0,
      coreWebVitals: {},
      organicTraffic: 0,
    };
  }

  async updateMetrics() {
    this.metrics.crawlErrors = await this.fetchCrawlErrors();
    this.metrics.indexedPages = await this.fetchIndexedPages();
    this.metrics.coreWebVitals = await this.fetchCWV();
    this.metrics.organicTraffic = await this.fetchTraffic();
  }

  async checkAlerts() {
    const alerts = [];
    if (this.metrics.crawlErrors > 100) {
      alerts.push('High number of crawl errors detected');
    }
    if (this.metrics.coreWebVitals.lcp > 2.5) { // LCP in seconds here
      alerts.push('LCP exceeds threshold');
    }
    if (alerts.length > 0) {
      await this.sendAlerts(alerts);
    }
  }
}
```
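Once the placeholder fetchers are implemented, the dashboard can run on a timer:

```js
// Refresh metrics hourly and evaluate alert rules after each update
const dashboard = new SEODashboard();
setInterval(async () => {
  await dashboard.updateMetrics();
  await dashboard.checkAlerts();
}, 60 * 60 * 1000);
```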
Common Audit Findings
Finding 1: Indexing Issues
Symptoms:
- Low number of indexed pages
- Important pages not in index
- Search Console coverage errors
Common Causes:
- Noindex tags on important pages
- Robots.txt blocking
- Orphan pages
- Redirect issues
Diagnostic Query:
```
site:example.com
```
Fix:
```html
<!-- Before (blocks indexing): -->
<!-- <meta name="robots" content="noindex"> -->

<!-- After (indexable): -->
<meta name="robots" content="index, follow">
```
(`index, follow` is the default behavior, so simply removing the noindex tag is enough.)
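Note that noindex can also arrive via the `X-Robots-Tag` HTTP header, so audits should check both sources. A minimal sketch, assuming Node 18+ (the regex check is crude; an HTML parser is more robust):

```js
// Flag URLs noindexed via the meta tag or the X-Robots-Tag header
async function findNoindexed(urls) {
  const flagged = [];
  for (const url of urls) {
    const res = await fetch(url);
    const header = res.headers.get('x-robots-tag') || '';
    const html = await res.text();
    const metaNoindex = /<meta[^>]+name=["']robots["'][^>]*noindex/i.test(html);
    if (metaNoindex || /noindex/i.test(header)) flagged.push(url);
  }
  return flagged;
}

findNoindexed(['https://example.com/important-page']).then(console.log);
```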
Finding 2: Duplicate Content
Detection Methods:
```python
import hashlib

def detect_duplicates(urls_and_content):
    """Find exact duplicate content by hashing each page body.

    Hashing only catches byte-identical duplicates; near-duplicates
    require techniques such as shingling or SimHash.
    """
    hashes = {}
    duplicates = []
    for url, content in urls_and_content:
        content_hash = hashlib.md5(content.encode()).hexdigest()
        if content_hash in hashes:
            duplicates.append({
                'duplicate': url,
                'original': hashes[content_hash],
            })
        else:
            hashes[content_hash] = url
    return duplicates
```
Solutions:
- Canonical tags
- 301 redirects
- URL parameter handling
- Content consolidation
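When consolidating, a common heuristic is to canonicalize each duplicate cluster to its HTTPS, parameter-free, shortest URL. A sketch of one reasonable policy, not a universal rule:

```js
// Pick a canonical URL from a cluster of duplicate URLs
function pickCanonical(cluster) {
  return cluster.slice().sort((a, b) => {
    const ua = new URL(a), ub = new URL(b);
    // Prefer HTTPS over HTTP
    if ((ua.protocol === 'https:') !== (ub.protocol === 'https:'))
      return ua.protocol === 'https:' ? -1 : 1;
    // Prefer URLs without query parameters
    if ((ua.search === '') !== (ub.search === ''))
      return ua.search === '' ? -1 : 1;
    // Prefer the shorter URL
    return a.length - b.length;
  })[0];
}

pickCanonical([
  'http://example.com/shoes?ref=nav',
  'https://example.com/shoes',
]); // => 'https://example.com/shoes'
```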
Finding 3: Poor Core Web Vitals
Analysis:
```js
// Analyze CWV failures against the "good" thresholds
// (LCP <= 2.5 s, INP <= 200 ms, CLS <= 0.1)
function analyzeCWVFailures(pages) {
  const failures = {
    lcp: pages.filter(p => p.lcp > 2500), // ms
    inp: pages.filter(p => p.inp > 200),  // ms
    cls: pages.filter(p => p.cls > 0.1),
  };

  // findCommonIssues() is a site-specific helper that tags failing pages
  // with likely causes drawn from the candidate list
  const patterns = {
    lcp: findCommonIssues(failures.lcp, ['images', 'fonts', 'render-blocking']),
    inp: findCommonIssues(failures.inp, ['javascript', 'third-party']),
    cls: findCommonIssues(failures.cls, ['images', 'ads', 'fonts']),
  };

  return { failures, patterns };
}
```
Audit Tools Comparison
| Tool | Type | Best For | Price | Key Features |
|---|---|---|---|---|
| Screaming Frog | Desktop | Detailed crawls | Free (500 URL limit) / £149 per year | Unlimited crawls (paid), custom extraction |
| Sitebulb | Desktop | Visual reports | $35-140/month | Beautiful visualizations |
| Ahrefs | Cloud | All-in-one | $129+/month | Backlinks + technical |
| Semrush | Cloud | Competitor analysis | $139+/month | Comprehensive suite |
| DeepCrawl (now Lumar) | Cloud | Enterprise | Custom | Advanced automation |
| Lighthouse | CLI/Browser | Performance | Free | Core Web Vitals |
Audit Deliverables
Minimum Deliverables
- Executive Summary (1-2 pages)
- Issue Inventory (Spreadsheet)
- Prioritized Action Plan
- Implementation Timeline
Comprehensive Deliverables
- Full Audit Report (20-50 pages)
- Issue Database (Airtable/Spreadsheet)
- Visual Reports (Dashboards)
- Technical Documentation
- Training Materials
- Monitoring Setup
- Follow-up Schedule