How to Automate SEO Audits in Django with Python and the SEOPeek API
Django powers everything from content-heavy publications to SaaS dashboards. But most Django teams treat SEO as a one-time checklist instead of a continuous process. Pages change, templates get refactored, CMS content gets updated—and SEO regressions slip through unnoticed. This guide shows you how to wire the SEOPeek API into your Django project so that every page is audited automatically, every regression is caught early, and every score is stored for historical tracking.
1. Calling the SEOPeek API with Python requests
The SEOPeek API is a single REST endpoint. You send a GET request with a url parameter, and it returns a JSON object with a numeric score, a letter grade, and pass/fail results for 20+ on-page SEO checks. No API key is required for the free tier.
Install the requests library if you do not already have it:
pip install requests
Here is the simplest possible audit call in Python:
import requests
SEOPEEK_API = "https://us-central1-todd-agent-prod.cloudfunctions.net/seopeekApi/api/v1/audit"
def audit_url(url: str) -> dict:
"""Run an SEO audit on a single URL and return the results."""
response = requests.get(SEOPEEK_API, params={"url": url}, timeout=30)
response.raise_for_status()
return response.json()
# Example usage
result = audit_url("https://yoursite.com/blog/my-post/")
print(f"Score: {result['score']}/100 ({result['grade']})")
# Inspect individual checks
for check_name, check_data in result.get("checks", {}).items():
status = "PASS" if check_data["pass"] else "FAIL"
print(f" [{status}] {check_name}: {check_data['message']}")
The response JSON looks like this:
{
"url": "https://yoursite.com/blog/my-post/",
"score": 87,
"grade": "B+",
"checks": {
"title": {"pass": true, "message": "Title tag exists and is 45 characters"},
"meta_description": {"pass": true, "message": "Meta description is 152 characters"},
"h1": {"pass": true, "message": "Single H1 tag found"},
"og_tags": {"pass": false, "message": "Missing og:image tag"},
"canonical": {"pass": true, "message": "Canonical URL is set"},
"structured_data": {"pass": true, "message": "JSON-LD schema detected"},
...
}
}
Tip: Wrap the API call in a reusable utility module (e.g., seo/client.py) so that your management commands, middleware, and tests all share the same client code. This also makes it easy to swap in your API key later when you upgrade to a paid plan.
2. Django Management Command for Bulk SEO Auditing
The most practical way to run SEO audits in Django is with a custom management command. You can run it manually, schedule it with cron, or trigger it from your CI/CD pipeline. This command accepts a list of URLs (or reads them from a file) and audits each one.
Create the file at yourapp/management/commands/seo_audit.py:
import time
import json
import requests
from django.core.management.base import BaseCommand, CommandError
SEOPEEK_API = (
"https://us-central1-todd-agent-prod.cloudfunctions.net"
"/seopeekApi/api/v1/audit"
)
class Command(BaseCommand):
help = "Run SEO audits on a list of URLs using the SEOPeek API"
def add_arguments(self, parser):
parser.add_argument(
"urls",
nargs="*",
help="URLs to audit (space-separated)",
)
parser.add_argument(
"--file", "-f",
type=str,
help="Path to a file containing URLs (one per line)",
)
parser.add_argument(
"--min-score",
type=int,
default=70,
help="Minimum acceptable SEO score (default: 70)",
)
parser.add_argument(
"--output",
type=str,
help="Write JSON results to this file",
)
parser.add_argument(
"--delay",
type=float,
default=1.0,
help="Seconds to wait between requests (default: 1.0)",
)
def handle(self, *args, **options):
urls = list(options["urls"])
# Load URLs from file if provided
if options["file"]:
try:
with open(options["file"]) as f:
urls.extend(
line.strip()
for line in f
if line.strip() and not line.startswith("#")
)
except FileNotFoundError:
raise CommandError(f"File not found: {options['file']}")
if not urls:
raise CommandError(
"Provide URLs as arguments or with --file"
)
min_score = options["min_score"]
results = []
failures = []
self.stdout.write(
f"\nAuditing {len(urls)} URL(s) "
f"(minimum score: {min_score})\n"
)
self.stdout.write("-" * 60)
for i, url in enumerate(urls):
try:
resp = requests.get(
SEOPEEK_API,
params={"url": url},
timeout=30,
)
resp.raise_for_status()
data = resp.json()
score = data.get("score", 0)
grade = data.get("grade", "?")
checks = data.get("checks", {})
failing = [
k for k, v in checks.items()
if not v.get("pass")
]
result = {
"url": url,
"score": score,
"grade": grade,
"failing_checks": failing,
}
results.append(result)
if score < min_score:
failures.append(result)
self.stdout.write(
self.style.ERROR(
f" FAIL {url} — {score}/100 ({grade}) "
f"— {len(failing)} checks failing"
)
)
for check in failing:
msg = checks[check].get("message", "")
self.stdout.write(
self.style.WARNING(
f" - {check}: {msg}"
)
)
else:
self.stdout.write(
self.style.SUCCESS(
f" PASS {url} — {score}/100 ({grade})"
)
)
except requests.RequestException as e:
self.stdout.write(
self.style.ERROR(
f" ERROR {url} — {e}"
)
)
results.append({
"url": url,
"score": None,
"error": str(e),
})
# Rate limiting between requests
if i < len(urls) - 1:
time.sleep(options["delay"])
# Write JSON output if requested
if options["output"]:
with open(options["output"], "w") as f:
json.dump(results, f, indent=2)
self.stdout.write(
f"\nResults written to {options['output']}"
)
# Summary
self.stdout.write(f"\n{'=' * 60}")
self.stdout.write(
f"Audited: {len(urls)} | "
f"Passed: {len(urls) - len(failures)} | "
f"Failed: {len(failures)}"
)
if failures:
raise CommandError(
f"{len(failures)} URL(s) scored below {min_score}"
)
Run it like this:
# Audit specific URLs
python manage.py seo_audit https://yoursite.com/ https://yoursite.com/pricing/
# Audit URLs from a file
python manage.py seo_audit --file urls.txt --min-score 80
# Save results as JSON for CI/CD artifact storage
python manage.py seo_audit --file urls.txt --output seo-results.json
# Use in CI/CD — non-zero exit code on failure
python manage.py seo_audit --file urls.txt --min-score 75 || exit 1
CI/CD integration: Add python manage.py seo_audit --file critical-urls.txt --min-score 75 to your GitHub Actions workflow, GitLab CI pipeline, or Jenkins build step. The command exits with a non-zero code when any URL fails, which automatically breaks the build.
3. Django Middleware for Dev-Mode SEO Checking
During local development, it is useful to see SEO scores as you work on templates. This middleware hits the SEOPeek API on every page render (in development mode only) and injects the score into the response as an HTML comment and a custom header. One caveat: SEOPeek fetches the audited page itself, so the URL must be reachable from the public internet. A plain http://localhost:8000 address will not work — expose your dev server through a tunnel such as ngrok and browse via the tunnel URL, or point the middleware at a staging deployment.
Create seo/middleware.py:
import requests
from django.conf import settings
SEOPEEK_API = (
"https://us-central1-todd-agent-prod.cloudfunctions.net"
"/seopeekApi/api/v1/audit"
)
class SEOAuditMiddleware:
"""
Dev-only middleware that audits every HTML response with SEOPeek.
Adds an X-SEO-Score header and injects a comment before </body>.
Only active when DEBUG=True. Never enable this in production.
Add to MIDDLEWARE in settings.py:
'seo.middleware.SEOAuditMiddleware',
"""
def __init__(self, get_response):
self.get_response = get_response
def __call__(self, request):
response = self.get_response(request)
# Only run in debug mode, only for HTML responses
if not settings.DEBUG:
return response
if "text/html" not in response.get("Content-Type", ""):
return response
# Skip admin, static, and media URLs
skip_prefixes = ("/admin/", "/static/", "/media/")
if any(request.path.startswith(p) for p in skip_prefixes):
return response
try:
# Build the full URL for the current page
url = request.build_absolute_uri()
audit = requests.get(
SEOPEEK_API,
params={"url": url},
timeout=10,
)
audit.raise_for_status()
data = audit.json()
score = data.get("score", "?")
grade = data.get("grade", "?")
checks = data.get("checks", {})
failing = [
f"{k}: {v.get('message', '')}"
for k, v in checks.items()
if not v.get("pass")
]
# Add custom header
response["X-SEO-Score"] = f"{score}/100 ({grade})"
# Inject HTML comment before </body>
comment_lines = [
f"SEOPeek Audit: {score}/100 ({grade})",
f"Failing checks: {len(failing)}",
]
for fail in failing:
comment_lines.append(f" - {fail}")
comment_lines.append(
"https://seopeek.web.app — disable by "
"removing SEOAuditMiddleware"
)
comment = (
"\n<!-- \n"
+ "\n".join(comment_lines)
+ "\n-->\n"
)
content = response.content.decode("utf-8")
if "</body>" in content:
content = content.replace(
"</body>", comment + "</body>"
)
response.content = content.encode("utf-8")
response["Content-Length"] = len(response.content)
except Exception:
# Never break the page if the audit fails
pass
return response
Enable it in your settings.py:
MIDDLEWARE = [
"django.middleware.security.SecurityMiddleware",
# ... other middleware ...
"seo.middleware.SEOAuditMiddleware", # Add at the end
]
Now every page you visit in the browser during development will include an X-SEO-Score header (visible in DevTools) and an HTML comment at the bottom of the page showing the score and any failing checks. When you see a score drop, you know immediately that your latest template change broke something.
4. Auditing Every URL in Your Django Sitemap
Most Django sites use the built-in sitemap framework to generate sitemap.xml. That means your site already has a machine-readable list of every URL that matters for SEO. This management command reads your sitemap and audits every URL in it.
Create yourapp/management/commands/seo_audit_sitemap.py:
import time
import xml.etree.ElementTree as ET
import requests
from django.core.management.base import BaseCommand, CommandError
SEOPEEK_API = (
"https://us-central1-todd-agent-prod.cloudfunctions.net"
"/seopeekApi/api/v1/audit"
)
# XML namespace used in sitemap files
SITEMAP_NS = {"ns": "http://www.sitemaps.org/schemas/sitemap/0.9"}
class Command(BaseCommand):
help = "Audit all URLs in your Django sitemap with SEOPeek"
def add_arguments(self, parser):
parser.add_argument(
"sitemap_url",
type=str,
help=(
"URL of your sitemap.xml "
"(e.g., https://yoursite.com/sitemap.xml)"
),
)
parser.add_argument(
"--min-score",
type=int,
default=70,
help="Minimum acceptable SEO score (default: 70)",
)
parser.add_argument(
"--limit",
type=int,
default=0,
help="Max URLs to audit (0 = all, default: 0)",
)
parser.add_argument(
"--delay",
type=float,
default=1.5,
help="Seconds between API calls (default: 1.5)",
)
def handle(self, *args, **options):
sitemap_url = options["sitemap_url"]
# Fetch and parse the sitemap
self.stdout.write(f"Fetching sitemap: {sitemap_url}")
try:
resp = requests.get(sitemap_url, timeout=15)
resp.raise_for_status()
except requests.RequestException as e:
raise CommandError(f"Failed to fetch sitemap: {e}")
root = ET.fromstring(resp.content)
# Handle both sitemap index and regular sitemaps
urls = []
# Check if this is a sitemap index
sitemap_locs = root.findall(
"ns:sitemap/ns:loc", SITEMAP_NS
)
if sitemap_locs:
self.stdout.write(
f"Sitemap index found with "
f"{len(sitemap_locs)} sub-sitemaps"
)
for loc in sitemap_locs:
sub_resp = requests.get(
loc.text.strip(), timeout=15
)
sub_root = ET.fromstring(sub_resp.content)
for url_elem in sub_root.findall(
"ns:url/ns:loc", SITEMAP_NS
):
urls.append(url_elem.text.strip())
else:
for url_elem in root.findall(
"ns:url/ns:loc", SITEMAP_NS
):
urls.append(url_elem.text.strip())
if not urls:
raise CommandError("No URLs found in sitemap")
# Apply limit if set
limit = options["limit"]
if limit > 0:
urls = urls[:limit]
self.stdout.write(
f"Found {len(urls)} URLs. Starting audit...\n"
)
min_score = options["min_score"]
passed = 0
failed = 0
for i, url in enumerate(urls, 1):
try:
audit_resp = requests.get(
SEOPEEK_API,
params={"url": url},
timeout=30,
)
audit_resp.raise_for_status()
data = audit_resp.json()
score = data.get("score", 0)
grade = data.get("grade", "?")
if score >= min_score:
passed += 1
self.stdout.write(
self.style.SUCCESS(
f" [{i}/{len(urls)}] PASS "
f"{score}/100 ({grade}) {url}"
)
)
else:
failed += 1
self.stdout.write(
self.style.ERROR(
f" [{i}/{len(urls)}] FAIL "
f"{score}/100 ({grade}) {url}"
)
)
except requests.RequestException as e:
failed += 1
self.stdout.write(
self.style.ERROR(
f" [{i}/{len(urls)}] ERROR {url}: {e}"
)
)
if i < len(urls):
time.sleep(options["delay"])
self.stdout.write(f"\n{'=' * 60}")
self.stdout.write(
f"Total: {len(urls)} | "
f"Passed: {passed} | Failed: {failed}"
)
if failed:
raise CommandError(
f"{failed} URL(s) below minimum score"
)
Usage:
# Audit every URL in your sitemap
python manage.py seo_audit_sitemap https://yoursite.com/sitemap.xml
# Audit first 10 URLs with a higher threshold
python manage.py seo_audit_sitemap https://yoursite.com/sitemap.xml --limit 10 --min-score 85
# Slower rate for large sitemaps (respect rate limits)
python manage.py seo_audit_sitemap https://yoursite.com/sitemap.xml --delay 2.0
Free tier note: The free plan allows 50 audits per day. If your sitemap has more than 50 URLs, use the --limit flag to audit a subset, or upgrade to the Starter plan ($9/month) for 1,000 audits per month.
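If your sitemap is larger than the daily quota, one simple client-side workaround (a sketch, not a SEOPeek feature) is to rotate through the URL list in daily batches, so every page still gets audited once per cycle:

```python
import math
from datetime import date


def daily_batch(urls, batch_size=50, today=None):
    """Return today's slice of `urls`, rotating through the list so
    each URL is audited once every ceil(len(urls)/batch_size) days."""
    today = today or date.today()
    n_batches = max(1, math.ceil(len(urls) / batch_size))
    start = (today.toordinal() % n_batches) * batch_size
    return urls[start:start + batch_size]
```

Feed daily_batch(all_urls) into the seo_audit command's URL list from a daily cron job, and a 120-URL sitemap becomes a three-day rotation of at most 50 audits each.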
5. Storing Audit Results in Django Models
Running audits is only half the picture. To track SEO health over time, catch regressions early, and report on improvements, you need to store the results. Here is a Django model and a helper function that saves each audit to your database.
Create seo/models.py:
from django.db import models
from django.utils import timezone
class SEOAudit(models.Model):
"""Stores the result of an SEOPeek audit for a single URL."""
url = models.URLField(max_length=2000, db_index=True)
score = models.IntegerField()
grade = models.CharField(max_length=5)
checks_json = models.JSONField(
default=dict,
help_text="Raw checks object from SEOPeek API",
)
failing_checks = models.IntegerField(default=0)
audited_at = models.DateTimeField(default=timezone.now, db_index=True)
class Meta:
ordering = ["-audited_at"]
indexes = [
models.Index(fields=["url", "-audited_at"]),
]
verbose_name = "SEO Audit"
verbose_name_plural = "SEO Audits"
def __str__(self):
return (
f"{self.url} — {self.score}/100 ({self.grade}) "
f"@ {self.audited_at:%Y-%m-%d %H:%M}"
)
@property
def passed(self):
return self.score >= 70
@classmethod
def latest_for_url(cls, url):
"""Get the most recent audit for a URL."""
return cls.objects.filter(url=url).first()
@classmethod
def score_trend(cls, url, limit=10):
"""Get the last N scores for a URL (newest first)."""
return list(
cls.objects.filter(url=url)
.values_list("score", flat=True)[:limit]
)
Now create a service function in seo/services.py that calls the API and saves the result:
import requests
from .models import SEOAudit
SEOPEEK_API = (
"https://us-central1-todd-agent-prod.cloudfunctions.net"
"/seopeekApi/api/v1/audit"
)
def run_audit(url: str) -> SEOAudit:
"""
Audit a URL with SEOPeek and save the result
to the database. Returns the SEOAudit instance.
"""
response = requests.get(
SEOPEEK_API,
params={"url": url},
timeout=30,
)
response.raise_for_status()
data = response.json()
checks = data.get("checks", {})
failing_count = sum(
1 for v in checks.values() if not v.get("pass")
)
audit = SEOAudit.objects.create(
url=url,
score=data.get("score", 0),
grade=data.get("grade", "?"),
checks_json=checks,
failing_checks=failing_count,
)
return audit
def detect_regression(url: str, threshold: int = 5) -> dict | None:
"""
Compare the latest audit against the previous one.
Returns a dict with details if the score dropped
by more than `threshold` points, otherwise None.
"""
audits = list(
SEOAudit.objects.filter(url=url)
.order_by("-audited_at")[:2]
)
if len(audits) < 2:
return None
current, previous = audits
drop = previous.score - current.score
if drop >= threshold:
return {
"url": url,
"current_score": current.score,
"previous_score": previous.score,
"drop": drop,
"current_grade": current.grade,
"previous_grade": previous.grade,
"audited_at": current.audited_at.isoformat(),
}
return None
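Together, run_audit and detect_regression give you a simple alerting loop. Here is a sketch with the service functions and the alert channel passed in as parameters, which also keeps it easy to unit-test; in production you might pass django.core.mail.mail_admins wrapped in a one-argument lambda as send_alert:

```python
def audit_and_alert(urls, run_audit, detect_regression, send_alert,
                    threshold=5):
    """Audit each URL, then call `send_alert(message)` once if any
    page's score dropped by `threshold`+ points since its last audit.

    `run_audit` and `detect_regression` are the service functions
    defined above; `send_alert` is any callable taking one string.
    """
    regressions = []
    for url in urls:
        run_audit(url)  # saves a fresh SEOAudit row
        regression = detect_regression(url, threshold=threshold)
        if regression:
            regressions.append(regression)
    if regressions:
        message = "\n".join(
            f"{r['url']}: {r['previous_score']} -> "
            f"{r['current_score']} (-{r['drop']})"
            for r in regressions
        )
        send_alert(message)
    return regressions
```

Schedule it alongside your weekly sitemap audit and a silent five-point slide on the homepage becomes an email instead of a surprise in next quarter's traffic report.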
Register the model in your admin for quick access:
# seo/admin.py
from django.contrib import admin
from .models import SEOAudit
@admin.register(SEOAudit)
class SEOAuditAdmin(admin.ModelAdmin):
list_display = [
"url", "score", "grade",
"failing_checks", "audited_at",
]
list_filter = ["grade", "audited_at"]
search_fields = ["url"]
readonly_fields = ["checks_json"]
With this setup, you have a full historical record of every audit. You can query the database to answer questions like “when did the homepage score drop?”, “which pages have regressed in the last 7 days?”, or “what is the average SEO score across all our blog posts?”
# In a Django shell or view:
from seo.models import SEOAudit
from django.utils import timezone
from datetime import timedelta
# Pages scoring below 70 in the last week
week_ago = timezone.now() - timedelta(days=7)
low_scoring = SEOAudit.objects.filter(
    audited_at__gte=week_ago, score__lt=70
)
# Average score across the latest audit of each URL.
# Note: .distinct("url") is PostgreSQL-only, and Django does not
# allow aggregate() on a field-distinct queryset, so collect the
# latest-audit IDs first.
from django.db.models import Avg
latest_ids = (
    SEOAudit.objects.order_by("url", "-audited_at")
    .distinct("url")
    .values_list("id", flat=True)
)
avg = SEOAudit.objects.filter(
    id__in=list(latest_ids)
).aggregate(Avg("score"))
print(f"Average SEO score: {avg['score__avg']:.1f}")
6. SEOPeek API vs Manual Lighthouse and Screaming Frog Audits
If you are already using Lighthouse or Screaming Frog for SEO audits, here is how the SEOPeek API compares when integrated with Django:
| Feature | SEOPeek API | Lighthouse | Screaming Frog |
|---|---|---|---|
| Python integration | 1 HTTP call with requests | Requires headless Chrome + subprocess | Desktop app, no Python API |
| Django management command | 10 lines of code | 50+ lines + Chrome dependency | Not possible |
| CI/CD friendly | No dependencies, any runner | Needs Chrome installed on runner | Headless CLI, but needs licensed desktop install |
| Response time | 1–3 seconds | 15–45 seconds per page | Varies (crawl-based) |
| Output format | Clean JSON | JSON (verbose, 500KB+) | CSV export |
| On-page SEO focus | 20+ targeted SEO checks | Mixed (performance + SEO + a11y) | Comprehensive crawl |
| Bulk URL support | Loop with rate limiting | One URL at a time | Built for crawling |
| Cost | Free 50/day, $9/mo for 1K | Free (but compute cost) | $259/year license |
| Server-side use | REST API, no browser | Needs display or xvfb | Desktop only |
Lighthouse is excellent for performance and accessibility testing, but it is heavy. Each audit spins up a Chromium instance, takes 15–45 seconds, and produces a massive JSON report that is mostly irrelevant to on-page SEO. Running it in CI/CD means installing Chrome on your build runner and managing timeouts. In Python, you end up shelling out to a Node.js process, which is fragile.
Screaming Frog is the industry standard for site crawls, but it is a desktop application with no web API. It does offer a headless command-line mode, but that still requires installing the desktop app (and a paid licence for most automation features) on every machine that runs it, which makes it awkward to drive from a Django management command, a CI/CD pipeline, or any lightweight automated workflow. It is designed first and foremost for manual, human-driven audits.
SEOPeek fills the gap: a lightweight, REST-native SEO audit that returns clean JSON in under 3 seconds. It was built specifically for the use cases in this guide—management commands, middleware, CI/CD gates, and historical tracking. One requests.get() call and you have a structured result you can save to a model, assert against in a test, or render in a dashboard.
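For example, a deploy-gate test might look like this (the URLs and the 75-point threshold are illustrative, and the test hits the live API, so mark it as a network test in your suite):

```python
# tests/test_seo.py -- live-API smoke test (network required)
import requests

SEOPEEK_API = (
    "https://us-central1-todd-agent-prod.cloudfunctions.net"
    "/seopeekApi/api/v1/audit"
)
CRITICAL_PAGES = [
    "https://yoursite.com/",
    "https://yoursite.com/pricing/",
]


def fetch_score(url: str) -> int:
    """Return the SEOPeek score for a URL (live API call)."""
    resp = requests.get(SEOPEEK_API, params={"url": url}, timeout=30)
    resp.raise_for_status()
    return resp.json().get("score", 0)


def failures_below(scores: dict, min_score: int = 75) -> list:
    """Given {url: score}, return the URLs under the threshold."""
    return sorted(url for url, score in scores.items() if score < min_score)


def test_critical_pages_meet_seo_threshold():
    scores = {url: fetch_score(url) for url in CRITICAL_PAGES}
    assert failures_below(scores) == [], f"Below threshold: {scores}"
```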
7. Pricing and Rate Limits
SEOPeek offers three tiers, all using the same API endpoint:
| Plan | Audits | Price | Best for |
|---|---|---|---|
| Free | 50 / day | $0 | Solo devs, small sites, local dev middleware |
| Starter | 1,000 / month | $9/mo | Small teams, CI/CD integration, weekly sitemap audits |
| Pro | 10,000 / month | $29/mo | Agencies, large sites, daily bulk audits |
For the Django middleware in dev mode, the free tier is more than enough. For CI/CD pipelines that audit 5–10 pages per deploy, Starter covers most teams. If you are running daily sitemap audits across hundreds of pages, Pro is the right fit.
Frequently Asked Questions
Can I use the SEOPeek API with Django without any external libraries?
You only need the requests library, which many Django projects already have installed. Install it with pip install requests. The API returns standard JSON, so there is no SDK or special client library required. If you prefer to avoid requests, you can use Python's built-in urllib.request module, though the code will be slightly more verbose.
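A dependency-free version with urllib.request would look something like this (a sketch; note that you must URL-encode the query string yourself):

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

SEOPEEK_API = (
    "https://us-central1-todd-agent-prod.cloudfunctions.net"
    "/seopeekApi/api/v1/audit"
)


def build_request_url(url: str) -> str:
    """Build the audit request URL with an encoded query string."""
    return f"{SEOPEEK_API}?{urlencode({'url': url})}"


def audit_url(url: str, timeout: int = 30) -> dict:
    """Audit a URL using only the Python standard library."""
    with urlopen(build_request_url(url), timeout=timeout) as resp:
        return json.loads(resp.read().decode("utf-8"))
```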
How many SEO audits can I run for free?
The free tier provides 50 audits per day, which resets at midnight UTC. This is enough for local development with the middleware enabled, or for auditing your most critical pages in CI/CD. For higher volumes, the Starter plan at $9/month gives you 1,000 audits, and the Pro plan at $29/month gives you 10,000.
Is the Django middleware safe for production?
No. The middleware example is designed for development mode only. It checks settings.DEBUG before running and will not execute when DEBUG=False. In development, it adds 1–3 seconds of latency per request (the time it takes to call the API). In production, you should run audits through the management command on a schedule instead.
How does this compare to running Lighthouse from Python?
Lighthouse requires a headless Chrome instance and a Node.js runtime. In Python, you typically shell out to the lighthouse CLI using subprocess, which is fragile, slow (15–45 seconds per page), and requires Chrome to be installed on your server or CI runner. SEOPeek is a single HTTP GET request that returns focused on-page SEO data in JSON format in 1–3 seconds, with zero system dependencies beyond requests.
Can I audit all URLs from my Django sitemap automatically?
Yes. This guide includes a complete management command (seo_audit_sitemap) that fetches your sitemap.xml, extracts every URL, and audits each one. It handles both regular sitemaps and sitemap indexes. You can run it on a schedule with cron or as a scheduled task in your deployment pipeline.
Start Auditing Your Django App
50 free audits per day. No API key required to get started.
Drop SEOPeek into your next manage.py command in under 5 minutes.