Automated SEO Monitoring for Agencies: Build It with One API Call
You manage SEO for 10, 20, maybe 50 client sites. Checking each one manually for meta tag regressions, missing OG images, or broken structured data is not sustainable. Enterprise monitoring tools exist, but they start at around $29/month and scale into the hundreds. Here is how to build your own automated SEO monitoring system with a single API, a cron job, and about 30 minutes of work.
The Agency SEO Monitoring Problem
Every SEO agency deals with the same scenario: a client's developer pushes a code update and accidentally removes the meta description from the homepage. Or a CMS migration drops all the Open Graph tags. Or someone changes the canonical URL to point to a staging domain. These are silent failures. Rankings slip. Social shares look broken. And nobody notices until a client asks why traffic is down.
The standard solutions are expensive. Tools like Ahrefs, SEOptimer, and Semrush offer site monitoring, but you are paying for their entire platform—keyword tracking, backlink analysis, competitor research—when all you need is a nightly check that on-page SEO elements are still intact.
What agencies actually need is simple: automated, scheduled checks on the on-page SEO elements that break most often, with alerts when something changes. That is exactly what you can build with the SEOPeek API.
What You Will Build
By the end of this tutorial, you will have:
- A monitoring script that audits a list of client URLs via the SEOPeek API
- A cron schedule that runs the script daily (or hourly)
- An alert system that notifies you via Slack or email when scores drop or checks fail
- A simple dashboard that stores historical data and shows trends
Total infrastructure cost: $0 to $29/month, depending on volume.
Step 1: Set Up the SEOPeek API
SEOPeek requires no signup for the free tier. You send a GET request with a URL, and you get back 20 on-page SEO checks as JSON. Here is the basic call:
curl "https://seopeek.web.app/api/audit?url=https://clientsite.com"
The response includes a score (0–100), a grade (A–F), and individual check results:
{
  "url": "https://clientsite.com",
  "score": 85,
  "grade": "B",
  "checks": {
    "title": {"pass": true, "value": "Client Site — Best Widgets", "message": "Title tag present (28 chars)"},
    "metaDescription": {"pass": true, "value": "We sell the best widgets...", "message": "Meta description present (142 chars)"},
    "ogTags": {"pass": true, "value": "title, description, image", "message": "All core OG tags present"},
    "structuredData": {"pass": false, "value": null, "message": "No JSON-LD structured data"},
    "h1": {"pass": true, "value": "Best Widgets for 2026", "message": "Single H1 tag found"}
  },
  "timestamp": "2026-03-28T08:00:00Z"
}
For a deeper look at the full response schema and all 20 checks, see the Best SEO Audit API in 2026 guide.
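Before wiring up full monitoring, it helps to have a small helper that pulls the failing checks out of a response. This is a sketch, not part of the SEOPeek API itself; it assumes only the response shape shown above (a `checks` object whose entries carry `pass` and `message` fields):

```javascript
// Pull the names and messages of failing checks out of an audit response.
// Assumes the response shape shown above: { checks: { name: { pass, message } } }.
function failingChecks(audit) {
  return Object.entries(audit.checks)
    .filter(([, check]) => !check.pass)
    .map(([name, check]) => `${name}: ${check.message}`);
}

// Against a trimmed-down sample response, only the structured data check is flagged:
const sample = {
  checks: {
    title: { pass: true, message: "Title tag present (28 chars)" },
    structuredData: { pass: false, message: "No JSON-LD structured data" },
  },
};
console.log(failingChecks(sample)); // one entry: the failing structuredData check
```

The same helper is reusable anywhere you need a human-readable issue list — Slack alerts, email digests, or a dashboard cell.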
Step 2: Build the Monitoring Script
Here is a Node.js script that reads a list of client URLs from a JSON config file, audits each one, compares the result to the previous audit, and flags any regressions:
// monitor.js — requires Node 18+ for the built-in fetch API
const fs = require("fs");

const CONFIG_FILE = "./clients.json";
const HISTORY_FILE = "./audit-history.json";
const SLACK_WEBHOOK = process.env.SLACK_WEBHOOK_URL;

// Load client URLs
// clients.json: [{"name": "Acme Corp", "urls": ["https://acme.com"]}]
const clients = JSON.parse(fs.readFileSync(CONFIG_FILE, "utf8"));

// Load previous results
let history = {};
if (fs.existsSync(HISTORY_FILE)) {
  history = JSON.parse(fs.readFileSync(HISTORY_FILE, "utf8"));
}

async function auditUrl(url) {
  const res = await fetch(
    `https://seopeek.web.app/api/audit?url=${encodeURIComponent(url)}`
  );
  if (!res.ok) {
    throw new Error(`Audit failed for ${url}: HTTP ${res.status}`);
  }
  return res.json();
}

async function sendSlackAlert(message) {
  if (!SLACK_WEBHOOK) return;
  await fetch(SLACK_WEBHOOK, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ text: message }),
  });
}

async function run() {
  const alerts = [];
  const newHistory = {};

  for (const client of clients) {
    for (const url of client.urls) {
      const result = await auditUrl(url);
      const prev = history[url];

      newHistory[url] = {
        score: result.score,
        grade: result.grade,
        checks: result.checks,
        timestamp: result.timestamp,
      };

      // Check for score regression (more than 5 points below the last run)
      if (prev && result.score < prev.score - 5) {
        alerts.push(
          `[${client.name}] ${url} — Score dropped: ${prev.score} to ${result.score}`
        );
      }

      // Check for newly failing checks
      if (prev) {
        for (const [name, check] of Object.entries(result.checks)) {
          if (!check.pass && prev.checks[name]?.pass) {
            alerts.push(
              `[${client.name}] ${url} — "${name}" now FAILING: ${check.message}`
            );
          }
        }
      }

      // Brief pause between requests; the free tier allows 50 audits/day
      await new Promise((r) => setTimeout(r, 500));
    }
  }

  // Save updated history
  fs.writeFileSync(HISTORY_FILE, JSON.stringify(newHistory, null, 2));

  // Send alerts
  if (alerts.length > 0) {
    const msg = `SEO Alert — ${alerts.length} issue(s):\n\n` + alerts.join("\n");
    console.log(msg);
    await sendSlackAlert(msg);
  } else {
    console.log(`All ${Object.keys(newHistory).length} URLs passed.`);
  }
}

run().catch(console.error);
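The script gives up if a single request fails mid-run. A generic retry wrapper (an illustrative helper, not part of SEOPeek; `withRetry` is a name invented here) lets a transient network error cost you one URL's retries rather than the whole run:

```javascript
// Retry an async function a few times with a fixed delay between attempts.
// Illustrative helper: wrap any flaky call, e.g. withRetry(() => auditUrl(url)).
async function withRetry(fn, attempts = 3, delayMs = 1000) {
  let lastError;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn(); // success: return immediately
    } catch (err) {
      lastError = err;
      if (i < attempts - 1) {
        await new Promise((r) => setTimeout(r, delayMs)); // wait, then retry
      }
    }
  }
  throw lastError; // all attempts failed: surface the last error
}
```

Inside `run()`, you would replace `await auditUrl(url)` with `await withRetry(() => auditUrl(url))`, optionally catching the final error so one dead site does not abort the audits for every other client.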
The Client Config File
Create a clients.json file listing your clients and their key pages:
[
  {
    "name": "Acme Corp",
    "urls": [
      "https://acme.com",
      "https://acme.com/pricing",
      "https://acme.com/about",
      "https://acme.com/blog"
    ]
  },
  {
    "name": "Widget Co",
    "urls": [
      "https://widgetco.com",
      "https://widgetco.com/products",
      "https://widgetco.com/contact"
    ]
  }
]
Start with the pages that matter most: homepage, pricing, product pages, and key landing pages. These are the pages where an SEO regression has the biggest revenue impact.
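Typos creep into a hand-edited config as the client list grows, and a malformed URL or missing name will silently skip pages. A quick validation pass at startup is worth the extra lines — this is a sketch, and `validateClients` is a name invented here, assuming the config shape shown above:

```javascript
// Validate the clients.json shape before running any audits.
// Returns a list of human-readable problems; an empty list means the config is usable.
function validateClients(clients) {
  const problems = [];
  if (!Array.isArray(clients)) return ["config must be an array of clients"];
  clients.forEach((client, i) => {
    if (!client.name) problems.push(`client ${i}: missing "name"`);
    if (!Array.isArray(client.urls) || client.urls.length === 0) {
      problems.push(`client ${i}: "urls" must be a non-empty array`);
      return;
    }
    for (const url of client.urls) {
      try {
        new URL(url); // the URL constructor throws on malformed URLs
      } catch {
        problems.push(`client ${i}: invalid URL "${url}"`);
      }
    }
  });
  return problems;
}
```

Call it right after loading the config, print the problem list, and exit non-zero if it is not empty — that way a broken config fails the cron job loudly instead of quietly auditing nothing.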
Step 3: Schedule with Cron
Run the monitoring script daily at 6 AM. On a Linux/Mac server or a cheap VPS:
# Edit crontab
crontab -e
# Add this line for daily 6 AM runs
0 6 * * * cd /home/agency/seo-monitor && node monitor.js >> /var/log/seo-monitor.log 2>&1
Serverless Alternative: GitHub Actions
If you do not want to maintain a server, use GitHub Actions as a free cron runner:
# .github/workflows/seo-monitor.yml
name: SEO Monitor

on:
  schedule:
    - cron: "0 6 * * *" # Daily at 6 AM UTC
  workflow_dispatch: # Allow manual trigger

permissions:
  contents: write # Lets the workflow push the updated history file

jobs:
  monitor:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: node monitor.js
        env:
          SLACK_WEBHOOK_URL: ${{ secrets.SLACK_WEBHOOK_URL }}
      - name: Commit updated history
        run: |
          git config user.name "SEO Monitor Bot"
          git config user.email "bot@agency.com"
          git add audit-history.json
          git diff --staged --quiet || git commit -m "Update SEO audit history"
          git push
This runs on GitHub's free tier (2,000 minutes/month for private repos, unlimited for public). The audit history gets committed back to the repo, giving you a git-based changelog of every SEO score over time. For more on CI/CD integration patterns, see our GitHub Actions SEO audit tutorial.
Step 4: Build a Simple Dashboard
The monitoring script saves results to audit-history.json. You can turn this into a client-facing dashboard with a simple HTML page that reads the JSON and renders a table. Use textContent for safe DOM manipulation, or a framework like React or Vue for richer interactivity:
// dashboard.js — Reads audit-history.json and renders a table
async function renderDashboard() {
  const res = await fetch("./audit-history.json");
  const data = await res.json();
  const table = document.getElementById("results");

  for (const [url, result] of Object.entries(data)) {
    const failingChecks = Object.entries(result.checks)
      .filter(([_, c]) => !c.pass)
      .map(([name]) => name);

    const row = document.createElement("tr");

    const urlCell = document.createElement("td");
    urlCell.textContent = url;
    row.appendChild(urlCell);

    const scoreCell = document.createElement("td");
    scoreCell.textContent = `${result.score} (${result.grade})`;
    scoreCell.style.color = result.score >= 80 ? "#10B981"
      : result.score >= 60 ? "#F59E0B" : "#EF4444";
    row.appendChild(scoreCell);

    const issuesCell = document.createElement("td");
    issuesCell.textContent = failingChecks.length === 0
      ? "All passing" : failingChecks.join(", ");
    row.appendChild(issuesCell);

    const dateCell = document.createElement("td");
    dateCell.textContent = new Date(result.timestamp).toLocaleDateString();
    row.appendChild(dateCell);

    table.appendChild(row);
  }
}

renderDashboard();
For a production dashboard, you would add a database (SQLite is enough), historical charting, and per-client views. But this minimal version gets you from zero to a working dashboard in under an hour.
Step 5: Scale It
As your client list grows, here is how the monitoring system scales:
10 clients, 5 pages each = 50 URLs/day
This fits within the free tier (50 audits/day). Total cost: $0.
20 clients, 10 pages each = 200 URLs per run
Upgrade to the Pro plan at $9/month (1,000 audits/month). Auditing all 200 URLs daily would take about 6,000 audits/month, so audit the full list weekly instead (roughly 870 audits/month) and you stay comfortably within the limit. Cost per client: $0.45/month.
50 clients, 20 pages each = 1,000 URLs/day
The Business plan at $29/month gives you 10,000 audits/month. Running all 1,000 URLs daily would use about 30,000 audits/month, so audit key pages daily and secondary pages weekly to stay within limits, or contact SEOPeek for custom volume pricing. Cost per client: $0.58/month.
Agency math: If you charge clients $500/month for SEO services and your monitoring infrastructure costs $0.50/client/month, that is a 1,000x return on the tooling investment. More importantly, catching a broken meta description the same day it happens—instead of finding out when rankings drop two weeks later—is the kind of proactive service that retains clients.
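The tier arithmetic above is easy to get wrong as the client list shifts, so it can help to script it. This is a rough estimator using the plan limits quoted in this post (`monthlyAuditVolume` and `suggestPlan` are names invented here; verify against SEOPeek's current pricing before relying on it):

```javascript
// Estimate monthly audit volume from URL count and audit cadence.
// 52 weeks / 12 months ≈ 4.33 monthly occurrences of each weekly run.
function monthlyAuditVolume(urlCount, runsPerWeek) {
  return Math.ceil(urlCount * runsPerWeek * (52 / 12));
}

// Suggest the cheapest plan covering that volume, using the limits quoted
// in this post: Free = 50 audits/day, Pro = 1,000/mo, Business = 10,000/mo.
function suggestPlan(urlCount, runsPerWeek) {
  const volume = monthlyAuditVolume(urlCount, runsPerWeek);
  if (urlCount <= 50 && runsPerWeek <= 7) return { volume, plan: "Free (50 audits/day)" };
  if (volume <= 1000) return { volume, plan: "Pro ($9/mo)" };
  if (volume <= 10000) return { volume, plan: "Business ($29/mo)" };
  return { volume, plan: "Custom volume pricing" };
}
```

For example, 20 clients with 10 pages each audited once a week lands at roughly 870 audits/month, which fits the Pro plan; 1,000 URLs audited daily blows past the Business limit and calls for a mixed cadence or custom pricing.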
Pricing Comparison: SEOPeek vs. Enterprise Tools
Here is what agencies typically pay for SEO monitoring with existing tools:
| Tool | Monthly Price | SEO Monitoring | API Access | What You Get |
|---|---|---|---|---|
| SEOPeek | Free / $9 / $29 | 20 on-page checks | All plans | Focused on-page SEO API. Build your own dashboard. |
| SEOptimer | $29 – $59/mo | White-label reports | Paid plans only | Branded PDF reports. Limited API. Dashboard-centric. |
| Ahrefs | $99 – $449/mo | Site audit + alerts | $99+ plans | Full suite: backlinks, keywords, rank tracking, site audit. |
| Semrush | $129 – $499/mo | Site audit + position tracking | API add-on ($) | Full suite: similar to Ahrefs. API costs extra. |
| Screaming Frog | $259/year | Desktop crawling | No API | Deep technical crawls. Manual process. No automation. |
The key difference: enterprise tools bundle monitoring with features you may not need. If your agency already uses Ahrefs or Semrush for keyword research and backlink analysis, you do not need to pay for their monitoring when a $9/month API gives you faster, more focused on-page checks. And if you are a smaller agency that does not need enterprise tools at all, SEOPeek is the entire monitoring stack at a fraction of the cost. For a more detailed breakdown, see our SEOptimer vs Seobility vs SEOPeek comparison.
Advanced: Email Alerts Instead of Slack
If your team prefers email, replace the Slack webhook with a simple SMTP call or a service like SendGrid:
const sgMail = require("@sendgrid/mail");
sgMail.setApiKey(process.env.SENDGRID_API_KEY);

async function sendEmailAlert(subject, body) {
  await sgMail.send({
    to: "seo-team@agency.com",
    from: "monitor@agency.com",
    subject: subject,
    text: body,
  });
}

// In your alert section:
if (alerts.length > 0) {
  const subject = `SEO Alert: ${alerts.length} regression(s) detected`;
  const body = alerts.join("\n\n");
  await sendEmailAlert(subject, body);
}
Advanced: Per-Client Weekly Reports
Go beyond alerts by generating weekly summary reports for each client. Extend the monitoring script to accumulate daily scores and produce a digest:
function generateWeeklyReport(clientName, weeklyData) {
  const scores = weeklyData.map((d) => d.score);
  const avgScore = scores.reduce((a, b) => a + b, 0) / scores.length;
  const trend = scores[scores.length - 1] - scores[0];
  const trendLabel = trend > 0 ? `+${trend} (improving)`
    : trend < 0 ? `${trend} (declining)` : "stable";

  // Count failing checks across all pages
  const failCounts = {};
  for (const d of weeklyData) {
    for (const [name, check] of Object.entries(d.checks)) {
      if (!check.pass) {
        failCounts[name] = (failCounts[name] || 0) + 1;
      }
    }
  }

  const topIssues = Object.entries(failCounts)
    .sort((a, b) => b[1] - a[1])
    .slice(0, 5)
    .map(([name, count]) => `  - ${name}: failing on ${count} page(s)`)
    .join("\n");

  return [
    `Weekly SEO Report: ${clientName}`,
    `================================`,
    `Average Score: ${avgScore.toFixed(0)}/100`,
    `Trend: ${trendLabel}`,
    `Pages Monitored: ${weeklyData.length}`,
    ``,
    `Top Issues:`,
    topIssues || "  None — all checks passing",
  ].join("\n");
}
Send this report every Monday morning and you have a professional, automated SEO reporting pipeline without paying for SEOptimer's white-label PDF feature.
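One missing piece: the monitoring script only keeps the latest snapshot per URL, so you need to accumulate daily results before calling generateWeeklyReport. One approach (a sketch; `weekKey` and `groupByWeek` are names invented here) is to append each day's `{ timestamp, score, checks }` snapshot to a log and bucket the log by week:

```javascript
// Derive a week bucket key — the Monday of that week, in UTC — from a timestamp.
function weekKey(timestamp) {
  const d = new Date(timestamp);
  d.setUTCDate(d.getUTCDate() - ((d.getUTCDay() + 6) % 7)); // step back to Monday
  return d.toISOString().slice(0, 10); // e.g. "2026-03-23"
}

// Group dated snapshots ({ timestamp, score, checks }) into per-week buckets,
// each of which can be passed straight to generateWeeklyReport.
function groupByWeek(snapshots) {
  const weeks = {};
  for (const snap of snapshots) {
    (weeks[weekKey(snap.timestamp)] ||= []).push(snap);
  }
  return weeks;
}
```

Keying on the Monday date keeps the buckets human-readable in the report filename and sorts chronologically for free.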
What SEOPeek Monitors That Others Miss
Most agency monitoring tools focus on rank tracking and crawl errors. SEOPeek catches the on-page issues that are invisible to rank trackers but directly affect how pages appear in search results and social shares:
- Open Graph tags — broken OG images mean ugly social shares, which means less click-through from social traffic
- Twitter Card tags — missing cards mean plain-text links on Twitter/X instead of rich previews
- JSON-LD structured data — missing or broken schema means no rich snippets in search results
- Heading hierarchy — H1 missing or multiple H1s signal poor content structure to search engines
- Image alt text coverage — missing alts hurt accessibility and image search rankings
- Canonical URL — wrong canonicals cause duplicate content issues that silently kill rankings
These are the elements that break during deploys, CMS updates, and theme changes. They are also the elements that enterprise monitoring tools often skip or bury in a 50-page report. For a detailed look at all 20 checks, see our SEO health monitoring guide.
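Canonical regressions in particular are cheap to catch programmatically. A small helper (illustrative; it assumes your audit response exposes the canonical URL as a string value) flags the classic "canonical points at staging" failure described above:

```javascript
// Flag a canonical URL that points at a different host than the page itself —
// e.g. a canonical accidentally left pointing at a staging domain.
function canonicalMismatch(pageUrl, canonicalUrl) {
  if (!canonicalUrl) return false; // a missing canonical is a separate check
  try {
    return new URL(canonicalUrl).host !== new URL(pageUrl).host;
  } catch {
    return true; // an unparseable canonical is itself a problem
  }
}
```

Comparing hosts rather than full URLs keeps this from firing on legitimate self-referencing canonicals with tracking parameters stripped, while still catching cross-domain mistakes.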
Start Free, Scale to $9/mo
Monitor up to 50 pages daily on the free tier. No signup. No API key. Upgrade to Pro when your client list grows.
Start monitoring for free →
Conclusion
Automated SEO monitoring for agencies does not require a $99/month enterprise platform. With the SEOPeek API, a simple Node.js script, and a cron job, you can monitor every client's key pages for on-page SEO regressions—daily, automatically, with instant alerts when something breaks.
The setup takes 30 minutes. The free tier covers small agencies. The $9/month Pro plan covers 20+ clients. And the data you get—20 on-page checks including OG tags, structured data, heading hierarchy, and alt text—is more actionable for on-page SEO than what most enterprise tools surface.
Stop paying for dashboards you do not use. Build the monitoring system that fits your workflow, powered by a focused API that does one thing well. Try SEOPeek today.