March 29, 2026 · 14 min read

Build an SEO Audit CLI Tool in Go with the SEOPeek API

Go is uniquely suited for building command-line tools. It compiles to a single static binary with zero runtime dependencies, starts in milliseconds, and its goroutines provide lightweight concurrency that can audit hundreds of URLs in parallel. This guide walks you through building a production-grade SEO audit CLI in Go that accepts single URLs or batch files, displays color-coded results in the terminal, supports JSON and table output formats, audits URLs concurrently with goroutines and sync.WaitGroup, and exports results to CSV. We will use Cobra for argument parsing, fatih/color for terminal styling, and the standard library's net/http for API calls—all powered by the SEOPeek API.

In this guide
  1. Project setup and dependencies
  2. Calling the SEOPeek API with net/http
  3. Cobra CLI with subcommands and flags
  4. Color-coded terminal output
  5. Batch auditing URLs from a file
  6. Concurrent auditing with goroutines
  7. JSON and table output formats
  8. Saving results to CSV
  9. Pricing and rate limits
  10. FAQ

1. Project Setup and Dependencies

Start by initializing a new Go module. We will call the tool seoaudit. You need Go 1.21 or later installed.

mkdir seoaudit && cd seoaudit
go mod init github.com/yourname/seoaudit

Install the three external dependencies. Cobra handles CLI argument parsing and subcommands. The fatih/color package provides cross-platform colored terminal output. The tablewriter package renders aligned ASCII tables.

go get github.com/spf13/cobra@latest
go get github.com/fatih/color@latest
go get github.com/olekukonko/tablewriter@latest

Create the directory structure. Go projects typically organize commands under a cmd/ directory and internal packages under internal/.

mkdir -p cmd internal/api internal/output

Your project layout will look like this when we are done:

seoaudit/
  cmd/
    root.go        # Root Cobra command
    audit.go       # Single URL audit command
    batch.go       # Batch audit command
  internal/
    api/
      client.go    # SEOPeek API client
    output/
      formatter.go # Table, JSON, CSV formatters
  main.go          # Entry point

2. Calling the SEOPeek API with net/http

The SEOPeek API is a single GET endpoint. You pass the target URL as a query parameter and receive a JSON response with a numeric score, a letter grade, and detailed pass/fail results for 20+ on-page SEO checks. No API key is needed for the free tier (50 audits/day).

The endpoint:

GET https://us-central1-todd-agent-prod.cloudfunctions.net/seopeekApi/api/v1/audit?url=TARGET_URL

Create the API client at internal/api/client.go. We use the standard library's net/http package—no external HTTP library needed. Go's HTTP client handles connection pooling, keep-alive, and timeouts out of the box.

package api

import (
    "encoding/json"
    "fmt"
    "net/http"
    "net/url"
    "time"
)

const baseURL = "https://us-central1-todd-agent-prod.cloudfunctions.net/seopeekApi/api/v1/audit"

// AuditResult represents the response from the SEOPeek API.
type AuditResult struct {
    URL    string                 `json:"url"`
    Score  int                    `json:"score"`
    Grade  string                 `json:"grade"`
    Checks map[string]CheckResult `json:"checks"`
}

// CheckResult represents an individual SEO check.
type CheckResult struct {
    Pass    bool   `json:"pass"`
    Message string `json:"message"`
}

// Client wraps the HTTP client for SEOPeek API calls.
type Client struct {
    httpClient *http.Client
    apiKey     string
}

// NewClient creates a new SEOPeek API client.
// Pass an empty string for apiKey to use the free tier.
func NewClient(apiKey string) *Client {
    return &Client{
        httpClient: &http.Client{
            Timeout: 30 * time.Second,
        },
        apiKey: apiKey,
    }
}

// Audit sends a single URL to the SEOPeek API and returns the result.
func (c *Client) Audit(targetURL string) (*AuditResult, error) {
    reqURL := fmt.Sprintf("%s?url=%s", baseURL, url.QueryEscape(targetURL))

    req, err := http.NewRequest("GET", reqURL, nil)
    if err != nil {
        return nil, fmt.Errorf("creating request: %w", err)
    }

    req.Header.Set("Accept", "application/json")
    if c.apiKey != "" {
        req.Header.Set("X-Api-Key", c.apiKey)
    }

    resp, err := c.httpClient.Do(req)
    if err != nil {
        return nil, fmt.Errorf("request failed: %w", err)
    }
    defer resp.Body.Close()

    if resp.StatusCode != http.StatusOK {
        return nil, fmt.Errorf("API returned status %d", resp.StatusCode)
    }

    var result AuditResult
    if err := json.NewDecoder(resp.Body).Decode(&result); err != nil {
        return nil, fmt.Errorf("decoding response: %w", err)
    }

    return &result, nil
}

A few things to notice. The http.Client with a 30-second timeout prevents a single slow audit from hanging your CLI indefinitely. The url.QueryEscape call ensures target URLs with special characters are encoded properly. The apiKey field is optional—leave it empty for the free tier, or pass your Starter/Pro key for higher rate limits. Go's json.NewDecoder streams the response body directly into the struct without buffering the entire body in memory first.

Tip: Go's default http.Client reuses TCP connections via an internal connection pool. When you audit multiple URLs sequentially, the second request reuses the connection from the first. This means you get HTTP keep-alive for free without any configuration.

3. Cobra CLI with Subcommands and Flags

Cobra is the de facto standard framework for building CLI applications in Go. It powers kubectl, Hugo, and the GitHub CLI. We will set up a root command with two subcommands: audit for single URLs and batch for file-based bulk auditing.

Start with the entry point at main.go:

package main

import (
    "os"

    "github.com/yourname/seoaudit/cmd"
)

func main() {
    if err := cmd.Execute(); err != nil {
        os.Exit(1)
    }
}

Create the root command at cmd/root.go:

package cmd

import (
    "github.com/spf13/cobra"
)

var (
    apiKey    string
    outputFmt string
    csvFile   string
)

var rootCmd = &cobra.Command{
    Use:   "seoaudit",
    Short: "SEO audit CLI powered by the SEOPeek API",
    Long: `seoaudit is a command-line tool for running SEO audits
against any URL using the SEOPeek API. Supports single URLs,
batch files, concurrent auditing, and multiple output formats.`,
}

func init() {
    rootCmd.PersistentFlags().StringVar(&apiKey, "api-key", "",
        "SEOPeek API key (optional, free tier requires none)")
    rootCmd.PersistentFlags().StringVarP(&outputFmt, "output", "o", "table",
        "Output format: table, json, or csv")
    rootCmd.PersistentFlags().StringVar(&csvFile, "csv-file", "results.csv",
        "CSV output file path (used with --output csv)")
}

func Execute() error {
    return rootCmd.Execute()
}

Now create the single-URL audit command at cmd/audit.go:

package cmd

import (
    "fmt"
    "os"

    "github.com/yourname/seoaudit/internal/api"
    "github.com/yourname/seoaudit/internal/output"
    "github.com/spf13/cobra"
)

var auditCmd = &cobra.Command{
    Use:   "audit [url]",
    Short: "Audit a single URL",
    Args:  cobra.ExactArgs(1),
    RunE: func(cmd *cobra.Command, args []string) error {
        client := api.NewClient(apiKey)
        targetURL := args[0]

        fmt.Fprintf(os.Stderr, "Auditing %s...\n", targetURL)

        result, err := client.Audit(targetURL)
        if err != nil {
            return fmt.Errorf("audit failed: %w", err)
        }

        return output.Render([]*api.AuditResult{result}, outputFmt, csvFile)
    },
}

func init() {
    rootCmd.AddCommand(auditCmd)
}

The cobra.ExactArgs(1) validator ensures the user passes exactly one URL. The RunE pattern returns errors instead of calling os.Exit directly, which makes the command testable. Progress messages go to stderr so stdout stays clean for piping JSON or CSV output to other tools.

4. Color-Coded Terminal Output

Raw scores are harder to scan than colored output. The fatih/color package provides cross-platform terminal colors that work on Linux, macOS, and Windows. We will color-code scores so passing checks appear in green, warnings in yellow, and failures in red.

Add the color logic to the output formatter at internal/output/formatter.go:

package output

import (
    "encoding/csv"
    "encoding/json"
    "fmt"
    "os"
    "strconv"

    "github.com/fatih/color"
    "github.com/olekukonko/tablewriter"
    "github.com/yourname/seoaudit/internal/api"
)

var (
    green  = color.New(color.FgGreen, color.Bold)
    yellow = color.New(color.FgYellow, color.Bold)
    red    = color.New(color.FgRed, color.Bold)
    cyan   = color.New(color.FgCyan)
    bold   = color.New(color.Bold)
)

// ScoreColor returns the appropriate color function for a score.
func ScoreColor(score int) *color.Color {
    switch {
    case score >= 80:
        return green
    case score >= 50:
        return yellow
    default:
        return red
    }
}

// GradeColor returns the appropriate color for a letter grade.
func GradeColor(grade string) *color.Color {
    switch grade {
    case "A", "A+":
        return green
    case "B", "B+":
        return green
    case "C", "C+":
        return yellow
    default:
        return red
    }
}

// PrintResult displays a single audit result with color-coded output.
func PrintResult(r *api.AuditResult) {
    bold.Printf("\n  %s\n", r.URL)
    scoreColor := ScoreColor(r.Score)
    fmt.Print("  Score: ")
    scoreColor.Printf("%d/100", r.Score)
    fmt.Print("  Grade: ")
    GradeColor(r.Grade).Printf("%s\n\n", r.Grade)

    for name, check := range r.Checks {
        if check.Pass {
            green.Printf("    PASS  ")
        } else {
            red.Printf("    FAIL  ")
        }
        cyan.Printf("%-28s", name)
        fmt.Printf("  %s\n", check.Message)
    }
    fmt.Println()
}

When you run seoaudit audit https://example.com, the output looks clean and scannable. Scores of 80+ glow green. Scores between 50 and 79 are yellow. Anything below 50 is red. Individual checks show PASS in green or FAIL in red, followed by the check name and message. This is the kind of output that makes a CLI tool feel polished and professional.

Tip: The fatih/color package automatically detects whether stdout is a terminal or a pipe. When you pipe output to another program (e.g., seoaudit audit ... | grep FAIL), it strips ANSI color codes automatically. It also honors the NO_COLOR environment variable, and you can force colors off programmatically by setting color.NoColor = true (for example, behind a --no-color flag you wire up yourself).

5. Batch Auditing URLs from a File

Auditing a single URL is useful for quick checks, but the real power of a CLI tool is processing hundreds of URLs from a file. The batch command reads URLs from a text file (one per line), audits each one, and aggregates the results.

Create the batch command at cmd/batch.go:

package cmd

import (
    "bufio"
    "fmt"
    "os"
    "strings"

    "github.com/yourname/seoaudit/internal/api"
    "github.com/yourname/seoaudit/internal/output"
    "github.com/spf13/cobra"
)

var (
    concurrency int
    minScore    int
)

var batchCmd = &cobra.Command{
    Use:   "batch [file]",
    Short: "Audit multiple URLs from a file",
    Long: `Read URLs from a text file (one per line) and audit each one.
Supports concurrent auditing with --concurrency flag.
Lines starting with # are treated as comments and skipped.`,
    Args: cobra.ExactArgs(1),
    RunE: func(cmd *cobra.Command, args []string) error {
        urls, err := readURLFile(args[0])
        if err != nil {
            return err
        }

        fmt.Fprintf(os.Stderr, "Loaded %d URLs from %s\n", len(urls), args[0])

        client := api.NewClient(apiKey)
        results := auditConcurrently(client, urls, concurrency)

        // Filter by minimum score if set
        if minScore > 0 {
            var filtered []*api.AuditResult
            for _, r := range results {
                if r.Score < minScore {
                    filtered = append(filtered, r)
                }
            }
            fmt.Fprintf(os.Stderr, "%d URLs below minimum score %d\n",
                len(filtered), minScore)
            results = filtered
        }

        return output.Render(results, outputFmt, csvFile)
    },
}

func init() {
    batchCmd.Flags().IntVarP(&concurrency, "concurrency", "c", 5,
        "Number of concurrent audit requests")
    batchCmd.Flags().IntVar(&minScore, "min-score", 0,
        "Only show results below this score threshold")
    rootCmd.AddCommand(batchCmd)
}

// readURLFile reads URLs from a text file, skipping comments and blank lines.
func readURLFile(path string) ([]string, error) {
    file, err := os.Open(path)
    if err != nil {
        return nil, fmt.Errorf("opening URL file: %w", err)
    }
    defer file.Close()

    var urls []string
    scanner := bufio.NewScanner(file)
    for scanner.Scan() {
        line := strings.TrimSpace(scanner.Text())
        if line == "" || strings.HasPrefix(line, "#") {
            continue
        }
        urls = append(urls, line)
    }

    if err := scanner.Err(); err != nil {
        return nil, fmt.Errorf("reading URL file: %w", err)
    }

    return urls, nil
}

The URL file format is dead simple. One URL per line, with support for comments and blank lines:

# Homepage and key landing pages
https://example.com
https://example.com/pricing
https://example.com/features

# Blog posts
https://example.com/blog/top-post
https://example.com/blog/second-post

The --min-score flag is a powerful filter. Set it to 70, for instance, and the output only shows URLs that scored below 70—the pages that need your attention. This makes the batch command useful as a CI/CD quality gate: audit your sitemap and fail the build if any page drops below a threshold.

6. Concurrent Auditing with Goroutines and sync.WaitGroup

This is where Go truly shines. Goroutines are lightweight threads managed by the Go runtime—each one uses about 2 KB of stack memory, so you can spawn thousands without concern. Combined with sync.WaitGroup for coordination and a buffered channel as a semaphore for concurrency control, you get a clean and efficient parallel audit pipeline.

Add the concurrent audit function. This can live in cmd/batch.go or in a separate file:

import (
    "fmt"
    "os"
    "sync"

    "github.com/yourname/seoaudit/internal/api"
    "github.com/yourname/seoaudit/internal/output"
)

// auditConcurrently audits a list of URLs with bounded concurrency.
func auditConcurrently(client *api.Client, urls []string, maxConcurrent int) []*api.AuditResult {
    var (
        wg      sync.WaitGroup
        mu      sync.Mutex
        results []*api.AuditResult
    )

    // Buffered channel acts as a semaphore to limit concurrency.
    sem := make(chan struct{}, maxConcurrent)

    for i, u := range urls {
        wg.Add(1)
        sem <- struct{}{} // Acquire semaphore slot

        go func(idx int, targetURL string) {
            defer wg.Done()
            defer func() { <-sem }() // Release semaphore slot

            fmt.Fprintf(os.Stderr, "[%d/%d] Auditing %s\n",
                idx+1, len(urls), targetURL)

            result, err := client.Audit(targetURL)
            if err != nil {
                fmt.Fprintf(os.Stderr, "[%d/%d] ERROR: %s — %v\n",
                    idx+1, len(urls), targetURL, err)
                return
            }

            output.PrintResult(result)

            mu.Lock()
            results = append(results, result)
            mu.Unlock()
        }(i, u)
    }

    wg.Wait()
    return results
}

Let us break down how this works. The sync.WaitGroup tracks how many goroutines are still running. We call wg.Add(1) before launching each goroutine and wg.Done() when it finishes. The final wg.Wait() blocks until every goroutine has completed.

The buffered channel sem acts as a counting semaphore with a capacity equal to maxConcurrent (default 5). The loop sends a value into the channel before launching each goroutine. If the channel is full (meaning maxConcurrent goroutines are already running), the send blocks until a running goroutine finishes and receives from the channel, freeing a slot. This pattern is idiomatic Go—no external rate-limiting library needed.

The sync.Mutex protects the shared results slice. Without it, concurrent goroutines appending to the same slice would cause a data race. Go's race detector (go run -race .) will catch this during development if you forget. One caveat: output.PrintResult is also called from multiple goroutines, so its individual print calls can interleave at high concurrency; if each result must appear as one unbroken block, hold the mutex around that call as well.

Run a batch audit with 10 concurrent workers:

seoaudit batch urls.txt --concurrency 10 --output table

This audits all URLs in the file with up to 10 simultaneous HTTP requests to the SEOPeek API. Each result streams to the terminal as it completes, and the final aggregated output appears when all goroutines finish.

Why a semaphore channel instead of a worker pool? The semaphore pattern is simpler and more idiomatic in Go. A worker pool (with a fixed set of long-lived goroutines reading from a jobs channel) works too, but requires more boilerplate. The semaphore approach lets you spawn a goroutine per URL and relies on the channel capacity to limit how many run at once. Since goroutines are cheap, the overhead is negligible even for thousands of URLs.

7. JSON and Table Output Formats

Different contexts demand different output formats. Interactive terminal sessions benefit from colored tables. CI/CD pipelines and scripts need machine-readable JSON. Reporting workflows need CSV. The Render function dispatches to the appropriate formatter based on the --output flag.

Add the rendering logic to internal/output/formatter.go:

// Render outputs results in the specified format.
func Render(results []*api.AuditResult, format string, csvPath string) error {
    switch format {
    case "json":
        return renderJSON(results)
    case "csv":
        return renderCSV(results, csvPath)
    case "table":
        return renderTable(results)
    default:
        return fmt.Errorf("unknown output format: %s", format)
    }
}

// renderJSON outputs results as a JSON array to stdout.
func renderJSON(results []*api.AuditResult) error {
    encoder := json.NewEncoder(os.Stdout)
    encoder.SetIndent("", "  ")
    return encoder.Encode(results)
}

// renderTable outputs results as an ASCII table.
func renderTable(results []*api.AuditResult) error {
    table := tablewriter.NewWriter(os.Stdout)
    table.SetHeader([]string{"URL", "Score", "Grade", "Passed", "Failed"})
    table.SetBorder(false)
    table.SetColumnSeparator("  ")
    table.SetHeaderAlignment(tablewriter.ALIGN_LEFT)
    table.SetAlignment(tablewriter.ALIGN_LEFT)

    for _, r := range results {
        passed, failed := countChecks(r)
        scoreStr := strconv.Itoa(r.Score)

        table.Append([]string{
            r.URL,
            scoreStr,
            r.Grade,
            strconv.Itoa(passed),
            strconv.Itoa(failed),
        })
    }

    fmt.Println()
    table.Render()
    return nil
}

// countChecks counts passing and failing checks.
func countChecks(r *api.AuditResult) (int, int) {
    passed, failed := 0, 0
    for _, check := range r.Checks {
        if check.Pass {
            passed++
        } else {
            failed++
        }
    }
    return passed, failed
}

The JSON output is especially powerful when combined with jq for ad-hoc analysis:

# Get all URLs with score below 60
seoaudit batch urls.txt -o json | jq '.[] | select(.score < 60) | .url'

# Count failing checks across all URLs
seoaudit batch urls.txt -o json | jq '[.[].checks | to_entries[] | select(.value.pass == false)] | length'

# Extract just scores for a quick summary
seoaudit batch urls.txt -o json | jq '.[] | {url, score, grade}'

The table output gives you an at-a-glance summary of every URL, its score, grade, and how many checks passed versus failed. For detailed per-check output, the color-coded PrintResult function streams results as each audit completes.

8. Saving Results to CSV

CSV export turns your CLI tool into a reporting pipeline. Audit a client's site, export to CSV, and open it in Excel or Google Sheets for a presentation-ready report. Go's standard library includes the encoding/csv package, so no external dependency is needed.

Add the CSV renderer to internal/output/formatter.go:

// renderCSV writes results to a CSV file.
func renderCSV(results []*api.AuditResult, path string) error {
    file, err := os.Create(path)
    if err != nil {
        return fmt.Errorf("creating CSV file: %w", err)
    }
    defer file.Close()

    writer := csv.NewWriter(file)
    defer writer.Flush()

    // Write header row
    header := []string{"URL", "Score", "Grade", "Passed", "Failed"}

    // Collect all unique check names for column headers
    checkNames := collectCheckNames(results)
    header = append(header, checkNames...)

    if err := writer.Write(header); err != nil {
        return fmt.Errorf("writing CSV header: %w", err)
    }

    // Write data rows
    for _, r := range results {
        passed, failed := countChecks(r)
        row := []string{
            r.URL,
            strconv.Itoa(r.Score),
            r.Grade,
            strconv.Itoa(passed),
            strconv.Itoa(failed),
        }

        // Add check results as columns
        for _, name := range checkNames {
            if check, ok := r.Checks[name]; ok {
                if check.Pass {
                    row = append(row, "PASS")
                } else {
                    row = append(row, "FAIL")
                }
            } else {
                row = append(row, "N/A")
            }
        }

        if err := writer.Write(row); err != nil {
            return fmt.Errorf("writing CSV row: %w", err)
        }
    }

    fmt.Fprintf(os.Stderr, "Results saved to %s\n", path)
    return nil
}

// collectCheckNames gathers all unique check names across results.
func collectCheckNames(results []*api.AuditResult) []string {
    seen := make(map[string]bool)
    var names []string

    for _, r := range results {
        for name := range r.Checks {
            if !seen[name] {
                seen[name] = true
                names = append(names, name)
            }
        }
    }

    return names
}

Run a batch audit and export to CSV in one command:

seoaudit batch urls.txt --output csv --csv-file audit-report.csv --concurrency 8

The generated CSV includes a column for every SEO check that appeared in any result. Each cell contains PASS, FAIL, or N/A. This makes it easy to sort by score in a spreadsheet, filter for failing checks, and build charts showing which SEO issues are most common across your site.

For automated reporting, combine the CSV export with a cron job or CI/CD step. Run the audit nightly against your production sitemap, commit the CSV to a repository, and track SEO health over time with diffs.

Putting it all together

Build and install the binary:

go build -o seoaudit .

# Single URL audit with colored output
./seoaudit audit https://example.com

# Batch audit from file with 10 concurrent workers
./seoaudit batch urls.txt --concurrency 10

# JSON output for scripting
./seoaudit batch urls.txt -o json > results.json

# CSV export for reporting
./seoaudit batch urls.txt -o csv --csv-file report.csv

# Filter: only show URLs scoring below 70
./seoaudit batch urls.txt --min-score 70

# With API key for higher rate limits
./seoaudit batch urls.txt --api-key sk_your_key_here -c 15

Because Go compiles to a single static binary, you can distribute seoaudit to anyone on your team without them installing Go, Node.js, Python, or any runtime. Copy the binary to a CI/CD runner, a Docker container, or a colleague's laptop and it just works. Cross-compile for Linux from macOS with GOOS=linux go build -o seoaudit . and drop it on a server. This is the Go advantage for CLI tools—zero dependency, instant startup, effortless distribution.

9. Pricing and Rate Limits

The SEOPeek API has three tiers, all using the same endpoint. The free tier requires no API key.

Plan    | Audits         | Price  | Best for
Free    | 50 / day       | $0     | Local dev, small sites, trying it out
Starter | 1,000 / month  | $9/mo  | Weekly site audits, small teams, CI checks
Pro     | 10,000 / month | $29/mo | Agencies, nightly full-site audits, multiple clients

For the Go CLI tool, the tiers map cleanly to use cases: the Free tier covers local development and one-off audits of small sites, Starter fits weekly batch runs and CI quality gates on a single site, and Pro supports agencies running nightly full-site audits across multiple clients.

View all plan details and sign up on the SEOPeek pricing page.

Frequently Asked Questions

Why use Go for building an SEO audit CLI tool?

Go compiles to a single static binary with zero runtime dependencies. You do not need to install Go, Node.js, Python, or any package manager on the target machine. Just copy the binary and run it. This makes distribution trivial—drop it in a CI/CD runner, a Docker container, or email it to a colleague. Goroutines give you lightweight concurrency without the complexity of threads or async/await. The standard library's net/http package handles HTTP requests, connection pooling, and timeouts without external dependencies. And Go compiles in seconds, so the feedback loop during development is tight.

How many URLs can I audit concurrently with goroutines?

Goroutines are extremely lightweight—about 2 KB of stack each—so you can technically spawn thousands. The practical limit is the SEOPeek API rate limit and your network bandwidth. For the free tier (50 audits/day), a concurrency of 5 is sensible since you will hit the daily cap quickly regardless. For the Pro plan (10,000/month), you can safely run 10-20 concurrent requests. The semaphore channel pattern in this guide caps concurrency at whatever level you set with the --concurrency flag, so you never accidentally overwhelm the API.

Does the SEOPeek API require an API key for the free tier?

The free tier (50 audits/day) requires no API key at all. Send a GET request with the url query parameter and you get back a full audit result. For the Starter ($9/month) and Pro ($29/month) plans, you receive an API key to include as an X-Api-Key header for higher rate limits. The CLI tool in this guide already supports both modes via the --api-key flag.

Can I use this Go CLI tool in CI/CD pipelines?

Absolutely. This is one of Go's strongest advantages for CLI tools. Since it compiles to a single binary, you can add it to any CI/CD pipeline without installing dependencies. Use the JSON output format (-o json) to parse results programmatically with jq, set a minimum score threshold with --min-score, and fail the build if any URL falls below it. For example: seoaudit batch urls.txt -o json --min-score 70 returns only URLs below 70. Pipe the output to jq 'length' and fail the build if the count is greater than zero. This turns SEO auditing into an automated quality gate alongside your existing tests.

How do I export SEO audit results to CSV from the Go tool?

Use the --output csv flag combined with --csv-file to specify the destination path. The tool uses Go's built-in encoding/csv package to write results with columns for URL, score, grade, passed count, failed count, and individual check results. Run seoaudit batch urls.txt -o csv --csv-file audit-report.csv and open the file in Excel, Google Sheets, or any spreadsheet application. Each SEO check gets its own column with PASS, FAIL, or N/A, making it easy to sort, filter, and build charts for client reports.

Start Auditing with the Go CLI

50 free audits per day. No API key required. One go build and you have a production-ready SEO audit tool.
See pricing plans for higher volumes.

Try SEOPeek Free →