
How to Take Website Screenshots from the Command Line with cURL

Capture website screenshots from your terminal with a single cURL command. Covers batch scripts, parallel captures, cron jobs, visual monitoring with ImageMagick, and cross-platform tips for macOS, Linux, and Windows.

SnapRender Team


You can screenshot any website from the terminal with a single cURL command. No browser needed, no Node.js, no Python runtime. Just cURL and a SnapRender API key. The API returns raw image bytes, so cURL writes a finished PNG (or JPEG, WebP, PDF) straight to disk. Everything in this tutorial works on Linux, macOS, and Windows (via Git Bash or WSL).

The One-Liner

Here's the exact command that captures a webpage and saves it as a PNG:

curl -s -o screenshot.png \
  -H "X-API-Key: $SNAPRENDER_API_KEY" \
  "https://app.snap-render.com/v1/screenshot?url=https://example.com&format=png&width=1280&height=720"

What each part does:

  • -s: Silent mode. Suppresses the progress bar so the command works cleanly in scripts.
  • -o screenshot.png: Write the response body to screenshot.png instead of stdout.
  • -H "X-API-Key: ...": Auth header. Store your key in SNAPRENDER_API_KEY env var so it doesn't leak into shell history.
  • url=: The target page. A simple URL like https://example.com works as-is inside a quoted string; if the target has its own query parameters, URL-encode it first (see Troubleshooting).
  • format=png: Output format. Also accepts jpeg, webp, or pdf.
  • width=1280&height=720: Viewport dimensions in pixels.

Set your API key once per session:

export SNAPRENDER_API_KEY="sk_live_your_key_here"

Or add it to ~/.bashrc / ~/.zshrc so it persists.

Common Variations

Each of these is a standalone command you can copy-paste.

Full-page capture (captures the entire scrollable page):

curl -s -o full-page.png \
  -H "X-API-Key: $SNAPRENDER_API_KEY" \
  "https://app.snap-render.com/v1/screenshot?url=https://github.com&format=png&width=1280&height=720&full_page=true"

Mobile viewport (iPhone-sized):

curl -s -o mobile.png \
  -H "X-API-Key: $SNAPRENDER_API_KEY" \
  "https://app.snap-render.com/v1/screenshot?url=https://github.com&format=png&width=375&height=812&device_scale_factor=3"

WebP format (smaller files, great for web use):

curl -s -o shot.webp \
  -H "X-API-Key: $SNAPRENDER_API_KEY" \
  "https://app.snap-render.com/v1/screenshot?url=https://github.com&format=webp&width=1280&height=720"

PDF output:

curl -s -o page.pdf \
  -H "X-API-Key: $SNAPRENDER_API_KEY" \
  "https://app.snap-render.com/v1/screenshot?url=https://github.com&format=pdf&width=1280&height=720&full_page=true"

Dark mode with ad blocking:

curl -s -o clean-dark.png \
  -H "X-API-Key: $SNAPRENDER_API_KEY" \
  "https://app.snap-render.com/v1/screenshot?url=https://news.ycombinator.com&format=png&width=1280&height=720&dark_mode=true&block_ads=true&no_cookie_banners=true"

Hide specific elements (chat widgets, banners):

curl -s -o no-chat.png \
  -H "X-API-Key: $SNAPRENDER_API_KEY" \
  "https://app.snap-render.com/v1/screenshot?url=https://example.com&format=png&width=1280&height=720&hide_selectors=.chat-widget,.announcement-bar"

Shell Script for Batch Captures

When you need to screenshot dozens of URLs, put them in a file and loop through them. Save this as batch-capture.sh:

#!/usr/bin/env bash
set -euo pipefail

# Usage: ./batch-capture.sh urls.txt output_dir/

URL_FILE="${1:?Usage: $0 <url-file> <output-dir>}"
OUTPUT_DIR="${2:?Usage: $0 <url-file> <output-dir>}"
API_KEY="${SNAPRENDER_API_KEY:?Set SNAPRENDER_API_KEY env var}"

mkdir -p "$OUTPUT_DIR"

total=$(grep -cv -e '^#' -e '^$' "$URL_FILE")  # count only non-blank, non-comment lines
count=0

while IFS= read -r url; do
  # Skip empty lines and comments
  [[ -z "$url" || "$url" == \#* ]] && continue

  count=$((count + 1))
  # Create a filename from the URL
  filename=$(echo "$url" | sed -E 's|https?://||; s|[^a-zA-Z0-9]|_|g' | cut -c1-80)

  echo "[$count/$total] Capturing: $url"

  http_code=$(curl -s -o "$OUTPUT_DIR/${filename}.png" \
    -w "%{http_code}" \
    -H "X-API-Key: $API_KEY" \
    "https://app.snap-render.com/v1/screenshot?url=$(python3 -c "import urllib.parse; print(urllib.parse.quote('$url', safe=''))")&format=png&width=1280&height=720")

  if [[ "$http_code" -eq 200 ]]; then
    echo "  Saved to $OUTPUT_DIR/${filename}.png"
  else
    echo "  Failed with HTTP $http_code"
  fi

  # Rate limit: 0.5s between requests
  sleep 0.5
done < "$URL_FILE"

echo "Done. Captured $count URLs."
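The filename derivation inside the loop (strip the scheme, replace unsafe characters, cap the length) is worth keeping as a small reusable helper if you use it across several scripts. A minimal sketch; the name slugify is our own, not part of any tool:

```shell
# slugify: turn a URL into a filesystem-safe name, capped at 80 chars
slugify() {
  echo "$1" | sed -E 's|https?://||; s|[^a-zA-Z0-9]|_|g' | cut -c1-80
}

slugify "https://news.ycombinator.com/item?id=1"   # → news_ycombinator_com_item_id_1
```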

Your urls.txt file looks like this:

https://github.com
https://news.ycombinator.com
https://dev.to
# This line is skipped
https://stackoverflow.com

Run it:

chmod +x batch-capture.sh
./batch-capture.sh urls.txt ./screenshots/

Parallel Captures with xargs

The batch script above processes URLs sequentially. With xargs -P, you can run multiple captures in parallel:

#!/usr/bin/env bash
set -euo pipefail

URL_FILE="${1:?Usage: $0 <url-file> <output-dir>}"
OUTPUT_DIR="${2:?Usage: $0 <url-file> <output-dir>}"

mkdir -p "$OUTPUT_DIR"

capture_url() {
  local url="$1"
  local output_dir="$2"
  local filename
  filename=$(echo "$url" | sed -E 's|https?://||; s|[^a-zA-Z0-9]|_|g' | cut -c1-80)

  curl -s -o "${output_dir}/${filename}.png" \
    -H "X-API-Key: $SNAPRENDER_API_KEY" \
    "https://app.snap-render.com/v1/screenshot?url=${url}&format=png&width=1280&height=720" \
    && echo "OK: $url" \
    || echo "FAIL: $url"
}

export -f capture_url
export SNAPRENDER_API_KEY OUTPUT_DIR

# Run 4 captures in parallel
grep -v -e '^#' -e '^$' "$URL_FILE" | xargs -I {} -P 4 bash -c 'capture_url "$@"' _ {} "$OUTPUT_DIR"

With 4 parallel workers, a list of 20 URLs finishes in roughly a quarter of the time compared to sequential processing. Don't go above 8 parallel requests unless your plan supports it; you'll hit rate limits.

GNU Parallel Version

If you have GNU Parallel installed (brew install parallel on macOS, apt install parallel on Debian/Ubuntu), you get better progress reporting and retry logic out of the box:

#!/usr/bin/env bash
set -euo pipefail

URL_FILE="${1:?Usage: $0 <url-file> <output-dir>}"
OUTPUT_DIR="${2:?Usage: $0 <url-file> <output-dir>}"

mkdir -p "$OUTPUT_DIR"

capture() {
  local url="$1"
  local output_dir="$2"
  local filename
  filename=$(echo "$url" | sed -E 's|https?://||; s|[^a-zA-Z0-9]|_|g' | cut -c1-80)

  curl -sf -o "${output_dir}/${filename}.png" \
    -H "X-API-Key: $SNAPRENDER_API_KEY" \
    "https://app.snap-render.com/v1/screenshot?url=${url}&format=png&width=1280&height=720"
}

export -f capture
export SNAPRENDER_API_KEY

grep -v -e '^#' -e '^$' "$URL_FILE" | \
  parallel --jobs 6 --bar --retries 2 \
    capture {} "$OUTPUT_DIR"

echo "Done. Screenshots saved to $OUTPUT_DIR/"

GNU Parallel's --bar flag gives you a real-time progress bar, and --retries 2 automatically retries any failed captures twice. I use this for anything over 50 URLs.

Cron Job for Scheduled Captures

Screenshot your homepage every hour and save with a timestamp. Useful for visual monitoring, competitor tracking, or keeping a history of design changes.

The capture script (~/scripts/hourly-screenshot.sh):

#!/usr/bin/env bash
set -euo pipefail

OUTPUT_DIR="$HOME/screenshots/hourly"
mkdir -p "$OUTPUT_DIR"

TIMESTAMP=$(date +"%Y-%m-%d_%H-%M")
API_KEY="${SNAPRENDER_API_KEY}"

curl -sf -o "$OUTPUT_DIR/homepage_${TIMESTAMP}.png" \
  -H "X-API-Key: $API_KEY" \
  "https://app.snap-render.com/v1/screenshot?url=https://yoursite.com&format=png&width=1280&height=720&cache_ttl=0"

# Optional: delete screenshots older than 7 days
find "$OUTPUT_DIR" -name "*.png" -mtime +7 -delete

The cache_ttl=0 parameter tells SnapRender to always take a fresh capture instead of serving a cached version. Important for monitoring.

Add the crontab entry:

crontab -e
0 * * * * SNAPRENDER_API_KEY="sk_live_your_key" /bin/bash $HOME/scripts/hourly-screenshot.sh >> $HOME/logs/screenshot-cron.log 2>&1

That runs at minute 0 of every hour. The log file catches any errors.

Monitoring Script: Detect Visual Changes

This script captures today's screenshot, compares it to yesterday's using ImageMagick, and sends an alert if the visual difference exceeds a threshold. Useful for detecting broken layouts, defacements, or unintended deploy changes.

Save as ~/scripts/visual-monitor.sh:

#!/usr/bin/env bash
set -euo pipefail

URL="https://yoursite.com"
THRESHOLD=5          # percentage difference that triggers an alert
WORK_DIR="$HOME/screenshots/monitor"
API_KEY="${SNAPRENDER_API_KEY}"

mkdir -p "$WORK_DIR"

TODAY="$WORK_DIR/today.png"
YESTERDAY="$WORK_DIR/yesterday.png"
DIFF_IMG="$WORK_DIR/diff.png"

# Rotate: move today's capture to yesterday
[[ -f "$TODAY" ]] && mv "$TODAY" "$YESTERDAY"

# Capture fresh screenshot
curl -sf -o "$TODAY" \
  -H "X-API-Key: $API_KEY" \
  "https://app.snap-render.com/v1/screenshot?url=${URL}&format=png&width=1280&height=720&cache_ttl=0"

if [[ ! -f "$YESTERDAY" ]]; then
  echo "First run. No comparison available yet."
  exit 0
fi

# Compare using ImageMagick
# AE = Absolute Error (number of different pixels)
diff_pixels=$(compare -metric AE "$YESTERDAY" "$TODAY" "$DIFF_IMG" 2>&1 || true)
# ImageMagick may report large counts in scientific notation (e.g. 1.2e+06);
# normalize to a plain integer so bc can handle it
diff_pixels=$(printf '%.0f' "$diff_pixels")

# Get total pixels for percentage calculation
total_pixels=$(identify -format '%[fx:w*h]' "$TODAY")

if [[ "$total_pixels" -gt 0 ]]; then
  diff_pct=$(echo "scale=2; $diff_pixels * 100 / $total_pixels" | bc)
else
  diff_pct=0
fi

echo "$(date): Changed pixels: $diff_pixels / $total_pixels ($diff_pct%)"

# Alert if above threshold
if (( $(echo "$diff_pct > $THRESHOLD" | bc -l) )); then
  echo "ALERT: Visual change of ${diff_pct}% detected on $URL"
  echo "Diff image saved to $DIFF_IMG"

  # Uncomment one of these to get notified:
  # Slack webhook:
  # curl -s -X POST -H 'Content-type: application/json' \
  #   --data "{\"text\":\"Visual change of ${diff_pct}% detected on $URL\"}" \
  #   "$SLACK_WEBHOOK_URL"

  # Email (requires mailutils):
  # echo "Visual change of ${diff_pct}% on $URL" | mail -s "Visual Alert" [email protected]
fi

Run it daily via cron:

30 9 * * * SNAPRENDER_API_KEY="sk_live_your_key" /bin/bash $HOME/scripts/visual-monitor.sh >> $HOME/logs/visual-monitor.log 2>&1

You need ImageMagick installed (brew install imagemagick on macOS, apt install imagemagick on Ubuntu). The script produces a diff image that highlights exactly which pixels changed, which is handy for debugging.

Integration with Other CLI Tools

The real power of cURL-based screenshots is piping into other command-line tools.

Resize with ImageMagick to create a thumbnail (on ImageMagick 7, use magick in place of convert):

curl -s -H "X-API-Key: $SNAPRENDER_API_KEY" \
  "https://app.snap-render.com/v1/screenshot?url=https://github.com&format=png&width=1280&height=720" \
  | convert - -resize 300x thumbnail.png

Optimize PNG with optipng:

curl -s -o raw.png \
  -H "X-API-Key: $SNAPRENDER_API_KEY" \
  "https://app.snap-render.com/v1/screenshot?url=https://github.com&format=png&width=1280&height=720" \
  && optipng -o5 raw.png

Upload directly to S3:

curl -s -H "X-API-Key: $SNAPRENDER_API_KEY" \
  "https://app.snap-render.com/v1/screenshot?url=https://github.com&format=png&width=1280&height=720" \
  | aws s3 cp - s3://my-bucket/screenshots/github.png --content-type image/png

Convert to base64 for embedding (useful in JSON payloads or data URIs):

curl -s -H "X-API-Key: $SNAPRENDER_API_KEY" \
  "https://app.snap-render.com/v1/screenshot?url=https://github.com&format=png&width=1280&height=720" \
  | base64

macOS-Specific Tips

Preview the screenshot immediately after capture:

curl -s -o /tmp/shot.png \
  -H "X-API-Key: $SNAPRENDER_API_KEY" \
  "https://app.snap-render.com/v1/screenshot?url=https://github.com&format=png&width=1280&height=720" \
  && open /tmp/shot.png

open launches Preview.app (or whatever your default PNG viewer is).

Quick Shell Function (add to ~/.zshrc):

webshot() {
  local url="${1:?Usage: webshot <url> [output.png]}"
  local output="${2:-screenshot.png}"
  curl -s -o "$output" \
    -H "X-API-Key: $SNAPRENDER_API_KEY" \
    "https://app.snap-render.com/v1/screenshot?url=${url}&format=png&width=1280&height=720"
  echo "Saved to $output"
  open "$output"
}

Now you can type webshot https://github.com from any terminal.

Copy screenshot to clipboard (paste directly into Slack, Notion, etc.):

curl -s -o /tmp/shot.png \
  -H "X-API-Key: $SNAPRENDER_API_KEY" \
  "https://app.snap-render.com/v1/screenshot?url=https://github.com&format=png&width=1280&height=720" \
  && osascript -e 'set the clipboard to (read (POSIX file "/tmp/shot.png") as «class PNGf»)'

After running this, Cmd+V pastes the screenshot anywhere that accepts images.

Windows PowerShell Equivalent

For Windows developers who aren't using WSL or Git Bash, here's the PowerShell version:

$ApiKey = $env:SNAPRENDER_API_KEY
$Url = "https://example.com"
$OutputFile = "screenshot.png"

$headers = @{
    "X-API-Key" = $ApiKey
}

$uri = "https://app.snap-render.com/v1/screenshot?url=$([uri]::EscapeDataString($Url))&format=png&width=1280&height=720"

Invoke-WebRequest -Uri $uri -Headers $headers -OutFile $OutputFile

Write-Host "Saved to $OutputFile"

Batch capture in PowerShell:

$ApiKey = $env:SNAPRENDER_API_KEY
$urls = Get-Content "urls.txt" | Where-Object { $_ -and $_ -notmatch "^#" }
$outputDir = "screenshots"

New-Item -ItemType Directory -Force -Path $outputDir | Out-Null

foreach ($url in $urls) {
    $name = $url -replace 'https?://', '' -replace '[^a-zA-Z0-9]', '_'
    $filename = $name.Substring(0, [Math]::Min(80, $name.Length))
    $uri = "https://app.snap-render.com/v1/screenshot?url=$([uri]::EscapeDataString($url))&format=png&width=1280&height=720"

    try {
        Invoke-WebRequest -Uri $uri -Headers @{"X-API-Key" = $ApiKey} -OutFile "$outputDir\$filename.png"
        Write-Host "OK: $url"
    } catch {
        Write-Host "FAIL: $url - $($_.Exception.Message)"
    }

    Start-Sleep -Milliseconds 500
}

Quick Reference: Useful One-Liners

Five recipes you can grab and use right now.

1. Screenshot with HTTP status check:

curl -s -o shot.png -w "HTTP %{http_code} in %{time_total}s\n" \
  -H "X-API-Key: $SNAPRENDER_API_KEY" \
  "https://app.snap-render.com/v1/screenshot?url=https://example.com&format=png&width=1280&height=720"

2. Mobile + desktop side by side (requires ImageMagick):

curl -s -H "X-API-Key: $SNAPRENDER_API_KEY" \
  "https://app.snap-render.com/v1/screenshot?url=https://github.com&format=png&width=375&height=812&device_scale_factor=2" \
  -o mobile.png && \
curl -s -H "X-API-Key: $SNAPRENDER_API_KEY" \
  "https://app.snap-render.com/v1/screenshot?url=https://github.com&format=png&width=1280&height=720" \
  -o desktop.png && \
convert mobile.png desktop.png +append comparison.png

3. Capture and get file size:

curl -s -o shot.png -H "X-API-Key: $SNAPRENDER_API_KEY" \
  "https://app.snap-render.com/v1/screenshot?url=https://example.com&format=png&width=1280&height=720" \
  && ls -lh shot.png

4. Timestamped captures in a loop:

while true; do
  curl -s -o "shot_$(date +%H%M%S).png" \
    -H "X-API-Key: $SNAPRENDER_API_KEY" \
    "https://app.snap-render.com/v1/screenshot?url=https://yoursite.com&format=png&width=1280&height=720&cache_ttl=0"
  sleep 3600
done

5. Capture as WebP and check compression ratio vs PNG:

curl -s -o shot.png -H "X-API-Key: $SNAPRENDER_API_KEY" \
  "https://app.snap-render.com/v1/screenshot?url=https://example.com&format=png&width=1280&height=720"
curl -s -o shot.webp -H "X-API-Key: $SNAPRENDER_API_KEY" \
  "https://app.snap-render.com/v1/screenshot?url=https://example.com&format=webp&width=1280&height=720"
echo "PNG: $(stat -f%z shot.png) bytes, WebP: $(stat -f%z shot.webp) bytes"

(On Linux, replace stat -f%z with stat -c%s.)
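If a script has to run on both Linux and macOS, a tiny wrapper can hide the GNU/BSD stat difference — a sketch (the name filesize is ours):

```shell
# filesize: print a file's size in bytes; tries GNU stat first, falls
# back to BSD stat on macOS
filesize() {
  stat -c%s "$1" 2>/dev/null || stat -f%z "$1"
}

printf 'hello' > /tmp/size-demo.txt
filesize /tmp/size-demo.txt   # → 5
```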

Troubleshooting

Empty or zero-byte files: Usually means the API returned an error response that cURL wrote to the output file. Add -w "%{http_code}" to see the status code, or temporarily remove -o to see the response body in your terminal.
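One way to make that failure visible instead of silent is a wrapper that keeps the file only on HTTP 200 — a sketch, assuming the API writes a readable error body on failure (the name capture_checked is ours):

```shell
# capture_checked: save the screenshot only on HTTP 200; on any other
# status, print the error body (which curl wrote to the file) and clean up
capture_checked() {
  local url="$1" out="$2" code
  code=$(curl -s -o "$out" -w "%{http_code}" \
    -H "X-API-Key: $SNAPRENDER_API_KEY" \
    "https://app.snap-render.com/v1/screenshot?url=${url}&format=png&width=1280&height=720")
  if [ "$code" != "200" ]; then
    echo "HTTP $code from API:" >&2
    cat "$out" >&2
    rm -f "$out"
    return 1
  fi
}

# usage: capture_checked "https://example.com" shot.png
```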

URL encoding issues: If your target URL contains query parameters (like ?tab=stars), encode the & characters as %26 inside the url= parameter, or use --data-urlencode:

curl -s -G -o shot.png \
  -H "X-API-Key: $SNAPRENDER_API_KEY" \
  --data-urlencode "url=https://github.com/trending?since=daily&spoken_language_code=en" \
  -d "format=png" -d "width=1280" -d "height=720" \
  "https://app.snap-render.com/v1/screenshot"

The -G flag tells cURL to append the --data-urlencode values as query parameters instead of sending them as a POST body.

Slow captures: Some pages take 5+ seconds to fully render (heavy SPAs, sites with lots of third-party scripts). If cURL times out, increase the timeout with --max-time 30. The API handles the rendering wait internally, but your cURL connection needs to stay open long enough to receive the result.
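A conservative set of timeout and retry flags for slow pages, wrapped as a function — a sketch (the name capture_patient is ours; --retry covers transient failures like timeouts and 5xx responses):

```shell
# capture_patient: allow up to 30s per attempt and retry transient
# failures twice, with a short pause between attempts
capture_patient() {
  local url="$1" out="$2"
  curl -sf --max-time 30 --retry 2 --retry-delay 2 -o "$out" \
    -H "X-API-Key: $SNAPRENDER_API_KEY" \
    "https://app.snap-render.com/v1/screenshot?url=${url}&format=png&width=1280&height=720"
}

# usage: capture_patient "https://heavy-spa.example.com" slow.png
```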

Every parameter the API supports works from cURL. Check the full parameter list in the API docs. The free tier gives you 500 captures per month, which is plenty for testing and personal scripts. If you're running batch jobs or monitoring crons, the Starter plan at $9/month covers 2,000 captures.


Try SnapRender Free

500 free screenshots/month, no credit card required.

Sign up free