Tutorial

Build a Competitor Price Tracker

15 min read

Monitor competitor prices automatically with a simple stack: SnapRender for extraction, cron for scheduling, and SQLite for storage. This tutorial walks you through every step with complete, copy-paste-ready Python code.

The architecture

The price tracker has three components, each chosen for simplicity:

1. SnapRender API: extracts product names and prices from any URL using CSS selectors. Handles JS rendering and Cloudflare bypass.

2. Cron: runs the scraper on a schedule (twice daily, hourly, whatever you need). Zero dependencies, built into every Linux/Mac system.

3. SQLite: stores price history in a single file. No database server to install or manage. Query with standard SQL.

Step 1: Set up the database

Create a SQLite database with a single table to store price snapshots. Each row records the product URL, name, price, and the timestamp of the check.

setup.py
import">
import sqlite3

# 1. Set up the SQLite database with a single price-history table
db = sqlite3.connect('prices.db')
db.execute('''
  CREATE TABLE IF NOT EXISTS prices (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    product_url TEXT NOT NULL,
    product_name TEXT,
    price REAL,
    currency TEXT DEFAULT 'USD',
    checked_at TEXT NOT NULL
  )
''')
db.commit()
db.close()

Step 2: Extract prices with SnapRender

Define your product URLs and the CSS selectors for the price and name elements. SnapRender's /extract endpoint renders the page in a real browser and returns the text content of each selector.

track.py
import sqlite3
import requests
from datetime import datetime

API_KEY = "sr_live_YOUR_KEY"
PRODUCTS = [
    {
        "url": "https://www.example.com/product/widget-pro",
        "selectors": {
            "name": "h1.product-title",
            "price": "span.price-current",
        }
    },
    {
        "url": "https://www.example.com/product/widget-lite",
        "selectors": {
            "name": "h1.product-title",
            "price": "span.price-current",
        }
    },
]

def check_prices():
    db = sqlite3.connect('prices.db')
    now = datetime.now().strftime('%Y-%m-%d %H:%M:%S')

    for product in PRODUCTS:
        resp = requests.post(
            "https://api.snaprender.dev/v1/extract",
            headers={"x-api-key": API_KEY},
            json={
                "url": product["url"],
                "selectors": product["selectors"],
                "use_flaresolverr": True
            }
        )
        resp.raise_for_status()
        data = resp.json().get("data", {})

        # Parse a price string like "$1,299.00" into a float
        price_str = data.get("price", "0")
        price = float(
            price_str.replace("$", "")
                     .replace(",", "")
                     .strip()
        )

        db.execute(
            "INSERT INTO prices (product_url, product_name, price, checked_at) VALUES (?, ?, ?, ?)",
            (product["url"], data.get("name", "Unknown"), price, now)
        )
        print(f"  {data.get('name')}: ${price:.2f}")

    db.commit()
    db.close()
    print(f"Checked {len(PRODUCTS)} products at {now}")

check_prices()

Adjust selectors per site

Every e-commerce site uses different HTML. Inspect the product page in your browser's DevTools to find the right CSS selectors for the price and product name. Common patterns: .price, [data-price], #priceblock.
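Price formats vary as much as selectors do: currency symbols, thousands separators, and European decimal commas all show up in the wild. Here is a sketch of a more forgiving parser than the simple $-stripping used above; `parse_price_text` is a helper name introduced here for illustration, not part of the tutorial's scripts:

```python
import re

def parse_price_text(raw: str) -> float:
    """Best-effort conversion of a scraped price string to a float.

    Handles currency symbols, thousands separators, and European
    decimal commas ("1.299,00"). Adapt to the formats your target
    sites actually use.
    """
    # Keep only digits and separator characters
    cleaned = re.sub(r"[^\d.,]", "", raw.strip())
    if not cleaned:
        return 0.0
    # If the last separator is a comma, treat it as the decimal point
    if "," in cleaned and cleaned.rfind(",") > cleaned.rfind("."):
        cleaned = cleaned.replace(".", "").replace(",", ".")
    else:
        cleaned = cleaned.replace(",", "")
    return float(cleaned)
```

With this in place, `parse_price_text("$1,299.00")` and `parse_price_text("€1.299,00")` both come back as the same float.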

Step 3: Schedule with cron

Set up a cron job to run the tracker automatically. Edit your crontab and add the schedule:

crontab
# Run every day at 8am and 6pm
# Edit with: crontab -e

0 8 * * * cd /home/user/price-tracker && python3 track.py >> /var/log/price-tracker.log 2>&1
0 18 * * * cd /home/user/price-tracker && python3 track.py >> /var/log/price-tracker.log 2>&1

This runs the price check at 8 AM and 6 PM every day, logging output to a file. Adjust the schedule to match your monitoring needs.
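The two entries can also be collapsed into one line using cron's comma-separated hour list, assuming the same paths:

```shell
# Equivalent single entry: run at 8 AM and 6 PM every day
0 8,18 * * * cd /home/user/price-tracker && python3 track.py >> /var/log/price-tracker.log 2>&1
```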

Step 4: Add price drop alerts

Compare the latest price with the previous check. If it drops below your threshold, trigger an alert (email, Slack, webhook — whatever your team uses).

alerts.py
import">
import sqlite3

def check_price_drops(threshold_pct=5):
    db = sqlite3.connect('prices.db')
    cursor = db.cursor()

    # Get the latest two prices for each product
    cursor.execute('''
        SELECT p1.product_name, p1.price, p2.price,
               p1.product_url
        FROM prices p1
        JOIN prices p2 ON p1.product_url = p2.product_url
        WHERE p1.id = (
            SELECT id FROM prices
            WHERE product_url = p1.product_url
            ORDER BY checked_at DESC LIMIT 1
        )
        AND p2.id = (
            SELECT id FROM prices
            WHERE product_url = p1.product_url
            ORDER BY checked_at DESC LIMIT 1 OFFSET 1
        )
    ''')

    for name, current, previous, url in cursor.fetchall():
        if previous > 0:
            change_pct = ((current - previous) / previous) * 100
            if change_pct < -threshold_pct:
                print(f"ALERT: {name} dropped {abs(change_pct):.1f}%")
                print(f"  Was: ${previous:.2f} -> Now: ${current:.2f}")
                print(f"  URL: {url}")
                # Send email/Slack/webhook here

    db.close()

check_price_drops()
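As one concrete option for that alert hook, here is a hedged sketch of posting to a Slack incoming webhook using only the standard library. The webhook URL is a placeholder, and `format_alert`/`send_slack_alert` are names introduced here, not part of the tutorial's scripts:

```python
import json
import urllib.request

# Placeholder: replace with your own Slack incoming-webhook URL
SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/T000/B000/XXXX"

def format_alert(name, previous, current, url):
    """Build the Slack message payload for a price-drop alert."""
    drop_pct = (previous - current) / previous * 100
    return {
        "text": (
            f":chart_with_downwards_trend: *{name}* dropped {drop_pct:.1f}%\n"
            f"Was ${previous:.2f}, now ${current:.2f}\n{url}"
        )
    }

def send_slack_alert(name, previous, current, url):
    """POST the alert payload to the Slack webhook."""
    payload = json.dumps(format_alert(name, previous, current, url)).encode()
    req = urllib.request.Request(
        SLACK_WEBHOOK_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)
```

Call `send_slack_alert(name, previous, current, url)` from inside the `if change_pct < -threshold_pct:` branch above.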

Complete script

Here is the full, self-contained price tracker. Copy it, add your API key and product URLs, and run it.

track.py
#!/usr/bin/env python3
"""
Competitor Price Tracker
Usage: python3 track.py
Schedule: crontab -e -> 0 8,18 * * * cd /path && python3 track.py
"""
import sqlite3
import requests
from datetime import datetime

API_KEY = "sr_live_YOUR_KEY"

PRODUCTS = [
    {
        "url": "https://www.competitor-a.com/product/123",
        "selectors": {"name": "h1", "price": ".price"}
    },
    {
        "url": "https://www.competitor-b.com/item/456",
        "selectors": {"name": ".product-name", "price": "#current-price"}
    },
]

def init_db():
    db = sqlite3.connect('prices.db')
    db.execute('''CREATE TABLE IF NOT EXISTS prices (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        product_url TEXT, product_name TEXT,
        price REAL, checked_at TEXT
    )''')
    db.commit()
    return db

def scrape_price(product):
    resp = requests.post(
        "https://api.snaprender.dev/v1/extract",
        headers={"x-api-key": API_KEY},
        json={
            "url": product["url"],
            "selectors": product["selectors"],
            "use_flaresolverr": True
        }
    )
    resp.raise_for_status()
    return resp.json().get("data", {})

def parse_price(price_str):
    if not price_str:
        return 0.0
    return float(
        price_str.replace("$", "").replace(",", "").strip()
    )

def main():
    db = init_db()
    now = datetime.now().strftime('%Y-%m-%d %H:%M:%S')
    print(f"Price check at {now}")

    for product in PRODUCTS:
        data = scrape_price(product)
        price = parse_price(data.get("price"))
        name = data.get("name", "Unknown")

        # Name the columns explicitly so the insert also works against
        # the Step 1 schema, which has an extra currency column
        db.execute(
            "INSERT INTO prices (product_url, product_name, price, checked_at) VALUES (?, ?, ?, ?)",
            (product["url"], name, price, now)
        )
        print(f"  {name}: ${price:.2f}")

    db.commit()
    db.close()

if __name__ == "__main__":
    main()

Start tracking prices today

Get your API key in 30 seconds. 100 free requests/month — enough to track 3 products daily for a full month. No credit card required.

Get Your API Key

Frequently asked questions

How often should I check prices?

For most e-commerce products, once or twice daily is sufficient. More frequent checks waste API requests without adding meaningful data. For high-velocity markets (airline tickets, crypto), you might check hourly. SnapRender's $9/mo plan gives you 1,500 requests, enough to track 25 products twice daily (or 50 products once daily) for an entire month.

Can it handle Cloudflare-protected sites?

Yes. Add use_flaresolverr: true to your SnapRender request. This routes through a real browser session that passes Cloudflare challenges. No proxy management needed on your end.

Is SQLite really enough for this?

For tracking up to a few thousand products, SQLite is excellent — zero setup, no server to manage, and fast reads. If you grow beyond 10,000+ products with concurrent writes, consider PostgreSQL. But start simple.
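Querying the history stays plain SQL either way. A small helper sketch, assuming the prices table from Step 1 (`price_history` is a name introduced here for illustration):

```python
import sqlite3

def price_history(db, product_url, days=30):
    """Return (checked_at, price) rows for one product, oldest first."""
    return db.execute(
        """
        SELECT checked_at, price FROM prices
        WHERE product_url = ?
          AND checked_at >= datetime('now', ?)
        ORDER BY checked_at ASC
        """,
        (product_url, f"-{days} days"),
    ).fetchall()
```

Used as `price_history(sqlite3.connect('prices.db'), PRODUCTS[0]["url"])`, it gives you the raw series for a chart or a moving average.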

What happens when a site changes its HTML?

This is the main maintenance burden of price tracking. SnapRender's markdown extraction (format: "markdown") is more resilient than CSS selectors because it captures the full page content. You can then parse prices from the markdown with regex or an LLM, which survives layout changes better than selectors.
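If you go the markdown route, a regex pass can pull candidate prices out of the returned text. A minimal sketch (`find_prices_in_markdown` is a name introduced for illustration, and the pattern only covers $-prefixed, US-formatted prices):

```python
import re

def find_prices_in_markdown(markdown: str) -> list[float]:
    """Find $-prefixed prices anywhere in extracted markdown text."""
    # Matches "$49", "$1,299.00", etc.; extend for other currencies
    matches = re.findall(r"\$(\d{1,3}(?:,\d{3})*(?:\.\d{2})?)", markdown)
    return [float(m.replace(",", "")) for m in matches]
```

On a typical product page you will get several candidates (current price, struck-through list price), so you still need a rule for picking the right one, e.g. the first match after the product title.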