Rank tracking is the bread and butter of SEO. Tools like ProRankTracker and Keyword.com are excellent—they offer beautiful dashboards, daily updates, and white-label reports.
But they also charge a premium. Monitoring just 500 keywords can cost you $50/month.
What if I told you that the underlying data for those 500 keywords costs less than $0.30 to fetch via API?
In this guide, we will build a “Minimum Viable Rank Tracker” using Python and SearchCans that gives you the same accuracy at a fraction of the price.
## The Economics of Rank Tracking
Most SaaS rank trackers are essentially wrappers around a SERP API.
- They take your keyword.
- They send it to Google via an API/proxy.
- They parse the position of your URL.
- They charge you a markup for the UI.
If you don’t need the fancy UI, you can skip the markup. With SearchCans charging $0.56 per 1,000 searches, tracking 500 keywords daily costs:
- 500 keywords × 30 days = 15,000 requests
- 15,000 requests ÷ 1,000 × $0.56 = **$8.40 per month**
Compare that to the $100+ you’d pay for an enterprise plan on other platforms. For more cost comparisons, see our detailed pricing analysis.
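That arithmetic is worth sanity-checking in code. Here is a quick sketch using the $0.56-per-1,000 rate from above; `monthly_cost` and its parameters are just illustrative names, so plug in your own workload:

```python
# Back-of-the-envelope cost model for DIY rank tracking.
RATE_PER_1000 = 0.56  # SearchCans price per 1,000 searches (USD)

def monthly_cost(keywords: int, checks_per_day: int = 1, days: int = 30) -> float:
    """Estimated monthly API spend for a given tracking workload."""
    requests_per_month = keywords * checks_per_day * days
    return requests_per_month / 1000 * RATE_PER_1000

print(f"${monthly_cost(500):.2f}")  # 500 keywords, checked daily -> $8.40
```

Doubling the check frequency or keyword count scales the bill linearly, which makes capacity planning trivial.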
## Build It Yourself (Python Code)
Here is a script that checks a list of keywords and saves your ranking position to a CSV file.
### Basic Rank Checker
```python
import requests
import csv
import datetime

API_KEY = "YOUR_SEARCHCANS_KEY"
MY_DOMAIN = "searchcans.com"
KEYWORDS = ["serp api pricing", "google search api", "best rank tracker"]

def check_rank(keyword):
    url = "https://www.searchcans.com/api/search"
    payload = {"s": keyword, "t": "google", "d": 100}  # Get top 100 results
    headers = {"Authorization": f"Bearer {API_KEY}"}
    try:
        data = requests.post(url, json=payload, headers=headers, timeout=30).json()
        if data['code'] == 0:
            for index, item in enumerate(data['data']):
                if MY_DOMAIN in item['url']:
                    return index + 1  # Rank found!
        return 0  # Not in top 100
    except (requests.RequestException, ValueError):
        return -1  # Network or parsing error

# Run the tracker
results = []
today = datetime.date.today()
print(f"📉 Checking ranks for {today}...")

for kw in KEYWORDS:
    rank = check_rank(kw)
    results.append([today, kw, rank])
    print(f"Keyword: {kw} | Rank: {rank}")

# Append today's results to CSV
with open('rankings.csv', 'a', newline='') as f:
    writer = csv.writer(f)
    writer.writerows(results)
```
### Advanced: Multi-Domain Tracking
Track your site and competitors:
```python
def check_multiple_domains(keyword, domains):
    url = "https://www.searchcans.com/api/search"
    payload = {"s": keyword, "t": "google", "d": 100}
    headers = {"Authorization": f"Bearer {API_KEY}"}
    response = requests.post(url, json=payload, headers=headers, timeout=30)
    data = response.json()

    rankings = {domain: 0 for domain in domains}  # 0 = not in top 100
    if data['code'] == 0:
        for index, item in enumerate(data['data']):
            for domain in domains:
                if domain in item['url'] and rankings[domain] == 0:
                    rankings[domain] = index + 1  # keep first (best) position only
    return rankings

# Usage
competitors = ["searchcans.com", "serpapi.com", "zenserp.com"]
results = check_multiple_domains("serp api", competitors)
print(results)
# Example output: {'searchcans.com': 5, 'serpapi.com': 3, 'zenserp.com': 12}
```
## Building a Complete Tracker

### 1. Database Storage
Store historical data for trend analysis:
```python
import sqlite3
from datetime import datetime

def init_database():
    conn = sqlite3.connect('ranks.db')
    c = conn.cursor()
    c.execute('''
        CREATE TABLE IF NOT EXISTS rankings (
            id INTEGER PRIMARY KEY,
            date TEXT,
            keyword TEXT,
            domain TEXT,
            rank INTEGER,
            url TEXT,
            title TEXT
        )
    ''')
    conn.commit()
    conn.close()

def save_ranking(keyword, domain, rank, url, title):
    conn = sqlite3.connect('ranks.db')
    c = conn.cursor()
    c.execute('''
        INSERT INTO rankings (date, keyword, domain, rank, url, title)
        VALUES (?, ?, ?, ?, ?, ?)
    ''', (datetime.now().date().isoformat(), keyword, domain, rank, url, title))
    conn.commit()
    conn.close()

def get_rank_history(keyword, domain, days=30):
    conn = sqlite3.connect('ranks.db')
    c = conn.cursor()
    c.execute('''
        SELECT date, rank FROM rankings
        WHERE keyword=? AND domain=?
        ORDER BY date DESC
        LIMIT ?
    ''', (keyword, domain, days))
    results = c.fetchall()
    conn.close()
    return results
```
### 2. Trend Analysis
```python
def analyze_trends(keyword, domain):
    history = get_rank_history(keyword, domain, 30)
    if len(history) < 2:
        return "Insufficient data"

    current_rank = history[0][1]
    previous_rank = history[1][1]

    if current_rank < previous_rank:
        change = previous_rank - current_rank
        return f"⬆️ Improved by {change} positions"
    elif current_rank > previous_rank:
        change = current_rank - previous_rank
        return f"⬇️ Dropped by {change} positions"
    else:
        return "➡️ No change"

# Calculate average rank over time (ignoring days the site was unranked)
def get_average_rank(keyword, domain, days=30):
    history = get_rank_history(keyword, domain, days)
    ranks = [r[1] for r in history if r[1] > 0]
    return sum(ranks) / len(ranks) if ranks else 0
```
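Because lower rank numbers are better, the direction logic above is easy to get backwards. A tiny synthetic check makes it concrete; the two-row history here just mimics the `(date, rank)` tuples `get_rank_history` returns, newest first:

```python
# Synthetic history: (date, rank) tuples, newest first.
history = [("2025-01-08", 4), ("2025-01-07", 9)]
current, previous = history[0][1], history[1][1]

# Lower rank numbers are better, so current < previous means an improvement.
print("improved by", previous - current, "positions")  # improved by 5 positions
```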
### 3. Alerting System
```python
import smtplib
from email.mime.text import MIMEText

def get_last_rank(keyword, domain):
    """Most recent stored rank for a keyword/domain pair, or None."""
    history = get_rank_history(keyword, domain, days=1)
    return history[0][1] if history else None

def send_alert(keyword, old_rank, new_rank):
    if new_rank == 0:
        subject = f"🚨 ALERT: {keyword} dropped out of top 100!"
    elif new_rank > old_rank + 5:
        subject = f"⚠️ WARNING: {keyword} dropped {new_rank - old_rank} positions"
    else:
        return  # No alert needed

    msg = MIMEText(f"Keyword: {keyword}\nOld Rank: {old_rank}\nNew Rank: {new_rank}")
    msg['Subject'] = subject
    msg['From'] = 'alerts@yoursite.com'
    msg['To'] = 'seo@yoursite.com'

    with smtplib.SMTP('smtp.gmail.com', 587) as server:
        server.starttls()
        server.login('your_email', 'your_app_password')  # use an app password, not your account password
        server.send_message(msg)

def track_with_alerts(keywords, domain):
    for keyword in keywords:
        old_rank = get_last_rank(keyword, domain)
        new_rank = check_rank(keyword)
        if old_rank and new_rank > 0:
            if abs(new_rank - old_rank) > 5:
                send_alert(keyword, old_rank, new_rank)
        save_ranking(keyword, domain, new_rank, '', '')  # url/title omitted here
```
## Scheduling Automated Checks

**Using cron (Linux/Mac):**

```bash
# Check ranks daily at 9 AM
0 9 * * * /usr/bin/python3 /path/to/rank_tracker.py
```

**Using Task Scheduler (Windows):** create a batch file `run_tracker.bat`:

```bat
python C:\path\to\rank_tracker.py
```

Schedule it to run daily.
**Using Python Schedule:**

```python
import schedule
import time

def job():
    print("Running rank check...")
    track_with_alerts(KEYWORDS, MY_DOMAIN)  # the tracking routine from above

# Schedule daily at 9 AM
schedule.every().day.at("09:00").do(job)

while True:
    schedule.run_pending()
    time.sleep(60)
```
## Visualization Dashboard
Create a simple Flask web dashboard:
```python
from flask import Flask, render_template
import sqlite3

app = Flask(__name__)

@app.route('/')
def dashboard():
    conn = sqlite3.connect('ranks.db')
    c = conn.cursor()
    # Get latest rankings
    c.execute('''
        SELECT keyword, rank, date
        FROM rankings
        WHERE date = (SELECT MAX(date) FROM rankings)
        ORDER BY rank
    ''')
    current_ranks = c.fetchall()
    conn.close()
    return render_template('dashboard.html', ranks=current_ranks)

@app.route('/keyword/<keyword>')
def keyword_detail(keyword):
    history = get_rank_history(keyword, MY_DOMAIN, 90)
    return render_template('keyword.html', keyword=keyword, history=history)

if __name__ == '__main__':
    app.run(debug=True)
```
## Enhancing Your Tracker

### 1. Location-Specific Tracking

```python
def check_rank_by_location(keyword, location):
    payload = {
        "s": keyword,
        "t": "google",
        "d": 100,
        "location": location  # e.g., "New York, US"
    }
    # ... rest of the code
```
### 2. Mobile vs Desktop Rankings

```python
def check_mobile_rank(keyword):
    payload = {
        "s": keyword,
        "t": "google",
        "d": 100,
        "device": "mobile"
    }
    # ... rest of the code
```
### 3. SERP Features Tracking

```python
def track_serp_features(keyword):
    data = get_serp_data(keyword)  # your wrapper around the search endpoint
    features = {
        'featured_snippet': False,
        'people_also_ask': False,
        'local_pack': False,
        'knowledge_panel': False
    }
    for result in data:
        if result.get('type') == 'featured_snippet':
            features['featured_snippet'] = True
        # ... check other features
    return features
```
For more on building SEO tools, see our comprehensive guide.
## Cost Comparison
| Solution | Monthly Cost (500 keywords) | Features |
|---|---|---|
| Ahrefs | $99+ | Full suite, UI, reports |
| SEMrush | $119+ | Full suite, UI, reports |
| ProRankTracker | $49+ | Rank tracking only |
| DIY with SearchCans | $8.40 | Custom code, full control |
## Automated Reporting
Generate weekly reports:
```python
import matplotlib.pyplot as plt

def generate_weekly_report(keywords, domain):
    report = []
    for keyword in keywords:
        history = get_rank_history(keyword, domain, 7)
        if history:
            current = history[0][1]
            week_ago = history[-1][1] if len(history) >= 7 else current
            change = week_ago - current  # positive = moved up the rankings
            report.append({
                'keyword': keyword,
                'current_rank': current,
                'change': change,
                'trend': '⬆️' if change > 0 else ('⬇️' if change < 0 else '➡️')
            })
    return report

# Visualize with matplotlib
def plot_rank_history(keyword, domain):
    history = get_rank_history(keyword, domain, 30)
    dates = [h[0] for h in history]
    ranks = [h[1] for h in history]

    plt.figure(figsize=(10, 6))
    plt.plot(dates, ranks, marker='o')
    plt.title(f'Rank History: {keyword}')
    plt.xlabel('Date')
    plt.ylabel('Rank')
    plt.gca().invert_yaxis()  # Lower rank is better
    plt.grid(True)
    plt.savefig(f'{keyword}_history.png')
```
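To turn the weekly report into something readable in an email or Slack message, a small formatter helps. This is a sketch assuming the list-of-dicts shape `generate_weekly_report` produces; `format_report` and the sample rows are illustrative:

```python
def format_report(report):
    """Render report rows (dicts with keyword/current_rank/change/trend) as a text table."""
    lines = [f"{'Keyword':<25}{'Rank':>6}{'Change':>8}  Trend"]
    for row in report:
        lines.append(
            f"{row['keyword']:<25}{row['current_rank']:>6}{row['change']:>+8}  {row['trend']}"
        )
    return "\n".join(lines)

# Sample data in the same shape generate_weekly_report returns
sample = [
    {'keyword': 'serp api', 'current_rank': 5, 'change': 2, 'trend': '⬆️'},
    {'keyword': 'rank tracker', 'current_rank': 12, 'change': -1, 'trend': '⬇️'},
]
print(format_report(sample))
```

The `+` in the format spec prints explicit `+2` / `-1` changes, which reads better in a plain-text digest.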
For no-code solutions, check out our Google Sheets automation guide.
## Conclusion
SaaS tools are great for convenience, but for raw data efficiency, nothing beats a custom script connected to a wholesale API.
Start building your SEO toolkit today. For more advanced features, explore our documentation or check out our pricing for transparent rates.