If your proxy requests are slower than expected, use this guide to identify and resolve performance bottlenecks.

Check your baseline

First, test raw proxy latency:
curl -x http://myuser:mypass@192.168.1.1:3128 \
  -w "DNS: %{time_namelookup}s\nConnect: %{time_connect}s\nTLS: %{time_appconnect}s\nTotal: %{time_total}s\n" \
  -o /dev/null -s https://httpbin.org/ip
Stat Proxies are located in Ashburn, VA. Expected latency to most US targets is under 100ms.

Common causes and fixes

Symptom: time_namelookup is above 200ms.
Fix:
  • Use a fast DNS resolver (Google: 8.8.8.8, Cloudflare: 1.1.1.1)
  • Cache DNS lookups in your application
  • Resolve target hostnames once and reuse the IP
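The resolve-once pattern above can be sketched with the standard library. This is a minimal illustration, not a production resolver: `functools.lru_cache` ignores DNS TTLs, so a long-running process should refresh entries periodically.

```python
# Sketch: resolve a hostname once and cache the result so repeated
# requests skip the DNS lookup entirely.
# Note: lru_cache ignores DNS TTLs -- refresh periodically in long-running jobs.
import socket
from functools import lru_cache

@lru_cache(maxsize=256)
def resolve(hostname: str) -> str:
    """Return the first IPv4 address for hostname, cached for reuse."""
    return socket.gethostbyname(hostname)

ip = resolve("localhost")   # real lookup on the first call
ip2 = resolve("localhost")  # served from cache, no network round trip
assert ip == ip2
```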
Symptom: Each request takes 200ms+ even for the same target.
Fix:
  • Use persistent connections (HTTP keep-alive)
  • In Python, use requests.Session() instead of standalone requests.get()
  • In Node.js, use an HTTP agent with keepAlive: true
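In Python, the keep-alive fix amounts to routing everything through one `requests.Session`, which pools and reuses TCP/TLS connections. A minimal sketch (the proxy address and credentials are placeholders for your own):

```python
# Sketch: a requests.Session reuses TCP/TLS connections (HTTP keep-alive),
# so repeated requests to the same host skip the handshake after the first.
# The proxy URL and credentials below are placeholders.
import requests

session = requests.Session()
session.proxies = {
    "http": "http://myuser:mypass@192.168.1.1:3128",
    "https": "http://myuser:mypass@192.168.1.1:3128",
}

def fetch(url: str) -> requests.Response:
    # Each call reuses the session's pooled connection; a standalone
    # requests.get() would open a fresh connection every time.
    return session.get(url, timeout=30)

# Example usage (needs a live proxy):
# for _ in range(5):
#     print(fetch("https://httpbin.org/ip").status_code)
```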
Symptom: Requests queue up and become slow under load.
Fix:
  • Distribute concurrent requests across multiple proxy IPs
  • Limit concurrency per proxy to 5–10 simultaneous connections
  • Use connection pooling in your HTTP client
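One way to combine the first two fixes is a per-proxy semaphore plus round-robin assignment. The sketch below only returns which proxy each task would use (the proxy URLs are placeholders, and the real HTTP call is left as a comment):

```python
# Sketch: cap in-flight requests per proxy with a semaphore, and spread
# work across the proxy pool round-robin. Proxy URLs are placeholders.
import threading
from concurrent.futures import ThreadPoolExecutor

PROXIES = [
    "http://myuser:mypass@192.168.1.1:3128",
    "http://myuser:mypass@192.168.1.2:3128",
]
MAX_PER_PROXY = 5  # the 5-10 range suggested above

# One semaphore per proxy limits simultaneous connections to it.
limits = {p: threading.Semaphore(MAX_PER_PROXY) for p in PROXIES}

def fetch(url: str, i: int) -> str:
    proxy = PROXIES[i % len(PROXIES)]  # round-robin across the pool
    with limits[proxy]:
        # A real worker would do:
        # requests.get(url, proxies={"http": proxy, "https": proxy}, timeout=30)
        return proxy

with ThreadPoolExecutor(max_workers=20) as pool:
    used = list(pool.map(fetch, ["https://httpbin.org/ip"] * 10, range(10)))
```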
Symptom: The same proxy is fast to httpbin.org but slow to your target.
Fix:
  • This is likely the target’s response time, not your proxy
  • Test by comparing curl to the target both with and without the proxy
  • If the target is throttling, add delays between requests
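The with/without comparison can also be scripted. A rough sketch using a hypothetical `timed_get` helper (the proxy URL is a placeholder; the two timed calls are commented out because they need a live proxy):

```python
# Sketch: time the same request with and without the proxy to see whether
# slowness comes from the proxy or from the target itself.
import time
import requests

PROXY = {"https": "http://myuser:mypass@192.168.1.1:3128"}  # placeholder

def timed_get(url: str, proxies=None) -> float:
    """Return wall-clock seconds for one GET, optionally through a proxy."""
    start = time.monotonic()
    requests.get(url, proxies=proxies, timeout=30)
    return time.monotonic() - start

# direct = timed_get("https://example.com/")          # no proxy
# proxied = timed_get("https://example.com/", PROXY)  # through the proxy
# If (proxied - direct) is small, the target's response time dominates.
```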
Symptom: Consistent 200ms+ latency.
Fix:
  • Our proxies are in Ashburn, VA — closest to US East Coast targets
  • Latency to West Coast or international targets will naturally be higher
  • For latency-sensitive workloads targeting US infrastructure, this location is optimal

Performance best practices

  1. Reuse connections — Use sessions/connection pooling instead of creating new connections per request
  2. Distribute load — Spread requests across your proxy pool rather than hammering a single IP
  3. Set timeouts — Use 10–30 second timeouts to avoid hanging on slow targets
  4. Limit concurrency — 5–10 concurrent connections per proxy IP is a safe default
  5. Compress responses — Send Accept-Encoding: gzip to reduce transfer sizes
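Practices 1, 3, 4, and 5 can be pulled together in a single session setup. The pool sizes are the illustrative defaults suggested above, and the proxy URL is a placeholder:

```python
# Sketch pulling the best practices together: one pooled session (1),
# explicit timeouts (3), capped pool size (4), gzip responses (5).
import requests
from requests.adapters import HTTPAdapter

session = requests.Session()
session.proxies = {
    "http": "http://myuser:mypass@192.168.1.1:3128",   # placeholder
    "https": "http://myuser:mypass@192.168.1.1:3128",  # placeholder
}
session.headers["Accept-Encoding"] = "gzip"  # smaller transfers

# Limit pooled connections per host to the safe 10-connection default.
adapter = HTTPAdapter(pool_connections=10, pool_maxsize=10)
session.mount("http://", adapter)
session.mount("https://", adapter)

def get(url: str) -> requests.Response:
    # (connect timeout, read timeout) so slow targets can't hang forever
    return session.get(url, timeout=(10, 30))
```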

Related articles

  • Connection Errors: Diagnose connectivity failures
  • Blocked Requests: Handle blocks and improve success rates