How to Diagnose the Cause of Low Success Rate: Step-by-Step Guide
Success rate is the percentage of successful requests out of the total number of attempts. When this metric drops below its normal level, you lose money, time, and data. But there can be dozens of causes, from a misconfiguration to blocks by the target server. This article breaks down a systematic approach to diagnosing the problem and finding a solution.
What is success rate and what is the norm
Success rate (SR) = (Successful requests / Total number of requests) × 100%
Normal values depend on the type of task:
| Task | Normal SR | Critical Level |
|---|---|---|
| Parsing public data | 95–99% | below 85% |
| SMM automation | 90–97% | below 80% |
| Ad verification | 98–99.5% | below 95% |
| API integration | 99–99.9% | below 98% |
If SR drops by 5–10% from your baseline, it's a signal to diagnose. If it drops by 20%+ — urgent action is needed.
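As a quick sanity check, the formula and thresholds above can be sketched in code (the function names are illustrative, not from any library):

```python
def success_rate(successful: int, total: int) -> float:
    """Success rate as a percentage of successful requests."""
    if total == 0:
        raise ValueError("no requests recorded")
    return successful / total * 100

def drop_severity(baseline_sr: float, current_sr: float) -> str:
    """Classify a drop from baseline per the thresholds above."""
    drop = baseline_sr - current_sr
    if drop >= 20:
        return "urgent"       # 20%+ drop: act immediately
    if drop >= 5:
        return "diagnose"     # 5-10%+ drop: start diagnostics
    return "ok"

print(success_rate(970, 1000))    # 97.0
print(drop_severity(97.0, 75.0))  # urgent
```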
First steps of diagnostics
Step 1: Check logs and metrics
Collect data for the last 24–72 hours:
- When exactly did SR drop? (exact time)
- What percentage of requests return error 407 (Proxy Authentication Required)?
- What percentage — 429 (Too Many Requests)?
- What percentage — timeouts (connection timeout)?
- Did the load change? (RPS — requests per second)?
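Answering these questions boils down to counting status codes in your logs. A minimal sketch, assuming your logs can be reduced to (timestamp, status-or-error) pairs; the records below are made up:

```python
from collections import Counter

# Hypothetical log records: (timestamp, status code or error label)
log = [
    ("2024-01-10T12:00:01", 200),
    ("2024-01-10T12:00:02", 429),
    ("2024-01-10T12:00:03", "timeout"),
    ("2024-01-10T12:00:04", 407),
    ("2024-01-10T12:00:05", 200),
]

counts = Counter(status for _, status in log)
total = len(log)
for status, n in counts.most_common():
    print(f"{status}: {n} ({n / total * 100:.0f}%)")
```

Whichever code dominates the output points you to the matching section below.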
Step 2: Test in isolation
Use a simple script to test the proxy without your application:
```python
import requests
import time

proxy = "http://proxy_ip:port"  # replace with your proxy
proxies = {"http": proxy, "https": proxy}
target_url = "https://httpbin.org/ip"

success = 0
failed = 0

for i in range(100):
    try:
        response = requests.get(
            target_url,
            proxies=proxies,
            timeout=10,
        )
        if response.status_code == 200:
            success += 1
            print(f"✓ Attempt {i+1}: success")
        else:
            failed += 1
            print(f"✗ Attempt {i+1}: status {response.status_code}")
    except Exception as e:
        failed += 1
        print(f"✗ Attempt {i+1}: {e}")
    time.sleep(0.5)

sr = (success / (success + failed)) * 100
print(f"\nSuccess Rate: {sr:.1f}%")
print(f"Successful: {success}, Errors: {failed}")
```
If this test shows normal SR — the problem is in your code or configuration. If SR is low even here — the problem is in the proxy or target server.
Issues on the proxy side
Error 407: Proxy Authentication Required
Causes:
- Incorrect credentials (login/password)
- Account expired
- IP address not in whitelist (if required)
- IP rotation not working or disabled
Solution:
```python
import requests

# Correct format for residential proxies
proxy = "http://login:password@proxy-host:port"
proxies = {"http": proxy, "https": proxy}

# Test
response = requests.get("https://httpbin.org/ip", proxies=proxies, timeout=10)
print(response.text)
```
Proxy server overload
Your plan may have an RPS (requests per second) limit, and shared proxy infrastructure can slow down when many users of the service send large volumes of requests at once. This is rare, but it happens.
Check:
- Your current RPS at peak
- Limits of your plan
- Are there 429 errors in the logs
Solution: add delay between requests or upgrade your plan.
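One way to add that delay systematically is a small client-side rate limiter; the 5 RPS cap below is a hypothetical plan limit, substitute your own:

```python
import time

class RateLimiter:
    """Cap outgoing requests at max_rps by sleeping between calls."""
    def __init__(self, max_rps: float):
        self.min_interval = 1.0 / max_rps
        self.last_call = 0.0

    def wait(self):
        elapsed = time.monotonic() - self.last_call
        if elapsed < self.min_interval:
            time.sleep(self.min_interval - elapsed)
        self.last_call = time.monotonic()

limiter = RateLimiter(max_rps=5)  # hypothetical 5 RPS plan limit
start = time.monotonic()
for _ in range(3):
    limiter.wait()
    # requests.get(...) would go here
elapsed = time.monotonic() - start
print(f"3 calls took {elapsed:.2f} s")  # roughly 0.4 s at 5 RPS
```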
IP address quality
With residential proxies, a low SR may mean the rotation is handing you addresses that are already blocked. Check:
- What percentage of IP addresses return 403 Forbidden?
- Do the same addresses repeat?
- Is there a pattern — one country/region works, another doesn't?
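These checks can be automated over your request results; the (exit IP, status code) pairs below are invented for illustration:

```python
from collections import Counter

# Hypothetical per-request results: (exit IP, status code)
results = [
    ("203.0.113.5", 200), ("203.0.113.7", 403),
    ("203.0.113.5", 200), ("203.0.113.9", 403),
    ("203.0.113.7", 403), ("203.0.113.5", 200),
]

ips_seen = Counter(ip for ip, _ in results)
blocked = {ip for ip, code in results if code == 403}
share_403 = sum(1 for _, code in results if code == 403) / len(results) * 100
repeats = [ip for ip, n in ips_seen.items() if n > 1]

print(f"403 share: {share_403:.0f}%")        # 403 share: 50%
print(f"Blocked IPs: {sorted(blocked)}")
print(f"Repeating IPs: {repeats}")
```

A high 403 share concentrated in a few repeating IPs suggests a stale rotation pool rather than a problem in your code.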
Blocks and filters of the target server
Error 429: Too Many Requests
The target server sees too many requests from one IP or in general. Solutions:
- Add delay: `time.sleep(random.uniform(1, 3))`
- Use IP rotation: each request — new IP
- Lower RPS: send requests sequentially, not in parallel
- Add realistic headers: User-Agent, Referer, Accept-Language
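The first two fixes can be combined into a retry loop with exponential backoff and jitter; `fetch` here is a placeholder for your actual HTTP call, not a real API:

```python
import random
import time

def backoff_delay(attempt: int, base: float = 1.0, cap: float = 30.0) -> float:
    """Exponential backoff with jitter: base * 2^attempt, capped, then jittered."""
    return min(cap, base * 2 ** attempt) * random.uniform(0.5, 1.5)

def fetch_with_backoff(fetch, max_attempts: int = 4):
    """Retry a request while it returns 429, sleeping longer each time."""
    for attempt in range(max_attempts):
        status = fetch()  # placeholder for your HTTP call, returns a status code
        if status != 429:
            return status
        time.sleep(backoff_delay(attempt))
    return 429
```

Rotating the IP on each retry (if your provider supports per-request rotation) stacks well with this: the backoff spreads the load in time, the rotation spreads it across addresses.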
Error 403 Forbidden
The server blocked your IP (or proxy IP). This could be:
- Geolocation filter
- Proxy service blacklist
- Bot detector (JavaScript, CAPTCHA)
Solution: use mobile proxies or residential proxies with rotation. They are harder to block.
Error 403: User-Agent Check
Some services reject requests with suspicious User-Agent:
```python
import random

import requests

user_agents = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36",
    "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36",
    "Mozilla/5.0 (iPhone; CPU iPhone OS 15_0 like Mac OS X) AppleWebKit/605.1.15",
]

proxy = "http://login:password@proxy-host:port"  # replace with your proxy
headers = {
    "User-Agent": random.choice(user_agents),
    "Accept-Language": "en-US,en;q=0.9",
    "Accept": "text/html,application/xhtml+xml",
    "Referer": "https://google.com",
}

response = requests.get(
    "https://target-site.com",
    headers=headers,
    proxies={"http": proxy, "https": proxy},
    timeout=10,
)
print(response.status_code)
```
Errors in client code
Incorrect exception handling
A common mistake: the code counts a connection error as a failed request but never retries it. Use a session with a retry strategy instead:

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# Correct way: a session with automatic retries
session = requests.Session()
retry_strategy = Retry(
    total=3,
    backoff_factor=1,
    status_forcelist=[429, 500, 502, 503, 504],
    allowed_methods=["GET", "POST"],
)
adapter = HTTPAdapter(max_retries=retry_strategy)
session.mount("http://", adapter)
session.mount("https://", adapter)

url = "https://target-site.com"  # replace with your target
proxies = {"http": "http://login:password@proxy-host:port"}  # and your proxy

try:
    response = session.get(url, proxies=proxies, timeout=10)
except requests.exceptions.RequestException as e:
    print(f"Error: {e}")
    # Log the failure and move on to the next request
```
Incorrect timeouts
If timeout is too short (1–2 seconds), slow proxies will be rejected:
- For regular parsing: 10–30 seconds
- For mobile proxies: 15–45 seconds
- For API: 5–10 seconds
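These ranges could be kept in one place as a lookup. Splitting each range into a (connect, read) pair, as below, is an assumption for illustration, not a rule from the text:

```python
# Suggested (connect, read) timeouts per scenario, in seconds.
# The connect/read split is an assumption; tune it to your setup.
TIMEOUTS = {
    "parsing": (10, 30),
    "mobile_proxy": (15, 45),
    "api": (5, 10),
}

def timeout_for(task: str) -> tuple:
    """Return a timeout pair, defaulting to the parsing profile."""
    return TIMEOUTS.get(task, (10, 30))

print(timeout_for("api"))  # (5, 10)
```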
SSL/TLS errors
If you use `verify=False`, it can mask real issues. It's better to verify against an up-to-date certificate bundle:

```python
import certifi
import requests

url = "https://target-site.com"  # replace with your target
proxies = {"http": "http://login:password@proxy-host:port"}

# Correct
response = requests.get(
    url,
    proxies=proxies,
    verify=certifi.where(),  # instead of verify=False
    timeout=15,
)
```
Network issues and timeouts
Connection timeout vs Read timeout
The difference is important:
- Connection timeout: proxy not responding (proxy or network issue)
- Read timeout: target server sending data slowly (target server issue)
```python
import requests

url = "https://target-site.com"  # replace with your target
proxies = {"http": "http://login:password@proxy-host:port"}

# timeout = (connection_timeout, read_timeout)
try:
    response = requests.get(
        url,
        proxies=proxies,
        timeout=(5, 15),  # 5 sec to connect, 15 to read
    )
except requests.exceptions.ConnectTimeout:
    print("Proxy not responding")
except requests.exceptions.ReadTimeout:
    print("Target server is slow")
```
DNS issues
If the target server doesn't resolve, it's not a proxy error:
```python
import socket

# Check DNS outside the proxy
try:
    ip = socket.gethostbyname("target-site.com")
    print(f"Resolves to: {ip}")
except socket.gaierror:
    print("DNS error — site not found")
```
Diagnostics checklist for low SR
- Establish baseline: what was normal SR before?
- Run isolated test (script above) with 100 requests
- Check logs: which HTTP codes dominate? (407, 429, 403, timeouts?)
- If 407: check login/password and IP whitelist
- If 429: add delay between requests, use IP rotation
- If 403: check User-Agent, Referer, add realistic headers
- If timeouts: increase timeout, check RPS, use retry logic
- Check your code: correct exception handling, correct timeouts
- Check target server: is it accessible directly (without proxy)?
- If nothing else helps: try a different proxy type or different provider
Quick diagnostics table
| HTTP Code | Probable Cause | Solution |
|---|---|---|
| 407 | Incorrect proxy credentials | Check login/password, IP whitelist |
| 429 | Too many requests | Add delay, use IP rotation |
| 403 | IP blocked or bot detector | Add realistic headers, use mobile proxies |
| Timeout | Slow proxy or overloaded target server | Increase timeout, check RPS |
| Connection refused | Proxy server unavailable | Check IP:port, proxy status |
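The table above can be turned into a small lookup, for example to attach advice to automated alerts (the wording is illustrative):

```python
# The quick-diagnostics table as a lookup: symptom -> suggested fix
DIAGNOSIS = {
    407: "Check login/password and IP whitelist",
    429: "Add delay between requests, use IP rotation",
    403: "Add realistic headers, try mobile proxies",
    "timeout": "Increase timeout, check RPS",
    "connection_refused": "Check IP:port and proxy status",
}

def advise(symptom):
    """Return the suggested fix for a symptom, or a generic next step."""
    return DIAGNOSIS.get(symptom, "Run the isolated test to narrow it down")

print(advise(429))  # Add delay between requests, use IP rotation
```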
Summary
Low success rate is a symptom, not a disease. There can be many causes: from a typo in code to blocking by the target server. Systematic diagnostics is the key to solving it:
- Check metrics and logs
- Isolate the problem (proxy vs target server vs your code)
- Determine the type of error (407, 429, 403, timeout)
- Apply the appropriate solution
For tasks requiring reliability and high SR, residential proxies with IP rotation are recommended. They are harder to detect and more stable. Try a free test on proxycove.com and test it on your task.