
How to Bypass PerimeterX and Akamai Detection

We analyze the mechanisms of PerimeterX and Akamai, study their detection methods, and create an effective bypass strategy using proxies, browser fingerprints, and behavior emulation.

📅 December 23, 2025

Bypassing PerimeterX and Akamai Protection: Practical Anti-detect Methods

PerimeterX and Akamai Bot Manager are two of the most advanced bot protection solutions used by major e-commerce platforms, financial services, and corporate websites. These systems analyze hundreds of parameters related to the browser, user behavior, and network characteristics, creating a multi-layered defense that cannot be bypassed simply by changing the IP address.

In this guide, we will thoroughly examine the architecture of both systems, explore their detection methods, and create a comprehensive bypass strategy based on real cases and technical experiments.

Architecture of PerimeterX and Akamai: How Detection Works

PerimeterX (now HUMAN Security) and Akamai Bot Manager function as multi-layered protection systems, integrating at various stages of request processing. Understanding their architecture is critically important for developing a bypass strategy.

Architecture of PerimeterX

PerimeterX operates in three stages. In the first stage, a JavaScript sensor is embedded in the HTML page and executed in the client's browser, collecting data about the execution environment: WebGL fingerprint, Canvas fingerprint, audio context, available fonts, plugins, screen resolution, and many other parameters. This sensor is obfuscated and regularly updated, making its analysis difficult.
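
A simplified sketch of the kind of environment data such a sensor collects (illustrative only; the real sensor is obfuscated and gathers far more):

// Illustrative sensor-style environment collection, not PerimeterX's actual code
const envData = {
  userAgent: navigator.userAgent,
  languages: navigator.languages,
  screen: [screen.width, screen.height, screen.colorDepth],
  timezoneOffset: new Date().getTimezoneOffset(),
  plugins: [...navigator.plugins].map(p => p.name),
  maxTouchPoints: navigator.maxTouchPoints,
  hardwareConcurrency: navigator.hardwareConcurrency,
  deviceMemory: navigator.deviceMemory // undefined in Firefox
};
// The real sensor additionally hashes Canvas/WebGL/audio output and
// submits everything to the vendor's endpoint in obfuscated form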

In the second stage, the server component of PerimeterX analyzes HTTP headers, TLS fingerprints, IP reputation, and network characteristics even before the request reaches the main application. The system uses its own database of known bots and suspicious IP addresses, updated in real-time.

The third stage involves behavioral analysis. PerimeterX tracks mouse movements, scroll speed, click patterns, time between actions, and builds a behavior profile. Machine learning models compare this profile with patterns of real users and known bots.

Architecture of Akamai Bot Manager

Akamai Bot Manager integrates at the CDN level, giving it an advantage in analysis speed. The system uses its own BMP (Bot Manager Premier) technology, which analyzes requests at Akamai's edge servers before they are forwarded to the origin server.

A key difference for Akamai is the use of telemetry from millions of websites protected by their CDN. This allows the system to quickly identify new types of bots and update detection rules globally. Akamai also uses a Web SDK technology similar to PerimeterX's sensor but focuses on cryptographic client integrity verification.

Important: Both systems use cookies with encrypted data about the results of checks. These cookies cannot be forged without knowledge of the server key, so simply copying cookies between sessions does not work.

Detection Methods: What Protection Systems Analyze

Modern anti-bot systems analyze requests based on hundreds of parameters grouped into several categories. Understanding each category allows for systematic elimination of automation markers.

HTTP Header and TLS Analysis

The order of HTTP headers is one of the simplest ways to detect automation. Browsers send headers in a strictly defined order, which varies between versions and manufacturers. Libraries like requests in Python or axios in Node.js use their own order, instantly revealing automation.
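
For comparison, the default orders differ visibly; the lists below are approximate and vary by version:

// Chrome (HTTP/1.1), roughly: Host, Connection, sec-ch-ua*, User-Agent, Accept, Accept-Encoding, Accept-Language
// python-requests defaults: User-Agent, Accept-Encoding, Accept, Connection
// The mismatch alone is enough to flag a session for closer inspection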

The TLS fingerprint (JA3 fingerprint) is created from the parameters of the TLS handshake: TLS version, list of supported ciphers, extensions, and their order. Each combination of browser and operating system creates a unique fingerprint. For example, Chrome 120 on Windows 11 has a fingerprint different from Chrome 120 on macOS or from Firefox on the same system.

// Example of mismatch between User-Agent and TLS fingerprint
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/120.0.0.0
TLS Fingerprint: JA3 hash corresponds to Python requests
// Result: instant block

JavaScript Execution Environment

Headless browsers leave dozens of traces in the JavaScript environment. Protection systems check properties such as navigator.webdriver, the presence (or absence) of the window.chrome object, inconsistencies in navigator.plugins, and anomalies in WebGL and Canvas rendering.

PerimeterX and Akamai use advanced verification techniques: they measure the execution time of JavaScript functions (which differs in headless browsers), check for automation artifacts in the DOM, and analyze the call stack of functions. The systems also check data consistency: for example, if the User-Agent indicates a mobile device but navigator.maxTouchPoints equals 0, this triggers detection.
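
A simplified sketch of such consistency checks (illustrative, not the vendors' actual logic):

// Illustrative consistency checks in the spirit of the sensors' logic
function automationSignals() {
  const flags = [];
  if (navigator.webdriver) flags.push('navigator.webdriver is set');
  if (/Chrome/.test(navigator.userAgent) && !window.chrome) {
    flags.push('Chrome User-Agent without window.chrome');
  }
  if (/Mobile/.test(navigator.userAgent) && navigator.maxTouchPoints === 0) {
    flags.push('mobile User-Agent on a non-touch device');
  }
  if (navigator.plugins.length === 0) flags.push('empty plugin list');
  return flags; // a real system scores these rather than hard-blocking
}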

Network Characteristics and IP Reputation

Protection systems check the IP address based on numerous parameters: affiliation with known proxy providers, presence on blacklists, activity history from that IP, consistency of geolocation with other request parameters (browser language, time zone).

Special attention is given to IP usage patterns. If requests from one address come with different User-Agents or browser fingerprints, this is a strong signal of automation. Similarly, if the IP changes too frequently within a single session (aggressive proxy rotation), this triggers a block.

| Detection Parameter | PerimeterX | Akamai |
| --- | --- | --- |
| TLS fingerprinting | High priority | Critical priority |
| Canvas fingerprint | Medium priority | High priority |
| Behavioral analysis | Critical priority | High priority |
| IP reputation | High priority | High priority |
| HTTP/2 fingerprint | Medium priority | Critical priority |

Browser Fingerprinting and TLS Fingerprints

Browser fingerprinting is a technique for creating a unique identifier for a browser based on its characteristics. Even without cookies, protection systems can track users and identify anomalies.

Canvas and WebGL Fingerprinting

Canvas fingerprinting works by rendering an invisible image with text and graphics. Due to differences in graphics drivers, fonts, and anti-aliasing settings, each system creates a slightly different image. The hash of this image becomes part of the fingerprint.

WebGL fingerprinting uses 3D rendering to create an even more unique fingerprint. The system requests information about the GPU, supported extensions, maximum texture sizes, and other parameters. The combination of this data creates a fingerprint with sufficient entropy for device identification.

// Example of obtaining a WebGL fingerprint
const canvas = document.createElement('canvas');
const gl = canvas.getContext('webgl');
// The debug extension may be unavailable (e.g., disabled in some browsers)
const debugInfo = gl && gl.getExtension('WEBGL_debug_renderer_info');
const vendor = debugInfo && gl.getParameter(debugInfo.UNMASKED_VENDOR_WEBGL);
const renderer = debugInfo && gl.getParameter(debugInfo.UNMASKED_RENDERER_WEBGL);

// Result may be: "Google Inc. (NVIDIA)" + "ANGLE (NVIDIA GeForce RTX 3080)"
// A near-unique combination for each device

Audio Context and Fonts

The Audio Context API allows for creating a unique fingerprint based on sound processing. Differences in the audio stack of the operating system lead to microscopic differences in audio signal processing that can be measured and used for identification.

The list of installed fonts is also unique to each system. Protection systems use the technique of measuring text sizes with different fonts: if a font is not installed, the browser uses a fallback, which changes the sizes. Checking hundreds of fonts creates a unique signature.
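
A minimal sketch of the measurement technique (assumed baseline fonts; real implementations test hundreds of font names):

// Detect an installed font by comparing text width against fallback fonts
function detectFont(fontName) {
  const baseFonts = ['monospace', 'serif', 'sans-serif'];
  const testString = 'mmmmmmmmmmlli';
  const ctx = document.createElement('canvas').getContext('2d');

  return baseFonts.some(base => {
    ctx.font = `72px ${base}`;
    const fallbackWidth = ctx.measureText(testString).width;
    ctx.font = `72px "${fontName}", ${base}`;
    // If fontName is installed, the measured width differs from the fallback
    return ctx.measureText(testString).width !== fallbackWidth;
  });
}

// detectFont('Calibri') is typically true on Windows and false elsewhere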

TLS and HTTP/2 Fingerprinting

The JA3 fingerprint is created from the parameters of the TLS Client Hello: SSL/TLS version, list of cipher suites, list of extensions, list of supported elliptic curve groups, and formats of elliptic curve points. These parameters are concatenated and hashed, creating a unique string.
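
For illustration, the raw JA3 string before hashing looks like this (the values are an example, not a real browser's fingerprint):

// Format: TLSVersion,CipherSuites,Extensions,EllipticCurves,ECPointFormats
771,4865-4866-4867-49195-49199,0-23-65281-10-11-35,29-23-24,0
// JA3 = MD5 of this string; change any field and the hash changes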

The HTTP/2 fingerprint analyzes the SETTINGS frame parameters, the order and priorities of streams, and window update values. Each browser uses unique HTTP/2 settings, allowing for client identification even with a correct TLS fingerprint.
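
One common notation concatenates the SETTINGS values, the connection window update, priority data, and pseudo-header order; the Chrome-like values below are approximate:

// SETTINGS | WINDOW_UPDATE | PRIORITY | pseudo-header order
1:65536;2:0;4:6291456;6:262144|15663105|0|m,a,s,p
// m,a,s,p = :method, :authority, :scheme, :path; Firefox, e.g., sends m,p,a,s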

Practical Tip: To bypass fingerprinting, it is essential to ensure consistency across all parameters. Using a Chrome User-Agent with a Firefox TLS fingerprint is instantly detectable. Tools like curl-impersonate or libraries like tls-client help create a fully consistent fingerprint.

Behavioral Analysis and Machine Learning

Behavioral analysis is the most challenging aspect to bypass in modern anti-bot systems. Even with a perfect technical fingerprint, non-human behavior will reveal automation.

Mouse Movements and Interactions Analysis

PerimeterX and Akamai track mouse movement trajectories, acceleration and deceleration, and the micro-movements characteristic of a human hand. Bots typically move the cursor in straight lines or do not generate mouse events at all. The systems also analyze reaction time: clicking immediately after the page loads without any mouse movement looks suspicious.

Scrolling patterns are also unique. A human scrolls a page unevenly: quickly at first, slows down to read, sometimes scrolls back. Bots usually scroll at a constant speed or use window.scrollTo() for instant scrolling.

Timing Patterns and Action Speed

The time between actions is a critical parameter. A human cannot fill out a 10-field form in 0.5 seconds or click on 50 links in a minute. Protection systems build a speed profile for each type of action and compare it with user behavior.

Special attention is paid to the consistency of delays. If exactly 2 seconds pass between each click, this is an obvious sign of sleep(2000) in the code. Human delays have natural variability and follow certain statistical distributions.

Machine Learning Models

Both systems use ML models trained on millions of sessions from real users and known bots. The models analyze hundreds of features simultaneously: action sequences, site browsing depth, navigation patterns, and interaction with elements.

PerimeterX uses an ensemble of models with different weights for different types of sites. The model for e-commerce focuses on shopping patterns, while the model for media sites focuses on content reading patterns. This makes bypassing more complex, as it requires adaptation to the specifics of each site.

// Example of human-like delays with variability
function humanDelay(baseMs) {
  // Log-normal distribution instead of uniform: Box-Muller produces a
  // standard normal, and exp() skews it the way human pauses are skewed
  const u = 1 - Math.random(); // avoid log(0)
  const v = Math.random();
  const gaussian = Math.sqrt(-2 * Math.log(u)) * Math.cos(2 * Math.PI * v);
  const delay = baseMs * Math.exp(gaussian * 0.3);
  // Add micro-delays characteristic of browser event processing
  const microDelay = Math.random() * 50;
  return Math.max(100, delay + microDelay);
}

// Usage: await new Promise(r => setTimeout(r, humanDelay(2000)));

Proxy Selection and Rotation Strategy

Choosing the type of proxy and rotation strategy is critically important when working with PerimeterX and Akamai. Incorrect proxy configuration can nullify all efforts to mask the browser fingerprint.

Residential vs Mobile vs Datacenter Proxies

Datacenter proxies have the lowest cost but also the highest risk of detection. PerimeterX and Akamai maintain databases of datacenter IP addresses and automatically increase the level of scrutiny for such requests. Using datacenter proxies is only possible for low-priority tasks or in combination with a very high-quality browser fingerprint.

Residential proxies use IP addresses from real internet service providers, significantly reducing the likelihood of detection. However, the quality of residential proxies varies greatly. It is important to choose providers with clean IP pools, where addresses have not been previously used for spam or other suspicious activities.

Mobile proxies provide the highest level of trust, as they use IP addresses from mobile operators. These addresses are typically shared among many users (carrier-grade NAT), making blocking difficult. Mobile proxies are particularly effective against Akamai, which is more cautious about blocking mobile traffic.

Rotation Strategies

Aggressive rotation (changing IP for each request) is a common mistake. This creates a suspicious pattern: one user cannot physically change their IP address every few seconds. Session rotation, where one IP is used for an entire user session (10-30 minutes of activity), is more effective.

For long operations, sticky sessions lasting 30-60 minutes are recommended. This simulates the behavior of a real user who stays on one IP during a session. It is also important not to use one IP for too long: sessions lasting several hours also look suspicious.
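
A minimal sketch of session-based rotation; the username suffix convention (session-<id>) is a common provider pattern but varies, so treat the format as an assumption:

// Hypothetical sticky-session helper; adapt the username format to your provider
function makeStickySession(user, pass, host, port) {
  const sessionId = Math.random().toString(36).slice(2, 10);
  const startedAt = Date.now();
  const ttlMs = (30 + Math.random() * 30) * 60 * 1000; // 30-60 minutes
  return {
    proxyUrl: `http://${user}-session-${sessionId}:${pass}@${host}:${port}`,
    expired: () => Date.now() - startedAt > ttlMs
  };
}

// Rotate by creating a new session object (new IP) only when expired() is true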

Geographical Consistency

It is critically important to ensure consistency between the geolocation of the IP address and other parameters: browser language, time zone, locale settings. If the IP address is from Germany, but navigator.language returns "en-US" and the time zone is "America/New_York", this is an instant detection trigger.

When working across multiple geographical regions, use separate browser profiles for each region. Switching between regions within a single session (IP from France, then from Japan) is impossible for a real user and is immediately detected.
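
A sketch of tying browser parameters to the proxy's country in Playwright (the profile values and proxy address are illustrative):

// Illustrative country profiles; extend as needed
const geoProfiles = {
  DE: { locale: 'de-DE', timezoneId: 'Europe/Berlin' },
  FR: { locale: 'fr-FR', timezoneId: 'Europe/Paris' },
  US: { locale: 'en-US', timezoneId: 'America/New_York' }
};

// proxyCountry must come from your proxy metadata, e.g. 'DE'
const context = await browser.newContext({
  ...geoProfiles[proxyCountry],
  proxy: { server: 'http://proxy-host:port' } // hypothetical address
});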

| Proxy Type | Effectiveness Against PerimeterX | Effectiveness Against Akamai | Recommendations |
| --- | --- | --- | --- |
| Datacenter | Low (30-40%) | Very low (20-30%) | Only for testing |
| Residential | High (75-85%) | Medium (65-75%) | Main choice for most tasks |
| Mobile | Very high (85-95%) | High (80-90%) | For critical tasks and highly protected sites |

Configuring Anti-detect Browsers and Tools

Proper configuration of automation tools is a key factor for successfully bypassing PerimeterX and Akamai. Even the best proxies will not help if the browser fingerprint contains obvious automation markers.

Playwright and Puppeteer: Advanced Configuration

Out of the box, Playwright and Puppeteer are trivially identifiable as automated browsers. Stealth plugins and additional configuration are required to mask automation. The puppeteer-extra-plugin-stealth library hides the main markers but still needs extra setup.

// Advanced Playwright configuration with anti-detect
const { chromium } = require('playwright-extra');
const stealth = require('puppeteer-extra-plugin-stealth')();

chromium.use(stealth);

const browser = await chromium.launch({
  headless: false, // Headless mode is easily detected
  args: [
    '--disable-blink-features=AutomationControlled',
    '--disable-features=IsolateOrigins,site-per-process',
    '--disable-site-isolation-trials',
    '--no-sandbox',
    '--disable-setuid-sandbox',
    '--disable-dev-shm-usage',
    '--disable-accelerated-2d-canvas',
    '--disable-gpu',
    '--window-size=1920,1080',
    '--user-agent=Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36'
  ]
});

const context = await browser.newContext({
  viewport: { width: 1920, height: 1080 },
  locale: 'en-US',
  timezoneId: 'America/New_York',
  permissions: ['geolocation', 'notifications'],
  geolocation: { latitude: 40.7128, longitude: -74.0060 }
});

Selenium with Undetected-Chromedriver

The standard Selenium WebDriver is easily detected through the navigator.webdriver property. The undetected-chromedriver library automatically patches ChromeDriver, removing the main automation markers and is regularly updated to bypass new detection methods.

import undetected_chromedriver as uc
from selenium.webdriver.chrome.options import Options

options = Options()
options.add_argument('--disable-blink-features=AutomationControlled')
options.add_argument('--disable-dev-shm-usage')
options.add_argument('--no-sandbox')
options.add_argument('--window-size=1920,1080')

# Using a specific version of Chrome for consistency
driver = uc.Chrome(options=options, version_main=120)

# Additional masking via CDP
driver.execute_cdp_cmd('Network.setUserAgentOverride', {
    "userAgent": 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36'
})

# Patch navigator.webdriver before any page script runs;
# execute_script alone would only affect the currently loaded page
driver.execute_cdp_cmd('Page.addScriptToEvaluateOnNewDocument', {
    'source': "Object.defineProperty(navigator, 'webdriver', {get: () => undefined})"
})

Anti-detect Browsers: AdsPower, Multilogin, GoLogin

Commercial anti-detect browsers provide ready-made solutions for managing fingerprints. AdsPower and Multilogin allow you to create profiles with unique Canvas, WebGL, audio fingerprints and manage them via API. These tools are especially useful when working with multiple accounts.

A key advantage is the ability to maintain a consistent fingerprint between sessions. Each profile has fixed parameters for Canvas, WebGL, fonts, which is critical for long-term operations. It is important to use realistic configurations: generating random fingerprints can create technically impossible combinations that are easily detected.
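
For example, AdsPower exposes a local HTTP API for launching profiles; the sketch below follows its documented local API, but verify the host, port, and response shape against your version:

// Start an AdsPower profile and attach Playwright over CDP
// (endpoint and response fields are assumptions based on AdsPower's docs)
const { chromium } = require('playwright');

const res = await fetch('http://local.adspower.net:50325/api/v1/browser/start?user_id=PROFILE_ID');
const { data } = await res.json();

// data.ws.puppeteer is the DevTools websocket of the launched profile
const browser = await chromium.connectOverCDP(data.ws.puppeteer);
const page = browser.contexts()[0].pages()[0];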

HTTP Clients with Correct Fingerprints

For tasks that do not require JavaScript rendering, HTTP clients with correct TLS and HTTP/2 fingerprints are more effective. Libraries like curl-impersonate (for Python β€” curl_cffi) and tls-client allow you to mimic the TLS fingerprints of real browsers.

from curl_cffi import requests

# Mimicking Chrome 120 with correct TLS and HTTP/2 fingerprint
response = requests.get(
    'https://example.com',
    impersonate="chrome120",
    proxies={
        "http": "http://user:pass@proxy:port",
        "https": "http://user:pass@proxy:port"
    },
    headers={
        'Accept-Language': 'en-US,en;q=0.9',
        'Accept-Encoding': 'gzip, deflate, br',
        'sec-ch-ua': '"Not_A Brand";v="8", "Chromium";v="120"',
        'sec-ch-ua-mobile': '?0',
        'sec-ch-ua-platform': '"Windows"'
    }
)

# TLS fingerprint automatically matches Chrome 120

Automation Techniques Without Detection Triggers

Even with a perfect technical fingerprint, automation patterns can reveal a bot. It is necessary to imitate human behavior at the level of interaction with the site.

Emulating Mouse Movements

Moving the mouse in a straight line from point A to point B is a clear sign of automation. A human hand creates smooth curves with micro-corrections. Tools like pyautogui provide easing functions for smoother movement, and Bézier curves, as in the sketch below, produce realistic trajectories.

// Generating a human-like mouse trajectory along a quadratic Bézier curve
async function humanMouseMove(page, targetX, targetY) {
  // Assumes a previously injected mousemove listener that stores the last
  // cursor position in window.mouseX / window.mouseY (not standard properties)
  const current = await page.evaluate(() => ({
    x: window.mouseX || 0,
    y: window.mouseY || 0
  }));
  
  const steps = 25 + Math.floor(Math.random() * 15);
  const points = generateBezierCurve(current.x, current.y, targetX, targetY, steps);
  
  for (let point of points) {
    await page.mouse.move(point.x, point.y);
    await new Promise(r => setTimeout(r, 10 + Math.random() * 20));
  }
  
  // Micro-corrections before clicking
  await page.mouse.move(targetX + (Math.random() - 0.5) * 2, 
                        targetY + (Math.random() - 0.5) * 2);
}

function generateBezierCurve(x1, y1, x2, y2, steps) {
  // One randomized control point yields a quadratic Bézier curve
  const cp1x = x1 + (x2 - x1) * (0.3 + Math.random() * 0.2);
  const cp1y = y1 + (y2 - y1) * (0.3 + Math.random() * 0.2);
  const points = [];
  
  for (let i = 0; i <= steps; i++) {
    const t = i / steps;
    const x = Math.pow(1-t, 2) * x1 + 2 * (1-t) * t * cp1x + Math.pow(t, 2) * x2;
    const y = Math.pow(1-t, 2) * y1 + 2 * (1-t) * t * cp1y + Math.pow(t, 2) * y2;
    points.push({x: Math.round(x), y: Math.round(y)});
  }
  return points;
}

Realistic Scrolling and Content Reading

A person scrolls a page to read content, stopping at interesting sections. A bot typically scrolls to the bottom of the page or to the desired element as quickly as possible. Imitating reading requires analyzing content and creating realistic pauses.

async function humanScroll(page, targetElement) {
  const elementPosition = await page.evaluate(el => {
    const rect = el.getBoundingClientRect();
    return rect.top + window.pageYOffset;
  }, targetElement);
  
  const currentScroll = await page.evaluate(() => window.pageYOffset);
  const distance = elementPosition - currentScroll;
  const scrollSteps = Math.floor(Math.abs(distance) / 100);
  
  for (let i = 0; i < scrollSteps; i++) {
    const scrollAmount = (distance / scrollSteps) * (0.8 + Math.random() * 0.4);
    await page.evaluate((amount) => {
      window.scrollBy({top: amount, behavior: 'smooth'});
    }, scrollAmount);
    
    // Random pauses for "reading"
    if (Math.random() > 0.7) {
      await new Promise(r => setTimeout(r, 1000 + Math.random() * 2000));
    } else {
      await new Promise(r => setTimeout(r, 200 + Math.random() * 400));
    }
  }
}

Natural Navigation Patterns

Users do not go directly to the target page: they interact with the site naturally. Start from the homepage, visit several sections, use search or navigation menus. This creates an interaction history that increases trust with protection systems.

It is also important to imitate mistakes and corrections: a person may click the wrong link and go back, or make a mistake when entering a search query and correct the typo. A perfectly direct path to the target looks suspicious.

Managing Cookies and Storage

PerimeterX and Akamai use cookies and localStorage to track sessions. Completely clearing cookies between requests looks suspicious: a real browser retains some cookies (analytics, settings). Retain cookies between sessions for one "user," but use different sets of cookies for different profiles.
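
In Playwright, per-profile persistence of cookies and localStorage is a one-liner in each direction (the file path is illustrative):

// Load a saved profile state, browse, then save it back for the next run
const context = await browser.newContext({
  storageState: 'profiles/user-042.json' // must exist from a previous run
});
const page = await context.newPage();
// ... normal browsing with human-like behavior ...
await context.storageState({ path: 'profiles/user-042.json' }); // persists _px/_abck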

Important: Protection systems analyze the age of cookies. If a protective cookie (_px, _abck) has only just appeared but the user behaves like a regular returning visitor, this inconsistency is detected. For long-term operations, "warm up" profiles by building a history of visits.

Practical Cases and Solutions to Common Problems

Let's consider specific scenarios for bypassing PerimeterX and Akamai with solutions to common problems that arise in the process.

Case 1: Parsing E-commerce with PerimeterX

Task: Extracting product data from a large online store protected by PerimeterX. The site blocks after 3-5 requests even from different IPs.

Solution: Use a combination of residential proxies with sticky sessions (30 minutes) and Playwright with full behavior emulation. Key points: start from the homepage, use search or categories for navigation, add random delays of 3-7 seconds between requests, imitate scrolling and mouse movements. It is critical to retain _px cookies between requests within the same session.

// Example of a session with warming up
// (menuX/menuY and extractProductData are placeholders for your own code)
async function scrapeWithWarmup(page, targetUrls) {
  // Warm up the profile
  await page.goto('https://example.com');
  await humanScroll(page, await page.$('footer'));
  await new Promise(r => setTimeout(r, 3000 + Math.random() * 2000));
  
  // Navigate through the menu like a real visitor
  await humanMouseMove(page, menuX, menuY); // coordinates of a menu link
  await page.click('nav a.category');
  await new Promise(r => setTimeout(r, 2000 + Math.random() * 1000));
  
  // Only after warming up do we proceed to target pages
  for (let url of targetUrls) {
    await page.goto(url);
    await humanScroll(page, await page.$('.product-info'));
    // Extract data with a page-side helper defined by your scraper
    const data = await page.evaluate(() => extractProductData());
    await new Promise(r => setTimeout(r, 5000 + Math.random() * 3000));
  }
}

Case 2: Bypassing Akamai for API Requests

Task: Accessing an API protected by Akamai Bot Manager. The API requires specific headers and tokens generated by JavaScript on the page.

Solution: Akamai relies on sensor_data, an encrypted string with the results of browser checks. The sensor JavaScript generates this string and POSTs it to Akamai; a successful check produces a valid _abck cookie. Use browser automation to let the sensor run and validate the session, then reuse the resulting cookie in an HTTP client with a correct TLS fingerprint.

// Obtaining a validated _abck cookie via the browser
async function getAbckCookie(page) {
  await page.goto('https://example.com');
  
  // Wait for the Akamai sensor to execute and validate the session
  await page.waitForTimeout(5000);
  
  // Read the _abck cookie set after successful validation
  // (_abck values contain '=' characters, so slice instead of split)
  const abck = await page.evaluate(() => {
    const cookie = document.cookie.split(';')
      .find(c => c.trim().startsWith('_abck='));
    return cookie ? cookie.trim().slice('_abck='.length) : null;
  });
  
  return abck;
}

// Usage in an HTTP client
const abck = await getAbckCookie(page);
const response = await fetch('https://example.com/api/data', {
  headers: {
    'Cookie': `_abck=${abck}`,
    'User-Agent': 'Mozilla/5.0...',
    // Other headers must match the browser that produced the cookie
  }
});

Case 3: Solving CAPTCHA and Challenge Pages

Problem: Even with the correct configuration, PerimeterX or Akamai sometimes display challenge pages or CAPTCHA for additional verification.

Solution: PerimeterX challenge pages (typically the "Press & Hold" prompt) are designed to require human-like interaction. Practical options are to route the challenge to a CAPTCHA-solving service or a human operator (human-in-the-loop), or to discard the session entirely: a served challenge usually means the fingerprint or IP is already flagged, and starting over with a fresh profile and address is often cheaper than solving.

Conclusion and Recommendations

Bypassing PerimeterX and Akamai protection requires addressing every detection layer at once: a consistent technical fingerprint (TLS, HTTP/2, and the JavaScript environment must all tell the same story), quality residential or mobile proxies with sticky sessions and geographic consistency, human-like behavior with natural timing, and persistent, warmed-up profiles. A weakness in any one layer undoes the others, so introduce changes incrementally, monitor challenge rates rather than waiting for hard blocks, and expect to re-tune as both vendors update their sensors.
