
How to Set Up a Proxy in Serverless Applications: AWS Lambda, Vercel, Cloudflare Workers

A complete guide to integrating proxies into serverless functions: from setting up HTTP clients to bypassing rate limits and geo-blocks in AWS Lambda, Vercel Edge Functions, and Cloudflare Workers.

February 19, 2026

Serverless architecture has become the standard for modern web applications, but developers regularly face a problem: all requests from Lambda functions or Edge Functions come from the IP addresses of cloud providers' data centers. This leads to blocks when accessing external APIs, scraping data, or automating tasks. In this guide, we will discuss how to integrate proxies into serverless functions to bypass restrictions, rate limits, and geo-blocks.

Why Serverless Functions Need Proxies

Serverless platforms (AWS Lambda, Google Cloud Functions, Vercel, Cloudflare Workers) execute code in cloud infrastructure using data center IP addresses. This creates several critical issues for developers:

Problem 1: IP Blocking from Data Centers. Many services automatically block requests from known IP addresses of AWS, Google Cloud, or Azure. For example, when scraping e-commerce sites (Amazon, eBay, Wildberries) or social networks (Instagram API, TikTok API), your Lambda functions will receive HTTP 403 or captcha on the first request. Bot protection systems (Cloudflare, Akamai, DataDome) instantly recognize traffic from cloud data centers.

Problem 2: Rate Limiting at the IP Level. If you deploy a serverless application with thousands of simultaneous calls, all requests may come from one or several AWS IP addresses. External APIs quickly reach limits (for example, GitHub API — 60 requests/hour from one IP, Google Maps API — 100 requests/second). Even if you have paid for an extended API plan, the IP limit will still apply.
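To see the scale of the problem, here is a quick sketch of how many distinct IPs a workload needs to stay under a per-IP limit, using the GitHub numbers above (the `ipsNeeded` helper is illustrative, not a real library function):

```javascript
// Sketch: minimum number of distinct IPs needed to keep a workload
// under a per-IP rate limit (e.g. GitHub's 60 unauthenticated req/hour).
function ipsNeeded(requestsPerHour, limitPerIpPerHour) {
  return Math.ceil(requestsPerHour / limitPerIpPerHour);
}

// A modest 6,000 requests/hour already needs 100 rotating IPs
// against a 60 req/hour per-IP limit: ipsNeeded(6000, 60) → 100
```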

Problem 3: Geo-blocking. Serverless functions in the us-east-1 region will not be able to access content available only from Russia, Europe, or Asia. This is critical when scraping regional marketplaces (Ozon, Yandex.Market), checking ads from different countries, or testing website localization.

Problem 4: Shared IP with Other Users. In a serverless environment, your functions may receive an IP address that has already been used by other clients of the cloud provider. If someone previously abused this IP (spam, DDoS, scraping), it may be blacklisted. You will face blocking without any fault of your own.

The solution to all these problems is the integration of proxy servers. Proxies allow your serverless functions to send requests through residential or mobile IP addresses that appear as regular users. This removes blocks, bypasses rate limits, and provides access to geo-blocked content.

What Types of Proxies Are Suitable for Serverless

The choice of proxy type depends on the task of your serverless application. Let's consider three main options and their use cases:

  • Datacenter Proxies — very high speed (10-50 ms), low anonymity. Use cases: API access without strict limits, service availability checks, uptime monitoring.
  • Residential Proxies — medium speed (100-500 ms), high anonymity. Use cases: e-commerce scraping, working with social networks, bypassing Cloudflare, accessing geo-blocked content.
  • Mobile Proxies — medium speed (150-600 ms), very high anonymity. Use cases: working with mobile APIs (Instagram, TikTok), testing mobile applications, bypassing the strictest protections.

For most serverless applications, it is recommended to use residential proxies. They provide an optimal balance between speed and anonymity. Residential IPs appear as regular home users, allowing you to bypass bot protections and rate limits without significantly increasing latency.

Datacenter proxies are suitable only for simple tasks (checking HTTP statuses, working with public APIs without limits). Mobile proxies are needed in specific cases — when working with mobile APIs or when maximum anonymity is critically important.
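The selection logic above can be sketched as a simple heuristic. The task categories here are illustrative labels, not an established taxonomy:

```javascript
// Sketch: rough heuristic for picking a proxy type per task,
// following the comparison above. Category names are made up
// for illustration — adapt them to your own task types.
function pickProxyType(task) {
  if (['mobile-api', 'max-anonymity'].includes(task)) return 'mobile';
  if (['scraping', 'social', 'geo-content'].includes(task)) return 'residential';
  return 'datacenter'; // uptime checks, public APIs without strict limits
}
```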

Setting Up Proxies in AWS Lambda

AWS Lambda is the most popular serverless platform, and integrating proxies here requires proper HTTP client configuration. Lambda functions can use various programming languages (Node.js, Python, Go), so let's look at examples for the most common ones.

Node.js (axios)

Axios is the most popular library for HTTP requests in Node.js. To route requests through a proxy, use the proxy option in the request config. (Note: axios's built-in proxy option is known to be unreliable with HTTPS targets; if you run into issues, the common workaround is the https-proxy-agent package passed via the httpsAgent option together with proxy: false.)

const axios = require('axios');

exports.handler = async (event) => {
  const proxyConfig = {
    host: 'proxy.example.com',
    port: 8080,
    auth: {
      username: 'your_username',
      password: 'your_password'
    },
    protocol: 'http'
  };

  try {
    const response = await axios.get('https://api.example.com/data', {
      proxy: proxyConfig,
      timeout: 10000 // 10 seconds
    });

    return {
      statusCode: 200,
      body: JSON.stringify(response.data)
    };
  } catch (error) {
    console.error('Proxy error:', error.message);
    return {
      statusCode: 500,
      body: JSON.stringify({ error: error.message })
    };
  }
};

Important Note: Store proxy credentials in AWS Systems Manager Parameter Store or AWS Secrets Manager, not in the code. This ensures security and allows you to easily change proxies without rebuilding the function.
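A minimal sketch of that pattern, with the secrets call injected as a function so the same code works against SSM, Secrets Manager, or a test stub. The parameter name and the JSON shape of the secret are assumptions, not a fixed convention:

```javascript
// Sketch: fetch proxy credentials once per Lambda container and cache
// them across warm invocations. `fetchSecret` is any async function
// that returns the secret string (e.g. an SSM GetParameter call).
let cachedConfig = null;

async function getProxyConfig(fetchSecret) {
  if (!cachedConfig) {
    // Assumed secret layout: {"host":..., "port":..., "username":..., "password":...}
    const raw = await fetchSecret('/myapp/proxy'); // hypothetical parameter name
    const { host, port, username, password } = JSON.parse(raw);
    cachedConfig = { host, port, auth: { username, password } };
  }
  return cachedConfig;
}
```

Because the cache lives in module scope, warm invocations skip the secrets-store round trip entirely, which also helps the cold-start numbers discussed below.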

Python (requests)

In Python, the requests library is used to work with proxies using the proxies parameter:

import requests
import json

def lambda_handler(event, context):
    proxies = {
        'http': 'http://username:password@proxy.example.com:8080',
        'https': 'http://username:password@proxy.example.com:8080'
    }
    
    try:
        response = requests.get(
            'https://api.example.com/data',
            proxies=proxies,
            timeout=10
        )
        
        return {
            'statusCode': 200,
            'body': json.dumps(response.json())
        }
    except requests.exceptions.RequestException as e:
        print(f'Proxy error: {str(e)}')
        return {
            'statusCode': 500,
            'body': json.dumps({'error': str(e)})
        }

For SOCKS5 proxies (a lower-level protocol that can tunnel arbitrary TCP traffic) in Python, install the additional dependency requests[socks] and change the URL scheme:

proxies = {
    'http': 'socks5://username:password@proxy.example.com:1080',
    'https': 'socks5://username:password@proxy.example.com:1080'
}

Optimization for Cold Starts

Lambda functions have a cold start problem — the first request after a period of inactivity takes 1-3 seconds. When using proxies, this time increases. To minimize delays, create the HTTP client outside the handler function:

const axios = require('axios');

// Create the client once during container initialization
const httpClient = axios.create({
  proxy: {
    host: 'proxy.example.com',
    port: 8080,
    auth: {
      username: process.env.PROXY_USER,
      password: process.env.PROXY_PASS
    }
  },
  timeout: 10000
});

exports.handler = async (event) => {
  // Reuse the client for each call
  const response = await httpClient.get('https://api.example.com/data');
  return {
    statusCode: 200,
    body: JSON.stringify(response.data)
  };
};

This approach reduces cold start time by 200-500 ms, as the proxy configuration is executed only once when creating the Lambda container.

Integrating Proxies in Vercel Edge Functions

Vercel offers two types of serverless functions: Node.js Functions (similar to Lambda) and Edge Functions (executed on Vercel's edge network). Edge Functions run in a runtime similar to Cloudflare Workers, with restrictions on the use of Node.js APIs. Let's consider both options.

Vercel Node.js Functions

For regular Vercel Functions, use the same approach as for AWS Lambda. Create a file api/fetch-data.js:

import axios from 'axios';

export default async function handler(req, res) {
  const proxyConfig = {
    host: process.env.PROXY_HOST,
    port: parseInt(process.env.PROXY_PORT),
    auth: {
      username: process.env.PROXY_USER,
      password: process.env.PROXY_PASS
    }
  };

  try {
    const response = await axios.get(req.query.url, {
      proxy: proxyConfig,
      timeout: 8000
    });

    res.status(200).json(response.data);
  } catch (error) {
    res.status(500).json({ error: error.message });
  }
}

Add environment variables in the Vercel Dashboard (Settings → Environment Variables): PROXY_HOST, PROXY_PORT, PROXY_USER, PROXY_PASS.

Vercel Edge Functions

Edge Functions use the Web Fetch API instead of Node.js libraries, and their fetch() implementation has no built-in proxy support. In practice, this means an Edge Function cannot route a request through an authenticated proxy on its own:

export const config = {
  runtime: 'edge',
};

export default async function handler(req) {
  // Edge Runtime's fetch() has no proxy option and does not support
  // Node.js proxy agents, so this request goes out directly from
  // Vercel's edge network — see the limitation note below.
  const targetUrl = new URL(req.url).searchParams.get('target');

  const response = await fetch(targetUrl, {
    headers: {
      'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36'
    }
  });

  return new Response(await response.text(), {
    status: response.status,
    headers: response.headers
  });
}

Important Limitation: Edge Runtime does not support standard Node.js proxy agents. For full proxy functionality, it is recommended to use Node.js Functions or create an intermediate proxy server on a separate server that will accept requests from Edge Functions.

Proxies in Cloudflare Workers

Cloudflare Workers operate in V8 isolates and have even stricter limitations than Vercel Edge Functions. The standard way to connect proxies through Node.js libraries does not work here. There are two working approaches:

Method 1: HTTP CONNECT Tunneling

Use a proxy that supports the HTTP CONNECT method. Note that the standard fetch() API forbids CONNECT, so you cannot open a tunnel with fetch() alone. Instead, use the Workers TCP Sockets API (connect() from cloudflare:sockets): open a raw connection to the proxy, send the CONNECT request manually, and then talk to the target through the established tunnel. A sketch:

import { connect } from 'cloudflare:sockets';

export default {
  async fetch(request, env) {
    const targetHost = 'api.example.com';

    // Open a raw TCP connection to the proxy server.
    // secureTransport: 'starttls' lets us upgrade to TLS later.
    const socket = connect(
      { hostname: 'proxy.example.com', port: 8080 },
      { secureTransport: 'starttls' }
    );

    const proxyAuth = btoa(`${env.PROXY_USER}:${env.PROXY_PASS}`);
    const writer = socket.writable.getWriter();

    // Send the CONNECT request manually
    await writer.write(new TextEncoder().encode(
      `CONNECT ${targetHost}:443 HTTP/1.1\r\n` +
      `Host: ${targetHost}:443\r\n` +
      `Proxy-Authorization: Basic ${proxyAuth}\r\n\r\n`
    ));

    // Read the proxy's reply to the CONNECT request
    const reader = socket.readable.getReader();
    const { value } = await reader.read();
    const reply = new TextDecoder().decode(value);

    if (!reply.includes(' 200 ')) {
      return new Response('Proxy connection failed', { status: 502 });
    }

    // The socket is now a raw tunnel to the target. For an HTTPS
    // target, upgrade the tunnel with startTls() and speak HTTP/1.1
    // over the encrypted stream manually.
    reader.releaseLock();
    writer.releaseLock();
    const tlsSocket = socket.startTls();
    // ... write the HTTP request to tlsSocket.writable and parse
    // the response from tlsSocket.readable
    return new Response('Tunnel established', { status: 200 });
  }
};

This method only works with HTTP proxies that support CONNECT (most residential proxy providers offer this capability), and it leaves TLS and HTTP framing to you — which is why the gateway approach below is usually the more practical choice.

Method 2: Proxy Gateway (Recommended)

A more reliable way is to deploy an intermediate proxy gateway on a separate server (for example, on a VPS or AWS EC2). The Cloudflare Worker sends requests to your gateway, which then forwards them through the proxy:

// Cloudflare Worker
addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request))
})

async function handleRequest(request) {
  const targetUrl = new URL(request.url).searchParams.get('url');
  const gatewayUrl = 'https://your-proxy-gateway.com/fetch';
  
  const response = await fetch(gatewayUrl, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-API-Key': API_KEY // Protect your gateway
    },
    body: JSON.stringify({
      url: targetUrl,
      method: 'GET'
    })
  });

  return response;
}

On the proxy gateway side (Node.js server):

const express = require('express');
const axios = require('axios');

const app = express();
app.use(express.json());

const proxyConfig = {
  host: 'proxy.example.com',
  port: 8080,
  auth: {
    username: process.env.PROXY_USER,
    password: process.env.PROXY_PASS
  }
};

app.post('/fetch', async (req, res) => {
  if (req.headers['x-api-key'] !== process.env.API_KEY) {
    return res.status(401).json({ error: 'Unauthorized' });
  }

  try {
    const response = await axios({
      url: req.body.url,
      method: req.body.method || 'GET',
      proxy: proxyConfig,
      timeout: 10000
    });

    res.json(response.data);
  } catch (error) {
    res.status(500).json({ error: error.message });
  }
});

app.listen(3000);

This approach adds an additional hop (increasing latency by 50-100 ms) but provides full compatibility and control over the proxy connection.

IP Address Rotation in Serverless Environments

One of the main reasons for using proxies is to distribute requests among multiple IP addresses to bypass rate limits. In serverless architecture, there are two approaches to rotation:

Automatic Rotation on the Proxy Provider Side

Most residential proxy providers offer rotating proxies — you connect to one endpoint, and the IP changes automatically with each request or at a set interval (for example, every 5 minutes). This is the simplest option for serverless:

// One endpoint, IP changes automatically
const proxyConfig = {
  host: 'rotating.proxy.example.com',
  port: 8080,
  auth: {
    username: 'user-session-' + Date.now(), // Unique session
    password: 'password'
  }
};

Some providers allow managing rotation through parameters in the username: user-session-random (new IP for each request), user-session-sticky-300 (one IP for 300 seconds).
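A small helper makes these username conventions explicit. The session-parameter syntax varies by provider, so the formats below are illustrative — check your provider's documentation:

```javascript
// Sketch: build the proxy username that controls provider-side
// rotation. The "-session-..." suffix formats follow the examples
// above and are provider-specific, not a standard.
function buildProxyUsername(baseUser, mode, stickySeconds = 300) {
  if (mode === 'random') return `${baseUser}-session-random`;
  if (mode === 'sticky') return `${baseUser}-session-sticky-${stickySeconds}`;
  return baseUser; // no rotation parameter
}
```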

Manual Rotation Through a Proxy Pool

If you have a list of static proxies (for example, you purchased dedicated proxies), you can implement rotation at the application level. In a serverless environment, use DynamoDB (AWS) or KV Storage (Cloudflare) to store the state:

const AWS = require('aws-sdk');
const axios = require('axios');

const dynamodb = new AWS.DynamoDB.DocumentClient();

const PROXY_POOL = [
  { host: 'proxy1.example.com', port: 8080 },
  { host: 'proxy2.example.com', port: 8080 },
  { host: 'proxy3.example.com', port: 8080 }
];

async function getNextProxy() {
  // Atomically increment the counter so concurrent invocations
  // never pick up the same index (a plain get-then-put would race)
  const result = await dynamodb.update({
    TableName: 'ProxyRotation',
    Key: { id: 'current_index' },
    UpdateExpression: 'ADD #idx :one',
    ExpressionAttributeNames: { '#idx': 'index' }, // "index" is a DynamoDB reserved word
    ExpressionAttributeValues: { ':one': 1 },
    ReturnValues: 'UPDATED_NEW'
  }).promise();

  return PROXY_POOL[result.Attributes.index % PROXY_POOL.length];
}

exports.handler = async (event) => {
  const proxy = await getNextProxy();
  
  const response = await axios.get('https://api.example.com/data', {
    proxy: {
      ...proxy,
      auth: {
        username: process.env.PROXY_USER,
        password: process.env.PROXY_PASS
      }
    }
  });

  return { statusCode: 200, body: JSON.stringify(response.data) };
};

This method gives full control over rotation but requires additional requests to DynamoDB (adding 10-30 ms latency). For high-load applications, it is recommended to cache the index in the Lambda container memory and update it every 100-1000 requests.
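That caching idea can be sketched as follows — a local round-robin counter in container memory that only signals a DynamoDB sync every N calls (the names and the sync interval are illustrative):

```javascript
// Sketch: rotate through the pool using container-local state, and
// only sync with DynamoDB every SYNC_INTERVAL requests instead of
// on every invocation.
const SYNC_INTERVAL = 100;
let localIndex = 0;
let callsSinceSync = 0;

function pickProxy(pool) {
  localIndex = (localIndex + 1) % pool.length;
  callsSinceSync++;
  const needsSync = callsSinceSync >= SYNC_INTERVAL;
  if (needsSync) callsSinceSync = 0; // caller persists localIndex when true
  return { proxy: pool[localIndex], needsSync };
}
```

The trade-off: each warm container rotates through the pool independently, so the global distribution is only approximately even — usually acceptable when the goal is spreading load rather than strict fairness.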

Error and Timeout Handling

Proxies add an additional point of failure to your serverless application. It is critically important to handle errors properly to avoid losing user requests.

Common Errors When Working with Proxies

  • ETIMEDOUT — the proxy is unresponsive or slow. Reduce the timeout to 5-8 seconds and add a retry through another proxy.
  • ECONNREFUSED — the proxy server is unavailable. Check proxy availability and fall back to another proxy.
  • 407 Proxy Authentication Required — invalid credentials. Check the username/password and make sure the Lambda IP is whitelisted by the proxy provider.
  • 502 Bad Gateway — the proxy cannot connect to the target site. The site may be blocking the proxy; try another IP or another type of proxy.
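One frequent cause of 407 errors worth calling out: credentials containing characters like '@' or ':' break URL-style proxy strings unless they are percent-encoded. A small sketch (the helper name is illustrative):

```javascript
// Sketch: build a proxy URL with percent-encoded credentials — a
// common source of 407 errors when passwords contain '@', ':' or '/'.
function buildProxyUrl({ host, port, username, password, protocol = 'http' }) {
  const user = encodeURIComponent(username);
  const pass = encodeURIComponent(password);
  return `${protocol}://${user}:${pass}@${host}:${port}`;
}
```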

Implementing Retry Logic with Fallback

Add automatic retries with a switch to a backup proxy in case of errors:

const axios = require('axios');

const PRIMARY_PROXY = {
  host: 'primary.proxy.com',
  port: 8080,
  auth: { username: 'user', password: 'pass' }
};

const FALLBACK_PROXY = {
  host: 'fallback.proxy.com',
  port: 8080,
  auth: { username: 'user', password: 'pass' }
};

async function fetchWithRetry(url, maxRetries = 3) {
  for (let attempt = 0; attempt < maxRetries; attempt++) {
    const proxy = attempt === 0 ? PRIMARY_PROXY : FALLBACK_PROXY;
    
    try {
      const response = await axios.get(url, {
        proxy,
        timeout: 8000
      });
      
      return response.data;
    } catch (error) {
      console.log(`Attempt ${attempt + 1} failed:`, error.message);
      
      // Do not retry on client errors (4xx)
      if (error.response && error.response.status < 500) {
        throw error;
      }
      
      // Last attempt — throw the error
      if (attempt === maxRetries - 1) {
        throw error;
      }
      
      // Exponential delay before retry
      await new Promise(resolve => setTimeout(resolve, 1000 * Math.pow(2, attempt)));
    }
  }
}

exports.handler = async (event) => {
  try {
    const data = await fetchWithRetry('https://api.example.com/data');
    return { statusCode: 200, body: JSON.stringify(data) };
  } catch (error) {
    return { statusCode: 500, body: JSON.stringify({ error: error.message }) };
  }
};

This implementation makes up to three attempts: the first through the primary proxy, the rest through the backup. Exponential delays are added between attempts (1 second after the first failure, 2 seconds after the second) to avoid hammering an already-struggling endpoint.
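One refinement worth considering: with many concurrent Lambda containers, fixed exponential delays make retries arrive in lockstep. Adding "full jitter" spreads them out. This is a sketch of the delay calculation only, not part of the implementation above:

```javascript
// Sketch: exponential backoff with "full jitter" — the delay is a
// random value in [0, min(cap, base * 2^attempt)), so many containers
// retrying at once do not hit the proxy at the same instant.
function backoffDelay(attempt, baseMs = 1000, capMs = 8000) {
  const exp = Math.min(capMs, baseMs * Math.pow(2, attempt));
  return Math.floor(Math.random() * exp);
}
```

To use it, replace the fixed `1000 * Math.pow(2, attempt)` in the retry loop with `backoffDelay(attempt)`.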

Monitoring and Alerts

Set up monitoring for proxy errors through CloudWatch (AWS), Vercel Analytics, or Sentry. Track the following metrics:

  • Percentage of successful requests through the proxy (should be >95%)
  • Average latency of requests (an increase may indicate proxy issues)
  • Number of timeout errors (if >5% — proxy is overloaded or slow)
  • Distribution of errors by code (407, 502, ETIMEDOUT, etc.)

Set up alerts when thresholds are exceeded — this will allow you to quickly switch to a backup proxy provider or change the configuration.
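The thresholds above can be encoded directly in a small health check that your monitoring hook calls (the function name and metric shape are illustrative):

```javascript
// Sketch: evaluate proxy metrics against the thresholds listed above
// (>95% success rate, <5% timeouts) and return any alert messages.
function checkProxyHealth({ total, success, timeouts }) {
  const successRate = total ? success / total : 1;
  const timeoutRate = total ? timeouts / total : 0;
  const alerts = [];
  if (successRate < 0.95) alerts.push('success rate below 95%');
  if (timeoutRate > 0.05) alerts.push('timeout rate above 5%');
  return alerts;
}
```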

Conclusion

Integrating proxies into serverless applications addresses critical issues: IP blocking from data centers, rate limiting, and geo-blocking. We discussed setting up proxies in AWS Lambda (Node.js and Python), Vercel Functions, and Cloudflare Workers, as well as implementing IP address rotation and error handling.

Key recommendations: use residential proxies for tasks requiring high anonymity (scraping, working with social media APIs), store credentials in secure storage (AWS Secrets Manager, Vercel Environment Variables), implement retry logic with fallback to backup proxies, and set up error monitoring.

For serverless applications with high stability requirements, we recommend using residential proxies with automatic rotation — they provide an optimal balance between speed, anonymity, and reliability, minimizing the risk of blocks when working with external APIs and services.