Serverless functions have become the default compute layer for modern web applications, but the three major platforms differ in meaningful ways. Cloudflare Workers run on V8 isolates at the edge with near-instant cold starts. AWS Lambda runs functions in specific regions with far more runtime flexibility. Vercel Functions are Lambda under the hood, wrapped in a developer experience optimized for frontend frameworks. For the cloud and DevOps work we do, the right choice depends on your workload pattern, latency requirements, and how much infrastructure management you want to take on. We have written about this tradeoff in our Vercel vs AWS comparison, but this post focuses specifically on the serverless compute layer and when each function platform earns its place. If you need help choosing the right approach for your project, tell us about your situation.
AWS Lambda vs Cloudflare Workers vs Vercel Functions: Which Serverless Platform to Choose
| Feature | Cloudflare Workers | AWS Lambda |
|---|---|---|
| Cold Start Time | Near zero. V8 isolates spin up in under 5ms. No noticeable cold start for end users | 50ms to 500ms+ depending on runtime and package size. Provisioned concurrency eliminates cold starts but costs extra ($0.015 per GB hour) |
| Pricing (Compute) | Free tier: 100,000 requests per day. Paid: $5 per month, which includes 10 million requests and 30 million CPU milliseconds; overages run roughly $0.30 per additional million requests and $0.02 per additional million CPU milliseconds. Billed only for CPU time, not I/O wait | Free tier: 1 million requests per month. $0.20 per million requests plus $0.0000166667 per GB-second of total duration, including I/O wait |
| Execution Location | Runs on 300+ edge locations globally. Code executes closest to the user. True edge computing by default | Runs in a single AWS region by default. Lambda@Edge available for CloudFront but with limitations on runtime and package size |
| Runtime Support | JavaScript and TypeScript on the V8 isolate runtime, with a growing subset of Node.js APIs available behind the nodejs_compat compatibility flag. WASM support for Rust, C, and other compiled languages | Node.js, Python, Java, Go, .NET, Ruby, and custom runtimes via containers. The widest language support of any serverless platform |
| Max Execution Time | No hard wall-clock limit on HTTP requests, but CPU time is capped: 10ms per request on the free plan, 30 seconds (configurable up to 5 minutes) on paid plans. Cron Triggers and queue consumers get up to 15 minutes. Sufficient for API routes but not for long-running jobs | Up to 15 minutes. Can be extended with Step Functions for workflows that take hours or days |
| Package Size | 10MB compressed limit including dependencies. Forces lean code but limits what libraries you can use | 50MB zipped, 250MB unzipped. Container images up to 10GB. Can bundle heavy dependencies like Puppeteer or Sharp |
| Ecosystem Integration | Workers KV, Durable Objects, R2 storage, D1 database, and Queues. A growing but smaller ecosystem than AWS | Connects to 200+ AWS services natively. DynamoDB, SQS, S3, EventBridge, and more. The deepest cloud ecosystem available |
| Developer Experience | Wrangler CLI for local dev and deployment. Fast iteration cycles. Simpler configuration than AWS but fewer escape hatches | SAM, CDK, Serverless Framework, or raw CloudFormation. Powerful but complex. Local testing with SAM requires Docker. Steeper learning curve |
| Observability | Workers Analytics and Logpush. Basic but improving. Third party integrations available for more depth | CloudWatch Logs, X-Ray tracing, and CloudWatch Metrics. Comprehensive but requires setup. Most teams add Datadog or similar on top |
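The billing difference in the table is easier to see with numbers. The sketch below compares one month of 10 million requests at 128MB, each taking 1 second of wall time but only 10ms of CPU, which is typical of I/O-bound API routes. The Lambda rates match the table above; the Workers rates ($5/month base covering 10 million requests and 30 million CPU-milliseconds, then $0.02 per additional million CPU-ms) are our reading of current Workers Paid pricing and should be verified against the provider's pricing page:

```javascript
// Back-of-the-envelope monthly cost: 10M requests, 128MB memory,
// 1s total duration each, of which only 10ms is CPU (rest is I/O wait).
// Rates are illustrative; check current provider pricing before budgeting.

const requests = 10_000_000;

// AWS Lambda bills for total duration, including I/O wait.
const lambdaGbSeconds = requests * (128 / 1024) * 1.0; // 1s each at 0.125GB
const lambdaCost =
  (requests / 1_000_000) * 0.20 +   // $0.20 per million requests
  lambdaGbSeconds * 0.0000166667;   // per GB-second of duration

// Cloudflare Workers bills CPU time only; I/O wait is free.
// Assumed: $5/month base includes 10M requests and 30M CPU-ms.
const workersCpuMs = requests * 10; // 10ms CPU each = 100M CPU-ms
const workersOverageMs = Math.max(0, workersCpuMs - 30_000_000);
const workersCost = 5 + (workersOverageMs / 1_000_000) * 0.02;

console.log(lambdaCost.toFixed(2));  // "22.83"
console.log(workersCost.toFixed(2)); // "6.40"
```

The gap widens as I/O wait grows: Lambda's duration-based bill scales with wall time, while the Workers bill only moves when CPU time does.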
Why Cloudflare Workers
- Near zero cold starts mean every request is fast, with no provisioned concurrency tricks needed
- True edge computing across 300+ locations puts your code within roughly 50ms of most Internet users worldwide
- CPU-only billing means you do not pay for time spent waiting on database queries or API calls
- Deployment takes seconds and goes live globally, with no region selection or replication to manage
- A free tier of 100,000 requests per day is generous enough to run small production workloads at zero cost
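Part of the appeal is how little code an edge API route takes. This is a minimal sketch of a Worker; the route and payload are illustrative, and in a real project the object would be the `export default` of the entry module deployed with `wrangler deploy`:

```javascript
// A minimal Cloudflare Worker: an object with an async fetch() handler.
// In a real project this is `export default worker;` in src/index.js.
// Route and payload are illustrative.
const worker = {
  async fetch(request) {
    const url = new URL(request.url);
    if (url.pathname === '/api/ping') {
      // Runs at the edge location nearest the user; no region selection.
      return new Response(JSON.stringify({ ok: true, path: url.pathname }), {
        headers: { 'content-type': 'application/json' },
      });
    }
    return new Response('Not found', { status: 404 });
  },
};
```

Because the handler is just a function of a standard `Request`, it can be unit tested in any runtime that ships the fetch API globals, such as Node 18+.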
Why AWS Lambda
- Supports every major programming language including Python, Java, Go, and .NET with full standard library access
- Connects natively to 200+ AWS services for complex workflows involving queues, databases, and event streams
- Container image support up to 10GB allows bundling heavy dependencies like machine learning models
- A 15 minute execution limit, extendable with Step Functions for multi-hour workflows, covers nearly every compute pattern
- Mature observability with CloudWatch and X-Ray provides deep debugging and performance analysis out of the box
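For comparison, the Lambda version of the same route is a plain handler function invoked per request, typically behind API Gateway or a function URL. This sketch assumes the HTTP API payload format 2.0, where the request path arrives as `event.rawPath`; the route and response shape are illustrative:

```javascript
// A minimal AWS Lambda handler for an HTTP API (payload format 2.0).
// In a deployed function this is exported as `exports.handler`.
// Route and response shape are illustrative.
const handler = async (event) => {
  const path = event.rawPath ?? '/';
  if (path === '/api/ping') {
    return {
      statusCode: 200,
      headers: { 'content-type': 'application/json' },
      body: JSON.stringify({ ok: true, path }),
    };
  }
  return { statusCode: 404, body: 'Not found' };
};
```

Note the structural difference: Lambda returns a plain result object that API Gateway serializes into an HTTP response, whereas a Worker constructs the `Response` itself.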
The Verdict
Our Honest Take
Cloudflare Workers is the right choice for API routes, middleware, and any user-facing serverless function where latency matters. The edge execution model and near-zero cold starts deliver a better experience for end users, and CPU-only billing saves money on I/O-heavy workloads. Choose AWS Lambda when you need languages other than JavaScript, heavy dependencies over 10MB, or deep integration with the AWS ecosystem for event-driven architectures. Vercel Functions, which we did not table here, is the best option when you are deploying a Next.js or SvelteKit application on Vercel, because the integration is seamless and you do not manage any infrastructure. Veld typically uses Cloudflare Workers for edge API routes and Vercel Functions for framework-specific server side rendering.
Related articles
Serverless vs Kubernetes: Which Infrastructure to Choose
Comparing serverless (Lambda, Edge Functions) and Kubernetes for production infrastructure, complexity, cost, scaling, and when each approach wins.
Stripe vs Square: Which Payment Platform to Choose
Comparing Stripe and Square for payment processing, API quality, pricing, online vs in person payments, and developer experience. A practical guide.
React vs Vue: Which Frontend Framework to Choose
Comparing React and Vue for frontend development, ecosystem, performance, mobile support, hiring, and developer experience. A practical guide for choosing.