Akamai Functions is the next evolution of serverless: a WebAssembly-powered engine that runs application logic and AI workloads on the world’s most distributed cloud. Pair with EdgeWorkers to execute your own code at the edge and create customized, exceptional user experiences.
Serverless, built for distributed systems
Ship code fast without managing infrastructure. Write serverless functions and let the platform handle scale, capacity, and availability.
Akamai runs serverless across a massively distributed edge and cloud so your logic executes closer to users, data, and real-world traffic.
Choose the serverless model for your workload
Serverless on Akamai is delivered through two complementary solutions, each optimized for different execution patterns and use cases.
Akamai Functions
Distributed serverless for modern apps and APIs
Akamai Functions provides event-driven, function-based compute built on WebAssembly. It is designed for application logic, APIs, and emerging workloads that need portability, fast start-up, and global scale.
Use Akamai Functions when you need to:
- Start up instantly
- Run application logic without containers or VMs
- Process events and data at scale
- Support AI inference and modern workloads
- Use languages other than JavaScript
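The event-driven, function-based model above can be sketched in plain JavaScript. This is an illustration of the general pattern only, not the Akamai Functions API (which this page does not show); the `handleEvent` and `scoreInput` names and the event shape are assumptions made for the example.

```javascript
// Illustrative event-driven handler pattern (hypothetical names,
// NOT the Akamai Functions API): one function receives an event,
// routes on its type, and returns a result the platform would
// serialize into a response.
function handleEvent(event) {
  switch (event.type) {
    case 'inference':
      // Hypothetical stand-in for an AI inference step.
      return { status: 200, body: scoreInput(event.payload) };
    case 'transform':
      // Simple data-processing example.
      return { status: 200, body: event.payload.toUpperCase() };
    default:
      return { status: 400, body: 'unsupported event type' };
  }
}

// Placeholder "model": classifies input by length.
function scoreInput(payload) {
  return payload.length > 10 ? 'long' : 'short';
}
```

Because the handler is a pure function of its event, the same logic can be compiled to WebAssembly from other languages and scaled per-event without a container or VM around it.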
EdgeWorkers
Lightweight JavaScript at the edge
EdgeWorkers lets you run JavaScript directly on the request path. It is designed for ultra-low-latency logic that needs to execute inline with traffic.
Use EdgeWorkers when you need to:
- Implement authentication and authorization logic
- Route traffic dynamically
- Enforce logic or business rules at the edge
- Extend CDN behavior with custom logic
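A minimal sketch of request-path logic in this style: an `onClientRequest` handler that enforces an authentication rule inline with traffic. The handler name and the request methods (`getHeader`, `respondWith`, `setHeader`) follow the EdgeWorkers JavaScript API; the specific auth rule and the `X-Edge-Checked` header are illustrative assumptions.

```javascript
// Hedged sketch of an EdgeWorkers-style handler. Runs on the
// request path, before the origin is contacted.
function onClientRequest(request) {
  // getHeader returns the header's values as an array, or null/undefined
  // when the header is absent.
  const auth = request.getHeader('Authorization');
  if (!auth || auth.length === 0) {
    // Deny directly from the edge; the origin never sees the request.
    request.respondWith(401, { 'Content-Type': ['text/plain'] }, 'Missing credentials');
    return;
  }
  // Tag the forwarded request so the origin knows it passed the edge check.
  // (Illustrative header name.)
  request.setHeader('X-Edge-Checked', 'true');
}
```

In a real EdgeWorkers bundle this handler would be exported from the bundle's main module and invoked by the platform for each client request.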
Use Akamai Functions and EdgeWorkers together
Run real application logic everywhere at the edge with Wasm, then layer in instant JavaScript control to shape traffic and user experiences globally.
| Use case | Akamai Functions | EdgeWorkers |
|---|---|---|
| Request-path logic | | Yes |
| CDN and security customization | | Yes |
| APIs and back-end services | Yes | |
| AI and data workloads | Yes | |
| WebAssembly runtime | Yes | |
| Ultra-low-latency edge execution | Yes | Yes |
Resources
Frequently Asked Questions (FAQ)
**What are the drawbacks of traditional serverless?**
Traditional serverless often runs in centralized regions, which can add latency for global users. Cold starts, container overhead, and complex scaling models can slow development and increase cost.
**What is serverless computing?**
Serverless abstracts infrastructure management. Teams deploy code without managing servers, containers, or capacity, and can focus on writing and shipping code.
**What is edge native serverless?**
Edge native serverless runs code closer to end users rather than only in centralized cloud regions. This reduces latency, improves responsiveness, and enables real-time decision-making at global scale.
**Why does it matter where serverless code runs?**
Where code executes directly impacts performance. Running serverless closer to users reduces network hops, lowers origin load, and delivers more consistent performance across geographies.
**Why use WebAssembly for serverless?**
WebAssembly offers a lightweight execution model with fast start-up times and efficient resource usage. This makes it well suited for serverless workloads that need predictable performance and low latency.
**How does serverless reduce cost?**
Serverless runs code only when it is needed. This eliminates idle infrastructure, scales automatically with demand, and aligns cost directly with actual usage.
**Where does Akamai run serverless?**
Akamai runs serverless across a highly distributed edge and cloud platform. This enables low-latency execution, consistent global performance, and real-time logic closer to users, devices, and data.
Book a demo
Get a demo of Akamai Functions to see the fastest, most distributed serverless functions engine for modern apps and AI.