Build and run modern apps and AI with rapid, highly distributed serverless functions
Akamai Functions is an edge‑native serverless platform that runs WebAssembly (Wasm) functions across the world’s most distributed cloud. Teams deploy once and run everywhere — without managing regions or global infrastructure — to deliver low‑latency experiences, cut egress and origin load, and move faster.
Instant startup, effortless scale
Wasm functions start in microseconds and autoscale globally, keeping apps and AI workloads responsive even under traffic spikes.
Global by default, no complexity
Deploy once and run everywhere on Akamai’s distributed cloud, instead of planning and operating multi‑region setups.
Open and portable
Built on standards‑based Wasm runtimes and open tooling (Spin, SpinKube) to reduce lock‑in and keep your options open.
How it works
Write
Use Rust, Go, JavaScript, Python, and other languages that compile to Wasm.
Deploy
Package and deploy with Spin and SpinKube — to Akamai’s edge, your own Kubernetes, or both. Use the Spin “aka” plugin to push with a single command.
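A minimal deploy flow might look like the sketch below. The Spin CLI commands are standard; the `aka` plugin subcommands are assumptions based on the plugin's name, so verify the exact syntax against current Akamai documentation.

```shell
# Scaffold a new Spin app (template name may vary by Spin version)
spin new http-rust my-function
cd my-function

# Compile the app to Wasm
spin build

# Install the Akamai plugin, then push to the edge in one command
# (subcommand shown is an assumption; check the current docs)
spin plugins install aka
spin aka deploy
```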
Run
Execute on a Wasmtime‑based runtime with strong isolation and memory safety — no containers or VMs to manage.
Scale
Your code is placed close to users and routed to the fastest region automatically across Akamai’s global footprint.
What you can build
Execute Wasm at the edge with microsecond startup for APIs, routing, and real‑time logic.
Deploy globally in seconds with a single command; a stable URL persists across deployments.
Run sandboxed functions without containers or VMs; get strong isolation from the Wasm runtime.
Persist and access data with an edge key‑value store; connect to managed SQL databases when needed.
Integrate seamlessly with Akamai CDN and Security — including API Acceleration and App & API Protector — plus managed databases.
Offload auth, token, and bot logic from origins; route heavier work to CPUs/GPUs in Akamai Cloud on demand.
Stream logs and inspect apps with developer tooling designed for distributed environments.
AI at the edge, powered by Akamai Cloud
Akamai Functions combines lightweight Wasm at the edge with GPU‑accelerated inference in Akamai Cloud to deliver intelligent applications at scale:
Pre-/post‑processing runs in Wasm at the edge in microseconds.
Requests are intelligently routed to AI models running on NVIDIA GPU infrastructure in Akamai Inference Cloud only when needed.
The result: sub‑second responses and sub‑10 ms interactions for users worldwide, with the economics of edge compute and the power of GPU acceleration.
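The edge/GPU split described above can be sketched in plain Python. This is illustrative routing logic only, not the Akamai Functions API: the function names, intent labels, and cache shape are all hypothetical.

```python
# Sketch of the split described above: cheap pre-processing runs at the
# edge, and only requests that actually need a model go to GPU inference.
# Not the platform API -- every name here is a hypothetical stand-in.

CACHEABLE_INTENTS = {"greeting", "faq"}  # hypothetical intent labels

def classify_intent(prompt: str) -> str:
    """Toy pre-processing step: a real function might tokenize,
    normalize, or run a small classifier compiled to Wasm."""
    text = prompt.lower()
    if "hello" in text or "hi" in text:
        return "greeting"
    if text.endswith("?"):
        return "faq"
    return "complex"

def handle(prompt: str, edge_cache: dict) -> tuple[str, str]:
    """Return (where_it_was_served, response)."""
    intent = classify_intent(prompt)
    if intent in CACHEABLE_INTENTS and intent in edge_cache:
        return "edge", edge_cache[intent]       # microsecond edge path
    # Heavy work: hand off to GPU-backed inference (stubbed here)
    response = f"[gpu-inference for intent={intent}]"
    if intent in CACHEABLE_INTENTS:
        edge_cache[intent] = response           # warm the edge cache
    return "gpu", response

cache: dict = {}
print(handle("Hello there", cache))  # first hit goes to the GPU path
print(handle("Hello there", cache))  # repeat hit is served at the edge
```

The design point is the conditional hand-off: the edge answers whatever it can from local state, and the GPU tier is engaged only when the request truly needs it.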
Languages: Rust, Go, JavaScript, Python, and any language that compiles to Wasm.
Spin: Developer framework and CLI for building fast Wasm apps. Explore Spin.
SpinKube (CNCF): Operate Spin apps on your Kubernetes clusters. Explore SpinKube.
Serverless edge computing, explained
Serverless edge computing runs event‑driven code on infrastructure managed by the provider and placed close to end users. You write and deploy functions; the platform handles provisioning, scaling, high availability, and security. Running at the edge reduces latency for modern web and mobile apps and is ideal for:
Personalization, A/B logic, and header/body transforms
API orchestration, caching, and origin offload
Token validation and access control
Bot mitigation workflows
AI pre-/post‑processing close to users
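As an example of the "personalization and A/B logic" item above, here is a minimal, self-contained sketch of deterministic A/B bucketing with a header transform. It uses only the Python standard library and is not tied to any platform SDK; the experiment name and header key are made up for illustration.

```python
import hashlib

# Illustrative edge logic only -- not a platform SDK.
# Bucket each user deterministically so they see the same variant on
# every request, with no origin round trip and no session state.

def ab_variant(user_id: str, experiment: str, split: float = 0.5) -> str:
    """Hash user+experiment into [0, 1) and pick a variant.

    Deterministic: the same user always lands in the same bucket,
    which is what a sticky A/B test at the edge needs."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).digest()
    bucket = int.from_bytes(digest[:8], "big") / 2**64
    return "A" if bucket < split else "B"

def transform_headers(headers: dict, user_id: str) -> dict:
    """Attach the variant as a request header before forwarding."""
    out = dict(headers)
    out["x-ab-variant"] = ab_variant(user_id, "homepage-hero")
    return out

print(transform_headers({"accept": "text/html"}, "user-123"))
```

Hashing rather than storing an assignment keeps the function stateless, which suits a globally distributed runtime.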
Akamai Functions vs. traditional serverless
Global by default vs. region‑bound
Traditional serverless is region‑scoped (often requiring multi‑region design). Akamai Functions runs across the most distributed cloud by default.
Microsecond startup vs. cold starts
A Wasm runtime delivers near‑instant startup for consistent low latency under bursty workloads.
Open portability vs. proprietary runtimes
Functions are portable by design via Wasm, Spin, and SpinKube — run on Akamai, your Kubernetes, or another cloud.
Edge orchestration built in
Requests are automatically routed to the fastest responding region; no manual replica management.
Is this the same as EdgeWorkers?
No. EdgeWorkers is optimized for lightweight JavaScript‑based CDN logic. Akamai Functions extends this model with a Wasm runtime to support broader application logic, APIs, data access, and AI workflows — still running globally, close to your users. Many customers use both: EdgeWorkers for focused CDN request/response handling, and Akamai Functions for richer, multi‑component edge applications.
Example use cases
Mass redirects
Keep millions of redirect rules in memory and a KV store at the edge for instant lookups, lower origin load, and faster migrations — no cold starts, no origin dependency.
Token management
Generate, validate, and revoke tokens at the edge for premium content. Enforce policies globally and reduce latency while integrating with multi‑CDN workflows.
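One way to implement the generate/validate/revoke flow is a signed, expiring token, sketched below with the Python standard library. The token format, TTL, and key handling are assumptions for illustration, not Akamai's scheme.

```python
import hashlib
import hmac
import time

# Illustrative token scheme only: an HMAC-signed, expiring token plus a
# revocation check. Key distribution and format are assumptions.

SECRET = b"shared-edge-secret"  # in practice, distributed securely

def issue_token(user: str, ttl: int = 300) -> str:
    expires = int(time.time()) + ttl
    payload = f"{user}:{expires}"
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}.{sig}"

def validate_token(token: str, revoked: frozenset = frozenset()) -> bool:
    if token in revoked:                        # revocation list check
        return False
    try:
        payload, sig = token.rsplit(".", 1)
        _user, expires = payload.rsplit(":", 1)
        expires = int(expires)
    except ValueError:                          # malformed token
        return False
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):  # constant-time compare
        return False
    return expires > time.time()                # reject expired tokens

t = issue_token("viewer-42")
print(validate_token(t))       # True for a fresh, unrevoked token
print(validate_token(t, frozenset({t})))  # False once revoked
```

Validation needs only the shared secret and the revocation set, so it can run entirely at the edge without calling back to the origin.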
Bot management
Pair Akamai Bot Manager detection with programmable mitigations in Functions. Return bot‑specific responses and cache them for efficiency — protect content without degrading real users.
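A sketch of that pattern: act on a detection verdict (stubbed here as a score) with a bot-specific, cacheable response. The score source, threshold, and response shape are hypothetical; a real deployment would take the verdict from Bot Manager.

```python
# Illustrative mitigation logic only. The bot_score would come from a
# detection product (e.g., Bot Manager); here it is just a parameter.

def bot_response(bot_score: float, path: str, cache: dict) -> dict:
    """Map a detection score to a response. Bot replies are cached so
    repeat bot traffic never reaches the origin."""
    if bot_score < 0.5:                  # likely human: pass through
        return {"status": 200, "action": "forward-to-origin"}
    key = ("bot", path)
    if key not in cache:                 # build once, serve many times
        cache[key] = {"status": 200,
                      "action": "serve-simplified-page",
                      "cacheable": True}
    return cache[key]

cache: dict = {}
print(bot_response(0.9, "/products", cache))  # bot gets the cached page
print(bot_response(0.1, "/products", cache))  # human traffic passes through
```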
Hyper‑personalization
Combine Functions with LLMs on Akamai Cloud to generate tailored content on demand at the edge, avoiding prebuilt variants and centralized inference latency.
Frequently asked questions
What is Akamai Functions and what does it do?
Akamai Functions is an edge‑native serverless platform that runs Wasm functions globally by default. It lets teams execute application logic and AI workflows within milliseconds of users — without managing servers, regions, or global infrastructure.
How is it different from traditional serverless platforms?
Instead of region‑based deployments and cold starts, Akamai Functions provides global‑by‑default placement on a Wasm runtime with microsecond startup and intelligent traffic routing.
Is Akamai Functions the same as EdgeWorkers?
No. EdgeWorkers focuses on JavaScript CDN logic. Akamai Functions provides a richer Wasm runtime for APIs, data access, and AI workflows. They’re complementary.
What programming languages can I use?
Any language that compiles to Wasm, including Rust, Go, JavaScript, and Python, giving you flexibility without binding to a single runtime.
Can it run real applications and AI workloads?
Yes. Build APIs, microservices, event processing, and AI pre-/post‑processing at the edge. For deep inference, route to GPUs in Akamai Cloud — all on the same platform.
How does it handle global scale and performance?
Deploy once; Functions are distributed across Akamai’s network and routed to the fastest region. They autoscale with traffic and deliver consistent low latency, designed for sub‑10 ms interactions worldwide.
Akamai vs. AWS Lambda for serverless edge and global enterprise needs
Design approach: Akamai Functions is global by default on the most distributed cloud platform; traditional Lambda is region‑scoped by default.
Runtime model: Akamai uses Wasm for microsecond startup and strong isolation; Lambda uses language‑specific runtimes that can incur cold starts.
Portability: Akamai emphasizes open tooling (Wasm, Spin, SpinKube) so you can also run on your Kubernetes or other clouds.
Enterprise fit: Deep integration with Akamai CDN, Security, and global traffic routing simplifies large‑scale, multi‑region delivery.
Is this a proprietary or locked‑in platform?
No. Akamai Functions is built on open Wasm and WASI standards and works with open source frameworks like Spin and SpinKube. Run the same applications on Akamai, your own Kubernetes via SpinKube, or another cloud.