Making the Edge Work for You
For many people, “building at the edge” may conjure fantastical images of nearly sci-fi–like computational power embedded in devices all around us, crunching massive volumes of data. We’re getting there. Today’s reality is that more workloads are moving to the edge to take advantage of benefits such as lower latency and proximity to users, but many are still too resource-intensive to be feasible there. For the workloads that can be moved, the results can be transformational digital experiences for users and developers alike.
As part of the April 2022 platform update, we’re making it easier for developers to build, execute, and debug EdgeWorkers functions.
● EdgeWorkers now integrates with Visual Studio Code and IntelliJ, so you can create, modify, and activate EdgeWorkers functions in your favorite integrated development environment. This gives developers a centralized, end-to-end place to work and removes the friction of developing these functions across separate tools.
● DevOps teams can now use the Akamai Terraform provider to provision and manage EdgeWorkers functions. Combined with other Akamai products, this integration makes it easier for developers to manage Akamai infrastructure as code. Visit our Terraform support page to find the provider on GitHub, view documentation on HashiCorp, and learn more about administering and managing configurations.
● Developers can now use DataStream 2 to get real-time visibility into the usage and execution of EdgeWorkers functions, with essential metrics such as heap usage, CPU time, and wall time, plus detailed insight into execution time and errors (see the sketch after this list).
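To make that last point concrete, here is a minimal sketch of tallying EdgeWorkers metrics from DataStream 2 log lines. The field names (edgeWorkerCpuTime, edgeWorkerWallTime, edgeWorkerStatus) and the sample records are illustrative placeholders, not the actual DataStream 2 schema; substitute the fields configured in your own stream.

```javascript
// Minimal sketch: aggregate EdgeWorkers execution metrics from DataStream 2
// log lines. Field names below are illustrative placeholders, not the real
// DataStream 2 schema.
const sampleLines = [
  '{"edgeWorkerCpuTime": 4.2, "edgeWorkerWallTime": 11.0, "edgeWorkerStatus": "Success"}',
  '{"edgeWorkerCpuTime": 7.9, "edgeWorkerWallTime": 35.4, "edgeWorkerStatus": "ExecutionError"}',
];

let totalCpuMs = 0;
let totalWallMs = 0;
let errorCount = 0;

for (const line of sampleLines) {
  const record = JSON.parse(line);
  totalCpuMs += record.edgeWorkerCpuTime;
  totalWallMs += record.edgeWorkerWallTime;
  if (record.edgeWorkerStatus !== 'Success') {
    errorCount += 1;
  }
}

console.log(`CPU: ${totalCpuMs.toFixed(1)} ms, wall: ${totalWallMs.toFixed(1)} ms, errors: ${errorCount}`);
```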
We’re also introducing new capabilities to help customers do more at the edge using Akamai EdgeWorkers across a host of common use cases. Those updates include:
● Higher resource tier limits that allow you to build and execute larger, more complex functions. For both the Basic and Dynamic compute tiers, the maximum CPU time and wall time during initialization are doubled to 60 and 200 milliseconds, respectively. Additionally, at the Dynamic compute tier, the maximum wall time per HTTP sub-request during the execution of all event handlers increases from 1 to 1.5 seconds (see the sub-request sketch after this list).
● Support for Standard TLS extends EdgeWorkers beyond traditional applications and websites, helping customers deliver complex, segmented video streams.
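To illustrate the sub-request limit mentioned above, here is a minimal EdgeWorkers sketch that fetches a fragment with an HTTP sub-request inside responseProvider, using the built-in http-request and create-response modules. The fragment path is a made-up example, not a real endpoint.

```javascript
import { httpRequest } from 'http-request';
import { createResponse } from 'create-response';

export async function responseProvider(request) {
  // Fetch a fragment from the same property with an HTTP sub-request.
  // '/fragments/header.html' is an illustrative path, not a real endpoint.
  const fragment = await httpRequest('/fragments/header.html');

  if (fragment.status !== 200) {
    // Fall back to a simple response if the sub-request fails.
    return createResponse(502, { 'Content-Type': ['text/plain'] }, 'Fragment unavailable');
  }

  const body = await fragment.text();
  return createResponse(200, { 'Content-Type': ['text/html'] }, body);
}
```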
Finally, for developers using Akamai who are interested in cooking up their own EdgeWorkers functions, we’re introducing EdgeWorkers recipes on Tech Docs. The recipes make it simple to get started, showing sample code along with clear, concise documentation of what the code does; the recipe-style sketch below gives a flavor of what a function looks like.
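The following is a minimal, recipe-style sketch (not one of the published recipes) that answers a health-check path directly at the edge and tags every response with a debug header. The path and header name are illustrative.

```javascript
import { logger } from 'log';

// Runs when the client request reaches the edge. '/edge-health' is an
// illustrative path; requests to it are answered without going to origin.
export function onClientRequest(request) {
  if (request.path === '/edge-health') {
    request.respondWith(200, { 'Content-Type': ['text/plain'] }, 'OK from the edge');
  }
}

// Runs before the response is returned to the client; the header name is
// illustrative and intended only for debugging.
export function onClientResponse(request, response) {
  response.setHeader('X-Edge-Recipe', 'hello-world');
  logger.log('Handled ' + request.path + ' at the edge');
}
```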
We started our compute journey with serverless computing at the edge because it made sense for us. Akamai has always been the edge for our customers; it’s in our DNA to be there. And it’s because we’re already there that we’re helping our customers transform their application architectures. But the edge is just the beginning.
What’s next?
Keep an eye out for more platform update news this week. We’ll explore ways that Akamai is anticipating evolving threats and adapting protections to bring trust to your digital experiences. We’ll also bring into focus Akamai’s Zero Trust offering to help you better secure your workforce. And finally, we’ll dive headfirst into the future and discuss what it means for you. We’re no longer just delivering the edge; we’re helping customers build more powerful digital experiences.