Why Developers Are Writing Apps on Our Edge Platform
A lot of companies talk about edge computing today, but at Akamai, we've been doing it for more than 20 years. We were the first company to offer edge computing services, beginning in 1999 with advanced metadata, an XML-based language used to enable a variety of capabilities on our edge platform -- things like access revocation, ad insertion, throttling, and visitor prioritization.
We followed this with our deployment of Edge Side Includes in 2001, which helped our customers scale, improve performance, and save money by moving fine-grained business logic from their origin infrastructure to our edge, reducing the amount of data they needed to process in their own data centers. We then introduced Edge Java in 2002 to help enterprises maximize business efficiency and scale by enabling web applications to execute on the edge. These early innovations anticipated the prevailing use cases for application-aligned edge computing today.
As it turned out, we were ahead of the market with edge computing. AWS and Azure hadn't even been started yet. And most customers weren't ready to migrate their apps to our platform. They could do it, but it wasn't as easy as it is today. And so, we decided to build apps ourselves for common tasks like video management, visitor prioritization, and application load balancing. This worked well because the customer didn't need to do any programming.
With the widespread adoption of cloud services such as AWS and Azure, the market is now much more capable of writing apps and microservices designed to work in a cloud environment. And, as we look to the future, we see substantial opportunity for enterprises to make far greater use of the Akamai Intelligent Edge Platform. We believe the edge is where new applications and new business models will come to life, where intelligence will be built into how data is collected and analyzed, where the promise of 5G and IoT will be realized, and where security will provide the online world's first and most important line of defense.
Because of the growing interest in edge computing, many companies now talk about being at the edge, but few offer a clear definition of what the edge really means.
The edge as we define it
The edge is the part of the internet that's closest to end users. This is what traditionally has been called "the last mile." It is the portion of the internet that sits between the cloud data centers at the core of the internet and the billions of devices that are in homes and offices or that are moving around in mobile networks.
Major cloud companies and the other CDNs aren't at the edge. They place their infrastructure and content in a few dozen cloud data centers at the core of the internet. Some of them call that the edge, but they're really in the core, not especially close to end users and often on the wrong side of congested peering points. Enterprises and hosting companies also usually locate their data centers in the core of the internet, which is often far from most users and even many of their own employees.
Akamai's platform is deployed differently. We are truly at the edge, with more than 340,000 servers located in more than 4,000 different locations across more than 1,000 cities in 135 countries, inside more than 1,400 different networks, well past congested peering points, and close to where all the end users are on the internet. We operate at a massive scale, handling trillions of requests per day and well over 100 terabits per second of concurrent traffic around the clock.
Our presence at the edge makes Akamai unique. No other company comes anywhere close to this kind of distribution and proximity to end users. And that gives Akamai a big advantage over the competition when it comes to scale, performance, reliability, security, and cost. Our edge platform is the largest and most trusted cloud delivery platform in the world.
What edge servers do
It's also important to understand what all these edge servers are doing. Of course, they distribute popular content like movies and gaming software, but they also work together as an intelligent fabric to mitigate network congestion, route around trouble spots on the internet, and accelerate access to content that typically isn't cached locally, like an ecommerce shopping cart or an account balance viewed in a mobile banking app.
Network operators are happy to have Akamai servers in their networks because we can help them avoid overload by diverting traffic away from congested areas, much as we helped GitHub deflect one of the largest DDoS attacks ever recorded. And we can improve web performance for all of the content we serve to network operators' subscribers, which is a lot of the content on the internet.
Because we see so much of the world's Internet traffic, our servers also collect a vast amount of data that we can provide to our customers in real time, so that they can optimize the business value of their sites and apps. And the same Akamai edge server that's doing all these things is also providing security designed to protect websites, apps, enterprise communications and data. In all, a typical Akamai edge server is performing dozens of functions on behalf of thousands of customers at any given time.
EdgeWorkers and use cases for developers
Akamai edge services are also programmable by our customers and third parties, and are optimized to respond within a few milliseconds to requests to run an app. Customers instantiated applications on our EdgeWorkers solution more than 480 billion times in the first quarter of 2021, a quarter in which we also supported more than 78 trillion API requests.
We created Akamai EdgeWorkers so that customers could easily port their JavaScript apps to our edge platform. We've deployed Chrome V8 engines to our edge servers, which effectively turns the world's most distributed CDN footprint into the world's most distributed serverless computing platform, where every edge server becomes a compute node. ("Serverless computing" is a way of saying the apps run on our platform instead of on the customer's infrastructure. From the customer's perspective, they don't need to worry about any server infrastructure, and they can scale up instantly on Akamai.)
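The serverless model described above can be sketched in plain JavaScript. The names below are illustrative, not the actual EdgeWorkers API: a per-request handler runs on the edge node and can either answer directly from the edge or let the request fall through to origin.

```javascript
// Minimal sketch of the edge serverless event model (illustrative names,
// not the real EdgeWorkers API): the platform invokes a handler for each
// client request, and the handler may respond without touching origin.
function onClientRequest(request) {
  // Answer a known path straight from the edge.
  if (request.path === "/hello") {
    return {
      status: 200,
      headers: { "content-type": "text/plain" },
      body: "Hello from the edge",
    };
  }
  return null; // null => forward the request to origin
}

// Tiny stand-in for the edge runtime's dispatch loop.
function dispatch(request) {
  const edgeResponse = onClientRequest(request);
  return edgeResponse !== null
    ? edgeResponse
    : { status: 200, body: "(served from origin)" };
}
```

From the customer's point of view, this is the whole deployment surface: they ship the handler, and scaling across edge nodes is the platform's problem.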
Today, customers are using EdgeWorkers for things such as API orchestration, personalization, search engine optimization, compliance, and virtual waiting rooms.
Third parties are also beginning to write apps using EdgeWorkers, which opens up the intriguing future possibility of an app marketplace running on our edge platform.
A leading theme park operator implemented EdgeWorkers to help manage demand as it prepares to reopen parks while the coronavirus pandemic subsides.
A sports entertainment business has implemented geo-fencing using our EdgeKV data store together with EdgeWorkers to ensure that users access only content relevant to their location.
A home improvement retail chain and a global credit card company have put in place A/B testing logic at the edge to deliver fast, personalized user experiences.
We've enabled DevOps workflows for a global sportswear brand by managing canary releases, where only a targeted group of users can see a new experience.
We've also enabled a leading global manufacturer of PCs and other end user devices to authenticate its users at the edge, which improves performance and reduces its cloud costs.
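The geo-fencing pattern mentioned above can be sketched as a key-value lookup performed at the edge. In a real deployment the mapping would live in EdgeKV; here a plain in-memory Map stands in for it, and all names are illustrative.

```javascript
// Sketch of edge geo-fencing: map a country code (which the edge server
// derives from the request) to the content variant a user may see.
// A Map stands in for the EdgeKV data store; names are illustrative.
const regionCatalogs = new Map([
  ["US", "us-catalog"],
  ["DE", "eu-catalog"],
]);

function geoFence(countryCode) {
  // Unknown or unmapped locations fall back to a restricted default
  // rather than leaking region-locked content.
  return regionCatalogs.get(countryCode) ?? "global-restricted";
}
```

The same lookup-and-branch shape underlies the A/B testing and canary-release examples: the edge assigns the user to a bucket and serves the matching variant, so no round trip to origin is needed to personalize the response.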
And consider the global consumer electronics brand that experienced such a large spike in demand during a mega sales event that it had to halt the event. To protect its origin infrastructure from the anticipated traffic surge, the company had implemented a third-party waiting room solution. But fraudsters figured out how to skip the queue, which drove an unsustainable surge in origin traffic and crashed the big event.
To get back online, the merchant moved the third-party waiting room onto Akamai using our EdgeWorkers solution. And because the logic ran on our servers instead of in the client, bad actors could no longer circumvent it. This enabled the merchant to control the traffic allowed into the checkout flow, ensuring that the sales event went smoothly and its users had a great experience.
In a similar way, Akamai was called in to rescue government agencies whose COVID vaccination registration sites initially crashed under heavy load, leaving frustrated citizens who waited a long time only to be kicked out without a reservation. Today, our new Vaccine Edge solution is enabling more than two dozen public health agencies and major pharmacy chains around the world to deliver and secure COVID vaccine registration sites in the face of extraordinary flash crowds.
Capabilities like these explain why revenue from our Edge Applications portfolio, which includes our discretely billed Edge Computing solutions, grew 33% year over year in the first quarter of 2021.
Research firm Gartner predicts that by 2025, three-quarters of enterprise-generated data will be created and processed at the edge, outside centralized data centers and clouds. This could be one of the reasons why Gartner has said "the edge will eat the cloud." And it's another reason why we believe that as more computing happens at the edge rather than in the core, more customers will migrate to the edge, where Akamai is now and has been for more than 20 years.