
Security's Role in Internet Resilience

Written by Rich Salz

December 09, 2021

Rich Salz is a Principal Architect in Architecture & Technology Strategy at Akamai. He has been involved in the definition and implementation of internet and security standards for more than 20 years, currently with the IETF and the QuicTLS toolkit. At Akamai, he works on making systems and customers more secure by default.

One aspect of resilience on the internet is that things — notably servers and resources — move around. Sometimes moves are legitimate, such as when a popular site evolves from hosting its own website to moving to a cloud provider to using a CDN to handle the ever-increasing traffic. Sometimes the moves are not legitimate, such as when an attacker pretends to be an ecommerce or banking site and steals a user's credentials upon login. How can the end user tell the difference between legitimate and not-so-legitimate moves?

One answer to this question is Transport Layer Security, known as TLS (previously known as SSL/TLS, but nobody should be using SSL any more). Think of it as the "s" in https URLs. TLS works by having a third party, known as a certificate authority (CA), digitally sign a data blob that includes the server name and a "key" that can be used to communicate with that server. Browsers have a built-in list of CAs that they trust, which generally includes those that follow issuance and verification guidelines from the CA/Browser Forum.
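
To make that concrete, here is a minimal sketch, using only Python's standard library and an example hostname, of what a browser effectively does: open a TLS connection, let the library verify the certificate chain against the built-in list of trusted CAs and check the server name, and then read back who the certificate was issued to and by.

```python
# Minimal sketch of certificate checking during a TLS handshake.
import socket
import ssl

def fetch_certificate(hostname: str, port: int = 443) -> dict:
    """Open a TLS connection and return the validated server certificate."""
    context = ssl.create_default_context()  # loads the platform's trusted CA list
    with socket.create_connection((hostname, port)) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            # wrap_socket() has already verified the CA signature and the hostname;
            # it raises ssl.SSLCertVerificationError if either check fails.
            return tls.getpeercert()

if __name__ == "__main__":
    cert = fetch_certificate("www.akamai.com")  # example hostname
    print("Subject:", cert["subject"])
    print("Issuer: ", cert["issuer"])
    print("Expires:", cert["notAfter"])
```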

One of the internet's most popular CAs is Let’s Encrypt. It is novel because its certificates are free and are good for only 90 days. The Internet Security Research Group, which runs Let’s Encrypt, started this because they are a nonprofit organization and they want to "encrypt the whole web." They have been very successful at this. One measure of that success is that browser plug-ins that tried TLS first (such as the Electronic Frontier Foundation’s HTTPS Everywhere) are being withdrawn because most sites are now https sites, not http sites. Akamai recognized the importance of TLS for everyday use, and we were proud to be one of the founding sponsors of Let’s Encrypt.

Using TLS, and the browser's list of trusted CAs, can protect the user. But sometimes the websites need to be protected from the users.

A bot (from the word robot) is a program running on an unsuspecting user's computer. Hundreds or thousands of these make up a botnet, and their command-and-control servers tell the infected machines what to do. For example, they could try to flood an attack target with traffic, leading to a denial-of-service (DoS) attack. Akamai can filter this traffic before it reaches the origin, so that valid traffic gets through and DoS traffic is ignored. Our Prolexic service provides this kind of protection.
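
To illustrate the basic idea of separating flood traffic from normal requests, here is a toy sketch in Python. It is not how Prolexic works; it simply applies a hypothetical per-client rate limit over a short sliding window and drops whatever exceeds it, so that ordinary traffic still reaches the origin.

```python
# Toy illustration only: a per-client sliding-window rate limit.
# The window length and threshold are hypothetical values.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 10
MAX_REQUESTS_PER_WINDOW = 100

_recent = defaultdict(deque)  # client address -> timestamps of recent requests

def allow_request(client_addr: str) -> bool:
    """Return True if this request should be forwarded to the origin."""
    now = time.monotonic()
    window = _recent[client_addr]
    # Discard timestamps that have aged out of the window.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    if len(window) >= MAX_REQUESTS_PER_WINDOW:
        return False  # looks like flood traffic; drop it
    window.append(now)
    return True
```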

Sometimes an attacker doesn't need hundreds of bots; a few may be enough to accomplish the goal. For example, an ecommerce site may offer a limited number of one particular item, such as a branded sneaker. The bots will flood the site to buy up all the inventory, hoping to profit by reselling it later online. (You'd be surprised at how much money can be made by doing this, and how important it is to the vendors that it be stopped; I know I was.) Our bot detection security products use a variety of techniques to identify this kind of activity, and only allow legitimate human consumers through.
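
As one purely illustrative example of such a technique (not a description of Akamai's products), a detector can look at request timing: simple scripts tend to fire requests at very regular intervals, while human shoppers do not. The threshold below is hypothetical.

```python
# Purely illustrative heuristic: flag clients whose inter-request gaps are
# suspiciously uniform. Real bot detection combines many signals.
import statistics

def looks_automated(request_times: list[float], min_requests: int = 5) -> bool:
    """Guess whether a sequence of request timestamps came from a script."""
    if len(request_times) < min_requests:
        return False
    gaps = [later - earlier for earlier, later in zip(request_times, request_times[1:])]
    # A very small spread in the gaps hints at machine-generated traffic.
    return statistics.pstdev(gaps) < 0.05  # hypothetical threshold, in seconds

print(looks_automated([0.0, 0.5, 1.0, 1.5, 2.0, 2.5]))  # evenly spaced -> True
print(looks_automated([0.0, 1.3, 1.9, 4.2, 5.0, 7.7]))  # irregular     -> False
```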

Going a level deeper, cryptography makes most of this possible. One way to look at this is to group things by algorithm, such as RSA, Elliptic Curve, AES, and so on, and by key size, which is measured in bits. Years ago, a 512-bit RSA key was acceptable; now anything smaller than 2048 bits is considered weak. The bigger the key size, the longer it takes to compute results. Using bigger keys means an attacker has more work to do, but it also means that legitimate uses, such as making a TLS connection, take more CPU effort. Both users and security experts prefer Elliptic Curve because they get the same attack protection with a much smaller key. Cryptographic agility lets protocols, such as TLS or a digital signature standard, indicate which algorithms they are using, and provides "space" to change to new algorithms when needed.
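
To make the size difference concrete, here is a small sketch using the third-party Python "cryptography" package (an assumption; any crypto library would do). It generates a 2048-bit RSA key and a P-256 elliptic curve key, which offer roughly comparable protection against classical attacks, and prints the size of each public key.

```python
# Sketch: compare the size of an RSA-2048 public key with a P-256 EC public key.
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import ec, rsa

def public_key_pem_bytes(private_key) -> int:
    """Serialize the public half of a key and return its length in bytes."""
    pem = private_key.public_key().public_bytes(
        encoding=serialization.Encoding.PEM,
        format=serialization.PublicFormat.SubjectPublicKeyInfo,
    )
    return len(pem)

rsa_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
ec_key = ec.generate_private_key(ec.SECP256R1())  # the P-256 curve

print("RSA-2048 public key:", public_key_pem_bytes(rsa_key), "bytes (PEM)")
print("P-256 EC public key:", public_key_pem_bytes(ec_key), "bytes (PEM)")
```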

One likely change in the upcoming decade will be the move to post-quantum cryptography algorithms. A quantum computer is a new type of computer that could break the public-key algorithms, such as RSA and Elliptic Curve, that protect today's long-term keys. So far, these computers can only factor numbers like 35, not the hundreds of digits in RSA keys. But the industry is preparing for this now: The U.S. National Institute of Standards and Technology (NIST) is running an international competition over several years to find good algorithms that would be safe from quantum computers once they become practical. We are watching these events, and will be sure to keep our customers' secrets safe by using post-quantum algorithms before they become necessary.


