Guest post originally published on Snapt’s blog by Iwan Price-Evans

We talk a lot about the importance of cloud computing and moving infrastructure into the cloud for better performance and availability on a global scale. However, with expectations for application performance continuing to rise and people connecting in more remote places than ever, some businesses need to do more than simply maintain a cloud presence – they need to consider the benefit of setting up a hosted edge environment.

A hosted edge can provide a massive reduction in latency, translating to a huge improvement to user experience, more conversions, more revenue, and better customer retention. However, the edge is often harder to secure than a core network, and more variable than the big public clouds. Can businesses enjoy the latency advantage of the edge with application security they can trust?

Why use a Public Cloud?

Public cloud providers like Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform make it easy for businesses to deploy and scale web applications. Compared with using in-house data centers for the same task, public clouds help businesses avoid large upfront hardware investments, scale resources up and down with demand, and pay only for the capacity they use.

As a result, businesses using the cloud can go to market faster and (depending on scale) save a lot of money.

For these reasons, cloud platforms have skyrocketed in popularity. However, they are not without their downsides. Their inherent centralization, while convenient and the source of incredible economies of scale, means that they have not prioritized geographic distribution. They do not have a data center presence in more remote regions. Even in the United States, cloud data centers are located mainly in large urban areas, leaving smaller towns, cities, and rural areas a long way from their nearest data center.

The cost of this centralized infrastructure is latency.

Understanding Latency

One of the biggest factors affecting internet performance is latency: the time required for data to flow from a user’s device to a host server and back again – the round trip.

Data transmission speeds are subject to fundamental limitations: radio waves through the air, or electrical pulses on a copper wire, or light traveling through a fiber optic cable cannot travel faster than the speed of light in their given medium. Where speed is limited, the time taken to complete a journey increases with distance.

In networking, greater distance usually adds further delay because of additional routing decisions and processing as data flows through each leg of its journey.

The greater the distance between the user’s device and the host server, the longer the latency. While there are exceptions (for example, a complicated routing path and session security protocol could make a short journey take more time than a long journey), in general, this rule holds true.
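To put rough numbers on this rule, here is a back-of-the-envelope sketch in Python. It assumes light in fiber travels at roughly two-thirds of its vacuum speed (about 200,000 km/s) and ignores routing and processing overhead, so it gives a lower bound rather than a real-world figure:

```python
# Back-of-the-envelope round-trip propagation delay over fiber.
# Assumes signals travel at roughly two-thirds the speed of light in a
# vacuum (~200,000 km/s) and ignores routing, queuing, and processing
# overhead, so real-world latency will be higher than this floor.

SPEED_IN_FIBER_KM_PER_MS = 200.0  # ~200,000 km/s = 200 km per millisecond

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip propagation time over a fiber path of distance_km."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS

# A user 50 km from an edge node vs. 2,000 km from a cloud region:
print(f"edge  (50 km):    {round_trip_ms(50):.1f} ms")   # ~0.5 ms
print(f"cloud (2,000 km): {round_trip_ms(2000):.1f} ms") # ~20.0 ms
```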

While we can pride ourselves on internet communication traversing the globe in fractions of a second, that delay can make a big difference to the responsiveness of the user experience. This is especially true for applications like gaming or streaming services, which require low latency to provide the right experience, or browsing an e-commerce store, where the user is likely to request a lot of pages in a short time.

Why use a Hosted Edge Environment instead of a Public Cloud?

Hosts naturally want to reduce latency for their users where possible. This means deploying applications and data geographically close to their users.

Even though the large public cloud providers are growing their data center presence every year, this proliferation is expensive and these cloud data centers are still far away from many remote places. Most businesses cannot afford to deploy their own data centers in remote locations to get a latency advantage of their own.

This is where a hosted edge environment comes in. In a hosted edge environment, applications can run at various distributed nodes around the globe, reducing the distance that data travels to the end-user and therefore the latency. These servers do not need to offer the functionality of a full data center and so it is easier and cheaper for different server hosts to set them up and distribute them to remote locations while connecting them with a centralized hosted edge provider.

Application developers deploy their applications and data to a centralized location and the hosted edge provider distributes these to the nodes close to the end-users. This allows them to reduce latency for end-users while maintaining central control and avoiding direct infrastructure costs.
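As a minimal sketch of the routing half of this model, the snippet below picks the edge node nearest to a user by great-circle distance. The node names and coordinates are hypothetical, and real providers typically steer traffic with anycast, DNS, or live latency measurements rather than raw distance:

```python
import math

# Hypothetical edge node locations (name, latitude, longitude). Real
# providers typically steer users with anycast, DNS, or live latency
# measurements rather than raw geographic distance.
EDGE_NODES = [
    ("nyc", 40.71, -74.01),
    ("denver", 39.74, -104.99),
    ("boise", 43.62, -116.21),
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points on Earth, in kilometers."""
    r = 6371  # mean Earth radius, km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_node(user_lat, user_lon):
    """Pick the edge node closest to the user."""
    return min(EDGE_NODES, key=lambda n: haversine_km(user_lat, user_lon, n[1], n[2]))

# A user in rural Idaho is served from Boise rather than a distant region.
print(nearest_node(44.07, -114.74)[0])  # -> boise
```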

Why use a Hosted Edge Environment instead of a Content Delivery Network?

A content delivery network (CDN) like Cloudflare will also distribute data to a network of nodes, where the data is cached and served to end users from a geographically close server, resulting in low latency. What makes a hosted edge environment different from a CDN?

A CDN can cache static data, but it cannot run applications at the edge. Everything from social media profile updates to e-commerce check-outs to cloud gaming requires an application to process data and instructions, not merely to cache static data. These use cases do not benefit much from a CDN; they would have to “call home” to the centralized cloud data center, removing the latency advantage that the CDN ought to provide.

By contrast, a hosted edge environment can run applications at the edge, meaning far fewer requests from end-users to centralized cloud data centers, and a massive latency advantage for use cases that go beyond static data.
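A simple latency-budget model makes the difference concrete. The round-trip times below are illustrative assumptions, not measurements from any particular network:

```python
# Illustrative latency budget: where a request is *processed* matters more
# than where static files are cached. Round-trip times are assumptions,
# not measurements.

EDGE_RTT_MS = 10     # user <-> nearby edge node
ORIGIN_RTT_MS = 120  # user <-> distant cloud origin

def cdn_static(requests: int) -> int:
    """Static assets served straight from the CDN cache at the edge."""
    return requests * EDGE_RTT_MS

def cdn_dynamic(requests: int) -> int:
    """Dynamic requests: the CDN node must still 'call home' to the origin."""
    return requests * (EDGE_RTT_MS + ORIGIN_RTT_MS)

def hosted_edge_dynamic(requests: int) -> int:
    """Dynamic requests handled by an application running at the edge."""
    return requests * EDGE_RTT_MS

for handler in (cdn_static, cdn_dynamic, hosted_edge_dynamic):
    print(f"{handler.__name__:20s} 20 requests -> {handler(20)} ms of round trips")
# cdn_static           20 requests -> 200 ms of round trips
# cdn_dynamic          20 requests -> 2600 ms of round trips
# hosted_edge_dynamic  20 requests -> 200 ms of round trips
```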

Hosted Edge using Vendor Agnostic Networks and Dynamic Workloads

Of course, just as it is unrealistic to expect cloud service providers to open data centers everywhere given the expense, it is logistically difficult for a hosted edge provider to open edge locations everywhere. The cost to a single provider of hosting servers in remote locations would likely wipe out any revenue and yield a negative return on investment (ROI).

Hosted edge providers with a vendor-agnostic architecture can forge agreements with many third-party server hosts in many remote locations to use their local infrastructure on-demand. Local server hosts can be profitable by making their server infrastructure accessible and billable to multiple hosted edge service providers, as opposed to one provider trying to carry the cost.

However, this architecture presents other problems. As the number of nodes and routes expands, traffic flows become more complicated and workload scheduling becomes more challenging. The risks of overloaded or idle infrastructure grow with complexity; either scenario is damaging to profitability and to the viability of the hosted edge model.

This is where technology like Section’s Adaptive Edge Engine is so useful. Section’s dynamic workload scheduling and traffic routing technology uses machine learning (ML) to forecast traffic and workload patterns; it then provisions appropriate edge resources across this wide network of edge infrastructure providers and routes traffic in response to real-time demands.
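Section does not publish the internals of the Adaptive Edge Engine, but the general shape of forecast-driven provisioning can be sketched with a toy moving-average model. All capacity and traffic figures here are hypothetical:

```python
import math
from collections import deque

# Toy forecast-driven provisioning: predict the next interval's traffic per
# node from a moving average, then size replicas to match. The capacity and
# traffic numbers are hypothetical, and this is only the general shape of
# the idea -- not Section's actual models.

REQS_PER_REPLICA = 500  # assumed requests/interval one app replica can serve

def forecast(history: deque) -> float:
    """Naive moving-average forecast of the next interval's request rate."""
    return sum(history) / len(history)

def replicas_needed(history: deque, headroom: float = 1.2) -> int:
    """Provision for the forecast plus 20% headroom, never below one replica."""
    predicted = forecast(history) * headroom
    return max(1, math.ceil(predicted / REQS_PER_REPLICA))

# Five intervals of rising traffic at one hypothetical node:
node_traffic = deque([1800, 2100, 2600, 3000, 3400], maxlen=5)
print(replicas_needed(node_traffic))  # -> 7, scaling up ahead of the trend
```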

How much can a Hosted Edge reduce latency?

This pays off big-time. In a test analyzing 5.1 million responses, Section showed that the hosted edge model drastically reduced latency and improved overall application performance. The table below, courtesy of Section, shows the latency in milliseconds for Section Edge and for typical cloud providers across a range of scenarios and measurements.

[Table: latency in milliseconds for Section Edge vs. typical cloud providers across a range of scenarios and measurements. Image source: Section.io]

Those are big differences between Section and the typical clouds. This shows that a hosted edge environment can significantly improve the user experience for businesses serving users far from the data centers maintained by the large public clouds. We live in a busy, global world and we need to take our services to every part of it; a hosted edge is a viable path towards this goal for many businesses.

What about Application Security and Data Security in a Hosted Edge?

There is an argument that storing and processing application data in multiple third-party networks is risky, because of the threat of data leaks, localized intrusions, and different security and compliance standards in different regions.

It’s understandable that many businesses still prefer to use their own edge infrastructure and swallow the enormous expense as the cost of securing their applications at the edge.

Other businesses are content to wait for the big cloud service providers like Amazon Web Services (AWS), Microsoft Azure, Google Cloud, and Oracle Cloud to proliferate data centers to their desired edge locations. The cloud providers cannot consistently achieve the same latency as Section showed, given their limited geographical presence, but some businesses must prioritize security over latency.

However, the hosted edge provides a solution to this dilemma. Since hosted edge infrastructure is designed to run applications at the edge, businesses using a hosted edge environment can deploy security applications at the edge too. For example, a web application firewall (WAF) running at the edge can secure critical applications and data, block threats and attacks, and prevent fraud, downtime, and compliance failures.
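As a concrete (if deliberately simplistic) illustration, the sketch below shows the kind of check a WAF performs at the edge: inspecting each request close to the user and rejecting obvious attack payloads before they ever reach the origin. A production WAF uses far more sophisticated detection than these toy patterns:

```python
import re

# Toy edge request filter: inspect traffic close to the user and reject
# obvious attack payloads before they ever reach the origin. These crude
# patterns are for illustration only -- a production WAF uses far richer
# signatures and behavioral analysis.

SUSPICIOUS_PATTERNS = [
    re.compile(r"union\s+select", re.IGNORECASE),  # naive SQL injection probe
    re.compile(r"<script\b", re.IGNORECASE),       # naive XSS probe
]

def allow_request(path: str, body: str = "") -> bool:
    """Return True if the request looks safe enough to forward upstream."""
    payload = f"{path} {body}"
    return not any(p.search(payload) for p in SUSPICIOUS_PATTERNS)

print(allow_request("/search?q=shoes"))                    # True: forwarded
print(allow_request("/search?q=1 UNION SELECT password"))  # False: blocked at the edge
```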

Use Snapt to go from Cloud to Edge

Snapt’s Nova WAF is designed for cloud-native use cases like edge deployment.

The Nova WAF is centrally managed and uses patented near-zero latency communications for immediate ADC node feedback and AI-driven security across any number of web application firewall (WAF) installations. The Nova WAF protects against bots, scrapers, data leaks, spammers, SQL injections, XSS attacks, denial of service, and much more.

Nova’s centralized architecture and resource-efficient dynamic WAF services make it a perfect solution for application security in a hosted edge environment. Section users can request the Nova WAF module from the Section marketplace.

By deploying Snapt Nova on Section’s Edge Compute Platform, you can combine the application security and intelligence of Snapt with the benefits of edge computing, elevating your edge security.