Don’t let a traffic spike become a Sword of Damocles

Managing traffic peaks with cloud caching

In digital transformation programmes, the number of website visitors has become a popular key performance indicator (KPI) for measuring brand engagement. But the power associated with a traffic spike can quickly turn into a perilous “Sword of Damocles” if the website is not prepared. Cloud caching technologies, which require little upfront investment, offer a solution.

In the world of e-commerce, website traffic is the modern version of ‘footfall’, and every company wants to get as much of it as possible. Indeed, the number of site visitors is usually a top KPI. Given that, one would assume that websites would all be prepared for big hits of incoming traffic. Far from it!

Every year, recurring events such as big National Lottery jackpots, the Grand National races and Glastonbury ticket sales all lead to crashing sites and frustrated users venting their anger on social media and in the comment fields of online newspaper articles.

Unforeseen traffic spikes are even more complicated to handle. When the Duchess of Cambridge wore a dress from fashion retailer Reiss during a meeting with the Obamas in May 2011, Reiss’s website crashed for two and a half hours.

Caching technology as a solution

Fortunately, there is a technical solution to ensure that websites perform even at times of peak traffic, foreseen or unforeseen: intelligent web caching.

A web cache contains copies of all the content of a website. It sits in front of a company’s backend server farm, where it intercepts web requests and delivers copies of content to visitors. This means the backend doesn’t have to regenerate the same content for every request, work that can slow a site down or even crash it when many visitors arrive at the same time. Caching is therefore critical for managing traffic spikes efficiently, and it improves overall website performance.
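To make the idea concrete, here is a minimal, illustrative sketch of a caching layer sitting in front of a backend. The backend address, port and time-to-live are assumptions made for the example; production web caches such as Varnish or a CDN handle invalidation, headers and concurrency far more thoroughly.

```python
# Minimal sketch of a caching reverse proxy: it serves cached copies of GET
# responses so the backend is not hit again for every identical request.
# Assumes a backend running at http://localhost:8080 (hypothetical).
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

BACKEND = "http://localhost:8080"   # assumed backend address
TTL_SECONDS = 30                    # how long a cached copy stays fresh
cache = {}                          # path -> (expiry_time, body); simplified, not thread-safe

class CachingProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        entry = cache.get(self.path)
        if entry and entry[0] > time.time():
            body = entry[1]         # cache hit: the backend is not contacted
        else:
            with urllib.request.urlopen(BACKEND + self.path) as resp:
                body = resp.read()  # cache miss: fetch once from the backend
            cache[self.path] = (time.time() + TTL_SECONDS, body)
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # The cache listens on port 8000 and answers visitors on the backend's behalf.
    ThreadingHTTPServer(("", 8000), CachingProxy).serve_forever()
```

However many visitors request the same page within the TTL window, the backend only produces it once; that is the effect that keeps a site responsive during a spike.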

The alternative – adding infrastructure and caching servers to support traffic peaks that might happen – is inefficient and expensive, and potentially conflicts with other business KPIs around reducing cost and carbon footprint. Fortunately, most caching solutions can be used either on-premise or in the cloud, making big infrastructure investments unnecessary.

Cloud caching = flexible web scaling

Cloud caching enables companies to choose when to scale capacity up or down depending on predicted traffic patterns, such as last-minute bets on the Grand National. If unpredicted peaks occur, as in the Reiss scenario mentioned above, no hardware investment is needed – companies simply scale up and pay for the infrastructure they actually use during the peak.

However, there are scenarios for which a cloud-only deployment is not the right choice. One is when a company needs to maintain hands-on, local control for security or data-handling reasons. There are also certain operational tasks that require a more flexible approach. For example, a US-based global retailer might only have data centres in Nevada and California but, for regulatory compliance reasons, be required to serve content from other locations in Europe.

In these scenarios, a company doesn’t need to give up intelligent cloud caching entirely; it can opt for a hybrid deployment in which the caching solution is installed on-premise and complemented by the cloud version. The advantage of such an architecture is that traffic can be dynamically directed between on-premise and the cloud, depending on actual traffic needs.
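As a simple illustration of that routing idea, the decision can be reduced to a load check: keep traffic on the on-premise cache while it has headroom, and spill overflow to the cloud tier during a spike. The threshold, tier names and function below are hypothetical, not any particular vendor’s API.

```python
# Hypothetical sketch of hybrid routing between an on-premise cache and a
# cloud cache tier. The capacity figure is an illustrative assumption.
ON_PREM_CAPACITY_RPS = 5000   # assumed on-premise ceiling (requests per second)

def choose_tier(current_rps: int, requires_local_handling: bool) -> str:
    """Return which cache tier should serve the next request."""
    if requires_local_handling:              # e.g. regulated data must stay local
        return "on-premise"
    if current_rps < ON_PREM_CAPACITY_RPS:
        return "on-premise"                  # normal load: keep traffic in-house
    return "cloud"                           # spike: overflow to elastic cloud capacity

print(choose_tier(1200, False))   # normal load -> on-premise
print(choose_tier(9000, False))   # traffic spike -> cloud
```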

If there is one thing that’s certain in our uncertain times it’s that traffic peaks can happen to any business at any time. Companies of all sizes require greater flexibility and more cost-efficient solutions to protect their sites, apps and CDNs while ensuring the best possible user experience. Deploying caching in the cloud or as a hybrid model is the most cost-efficient way to ensure that traffic spikes turn into revenue and loyalty, not a Sword of Damocles.
