Nigel Pereira gives us the rundown on how the geographical location of a data center can affect everything the end user experiences, from speed to latency


When setting up or renting data center space, most people worry about RAM, storage, and processing power, but an often-overlooked factor is the actual geographical location of the center.

This is because end users only care about two things, and RAM, storage, and processing power aren’t on the list: speed and latency are the two factors that directly impact the end-user experience.

If your customers are in India and your data center is in Australia, for example, then every time a customer clicks on or interacts with your app or website, that request has to travel all the way to Australia and back.
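To put a rough number on that, here is a minimal back-of-the-envelope sketch in Python. The 10,000 km path length and the assumption that light travels at roughly two-thirds of its vacuum speed in fiber are illustrative figures, not measurements of any real route, and real-world routing, queuing, and server processing only add to the total:

# Rough physics-only estimate of the India-to-Australia round trip.
# The 10,000 km path and the two-thirds-of-c fiber speed are assumptions
# for illustration; real routes add hops, queuing, and server time.
SPEED_OF_LIGHT_KM_S = 300_000
FIBER_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * 2 / 3   # light slows down in glass fiber
PATH_KM = 10_000                                  # assumed one-way cable distance

one_way_ms = PATH_KM / FIBER_SPEED_KM_S * 1_000
round_trip_ms = 2 * one_way_ms
print(f"Best-case round trip: {round_trip_ms:.0f} ms")   # roughly 100 ms

Even in that best case, the request spends on the order of 100 ms in transit before the data center has done any actual work, and that is exactly the overhead a closer facility removes.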

There are levels to this game

If you missed our previous post on data center tiers, you can find it here, but the bottom line is that all data centers are not equal.

Using the same example as above, a tier 1 data center is not going to outperform a tier 4 data center irrespective of geographical location, because there is no substitute for good equipment and redundancy.

That being said, on a level playing field between two equally equipped data centers, the one closer to the customers is going to provide a better end-user experience in terms of speed and latency.


Another advantage of having your data center near your customers is that site speed and page-loading times are both important criteria for search engines when they rank websites and apps. Everyone who has ever used the internet has come across websites that are just too slow and take forever to respond.

While those websites may be hosted on servers that are far, far away, the chances of you returning to them for anything at all are low to zero. In today’s age of 5G speeds and buffer-free YouTube and Netflix, everyone expects a certain level of service.

The relationship between location, latency, and lag


Latency is basically the delay between a user’s input, be it a click, a scroll, a swipe, or even some text, and your data center’s response. It is usually measured in milliseconds (ms) and expressed as Round Trip Time (RTT), the time it takes for a packet of data to travel from the client to the server and back again.

Obviously, the quicker the better, so less is more in terms of latency. When it comes to gaming in particular, anything below 100ms is considered “playable,” but what you really want is something between 20 and 45ms.
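If you want to see what your own users experience, one quick and approximate way to measure RTT is to time a TCP handshake, which takes roughly one round trip. The Python sketch below does exactly that; the host example.com and port 443 are placeholders, and dedicated tools like ping or mtr will give you more detailed numbers:

# Approximate RTT check: time how long one TCP handshake to a server takes.
# "example.com" and port 443 are placeholders; swap in your own host.
import socket
import time

def measure_rtt(host: str, port: int = 443) -> float:
    """Time one TCP connect to the host and return it in milliseconds."""
    start = time.perf_counter()
    # The handshake itself (SYN / SYN-ACK) is what we are timing.
    with socket.create_connection((host, port), timeout=5):
        pass
    return (time.perf_counter() - start) * 1_000

samples = [measure_rtt("example.com") for _ in range(5)]   # placeholder host
print("RTT samples (ms):", [round(s, 1) for s in samples])
print(f"Average RTT: {sum(samples) / len(samples):.1f} ms")

A handshake to a nearby, well-connected server will typically land in the tens of milliseconds; one to a server on the other side of the planet usually won’t, for exactly the distance reasons above.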

This is because in competitive real-time online multiplayer games like CS:GO, Quake, World of Warcraft, or even PUBG, lower latency literally means you see the enemy before they see you. A common term that’s also used as an excuse when people lose or “die” in such games is lag.

You may have heard the phrase “I’m lagging” or “I lagged,” which basically means “I did what I was supposed to, but the server didn’t respond quickly enough.” While latency is the measured time it takes the server to respond, lag is the delay a player actually notices between their input and the server’s response, and it’s not a good thing.

The cutting Edge


Now while we can control the geographical locations of our data centers, one thing we absolutely cannot control is the geographical locations of our customers. For example, you may have set up a website or app in India that suddenly gains popularity among Indians in the US.

To deliver a better user experience to your new-found market, you might be considering the very expensive option of setting up a new data center in the US.

This is where edge data centers come in, offering a way to move your data as close to your customers as possible without breaking the bank.

Edge computing is all about processing as much data as possible at the “edge” of the network or as close to the end-users as possible. This is done in order to reduce both latency and lag and provide an end-user experience that is unlike anything we have seen in the past.

Edge computing is critical to the upcoming 5G infrastructure as well as time-sensitive applications like medical technology, defense equipment, aeronautics, autonomous vehicles, satellites, and the like. Unlike traditional server farms, edge data centers have a relatively small footprint and focus on cached content and cloud resources.

Distance isn’t dead yet

While distance may seem to matter less now that people work from home and everything from groceries to movies is available online, it isn’t quite dead yet.

As far as latency and speed are concerned, location is still a deciding factor.

That isn’t to say, however, that a day won’t come when it no longer matters whether your data center is in Alaska or the Arctic because users around the world will get blistering speeds either way.


With a background in Linux system administration, Nigel Pereira began his career with Symantec Antivirus Tech Support. He has now been a technology journalist for over 6 years and his interests lie in Cloud Computing, DevOps, AI, and enterprise technologies.
