As you begin to compare data center options, you're probably going to hear a lot about latency — as in, nearly every provider promises the lowest latency in the industry.
Latency, in the simplest terms, is the length of the delay between your servers and the users calling up the information. No matter how much bandwidth you have, how close your data center is to the user, or how your servers are configured, there will always be some latency. Data must travel along fiber optic cables through a series of switches and between networks, and every hop adds a small delay.
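One way to build intuition for that delay is to time a round trip yourself. The sketch below measures the round-trip time of a single message over a throwaway echo server on localhost; it is an illustration of the measurement, not of real network conditions, since traffic never leaves your machine.

```python
import socket
import threading
import time

def echo_server(sock):
    """Accept one connection and echo a single message back."""
    conn, _ = sock.accept()
    conn.sendall(conn.recv(64))
    conn.close()

# Set up a throwaway echo server on localhost.
server = socket.socket()
server.bind(("127.0.0.1", 0))  # port 0 = let the OS pick a free port
server.listen(1)
threading.Thread(target=echo_server, args=(server,), daemon=True).start()

# Time one round trip: send a message, wait for the echo.
client = socket.create_connection(server.getsockname())
start = time.perf_counter()
client.sendall(b"ping")
client.recv(64)
rtt_ms = (time.perf_counter() - start) * 1000
client.close()

print(f"round-trip latency: {rtt_ms:.3f} ms")
```

Against a real data center, the same send-and-wait measurement would include propagation over fiber, switching, and routing, which is exactly the delay providers are promising to minimize.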
However, because that data moves through fiber at a substantial fraction of the speed of light, the entire journey takes a fraction of a second. The fastest networks can deliver data within 10 milliseconds, which is less time than it takes to blink. And networks keep getting faster. While the quickest systems today measure delays in nanoseconds, some technology companies are pushing to measure speed in picoseconds. For comparison's sake, consider that one picosecond is to a second as one second is to roughly 31,700 years. In other words, picoseconds are just about the smallest time measurement in practical use — and are imperceptible to people.
The race to improve data speeds by infinitesimal measurements might seem like needless posturing, but speed does matter. How much it matters, though, depends on your business.
Who Has the Need for That Much Speed?
Imagine that you’re a stockbroker based in New York making trades with Tokyo. You send orders to buy several thousand shares at a certain price — and in the fraction of a second it takes for the order to travel overseas, the stock price increases. That brief delay winds up costing your client thousands of dollars.
In the world of finance, milliseconds matter. Because stock prices can change many times per second, an extended delay between servers can mean the difference between profit and loss. However, it’s not just financial services that need the fastest connections possible. Online auctions, ticket retailers, and any other business that relies on split-second timing to sell items need the fastest possible data speeds in order to make profits and ensure happy customers.
Latency also matters to businesses that aren’t time-sensitive. Page loading speed is a factor in search engine rankings, with pages that take longer to load appearing lower in the results than others. In short, users expect pages to load fast — almost half of users expect pages to load in two seconds or less, and will click away from a site that takes longer than three seconds to load. In addition, research shows that a delay of as little as one second can reduce page views, conversions, and customer satisfaction. Even online retail giant Amazon has seen the effects of latency, finding that a delay of 100 milliseconds reduced sales by one percent.
Of course, the location and configuration of the data center are not the only factors that contribute to page loading speed. Plug-ins, graphics, animations, and other site elements can cause sluggish loading and lead customers to click away. That being said, while you might not need to worry about whether your page loads five milliseconds slower than your competitor’s does, you should pay attention to your data center’s latency claims.
What Influences Latency
Several factors contribute to a data center’s latency. Physical distance is one important factor to consider; the closer the center is to the end user, the faster the data transmissions will be. For this reason, most experts recommend that companies choose data centers relatively close to their headquarters or primary customer base, to reduce unnecessary lag time. That said, improved fiber-optic networks and faster data speeds are making location less of a concern than it once was.
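The distance factor has a hard physical floor: light in fiber travels at roughly two-thirds of its vacuum speed, so a given route can never beat a certain round-trip time no matter how good the hardware is. The sketch below computes that best-case figure; the refractive index and route distances are approximate values chosen for illustration.

```python
C_VACUUM_KM_S = 299_792   # speed of light in vacuum, km/s
FIBER_INDEX = 1.47        # typical refractive index of optical fiber (assumed)
fiber_speed = C_VACUUM_KM_S / FIBER_INDEX  # ≈ 204,000 km/s

def min_rtt_ms(distance_km: float) -> float:
    """Theoretical best-case round trip over fiber, ignoring
    switching, routing, and the extra length of real cable paths."""
    return 2 * distance_km / fiber_speed * 1000

# Rough great-circle distances, for illustration only.
for route, km in [("New York - Tokyo", 10_850), ("New York - Chicago", 1_150)]:
    print(f"{route}: at least {min_rtt_ms(km):.0f} ms round trip")
```

Real-world latency is always higher than this floor, since cables rarely follow great-circle paths and every switch and router adds delay — which is why the remaining gains come from hardware and routing rather than geography.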
With location largely out of the equation, there is a greater focus on hardware configurations and routers as a means to reduce latency. Investing in equipment that delivers maximum speed is a major, but necessary, expense when nanoseconds matter. The good news is that these expenses can be offset by the reduced costs that come with a wider choice of data center options. Instead of being limited to centers in major cities — which come with metropolitan-sized price tags — companies can choose a colocation center in a suburban or rural area that offers better service at a lower cost.
So the answer to the question of whether latency matters is not quite cut and dried. Yes, it does matter, but how much it matters depends on what you are trying to accomplish. As long as your pages load and your data transfers fast enough to meet user expectations, you can focus on other priorities — and choose the data center that works best for your company.