Online game servers use server-side digital tricks to help level the playing field for connected players. Here’s how it works.
The illusion of real-time online gaming is that each player feels like they're playing in the present, when they are actually interacting with all other connected players in the recent past, because player input has to be compared and verified against the authoritative server state.
This is true of every player in a client-server online game.
One of the biggest factors that can throw off this balance is latency – the time it takes for data to be sent and then received between a client device (in this case, a gaming console or PC) and a hosting server.
To help combat the different latencies of various player connections, online gaming servers often use a system called ‘lag compensation’ to level the playing field and push the illusion of real-time gaming through an interesting method of rewinding time.
Without lag compensation's rewind trick, each player would be forced to lead their shots in online shooters, factoring in their own latency to ensure that shots landed on players moving across their aim.
Let’s take a closer look at how this system works.
Every action that is performed by a player in an online game takes time to make it to the server. The server must then interpret this data and send it to every other player in the game.
The speed at which data is sent, interpreted, and sent again is usually fast enough that nobody notices, but a number of factors can arise to cause a player to experience high latency.
If one player decides to run between two doors on either side of a hallway, only appearing in the open for a moment, high latency may, depending on a number of factors, cause there to be three different states of that player’s character.
In theory, the player doing the running may see themselves safely through the door on the other side. At the same moment, the server may see them right in the middle of the hallway. And on your machine, they may barely have stepped out into the open to begin their wild dash.
Which state is correct?
The above may be an extreme example – the difference between positions is rarely this large – but it serves to highlight where lag compensation can step in. If you were to fire at the character on your screen, there's a good chance it would still register as a hit, even if that character was safely tucked away on the other side of the corridor according to their own player's screen, or even according to the server.
As you give the command to fire, your device puts a time stamp on that command. The server reads the time stamp, rewinds to that instant and analyses it, even though it has already passed. If your shot would have landed then, the server jumps back to the present and registers the hit, no matter where the other player might be now.
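On the server side, this rewind typically relies on keeping a short buffer of recent position snapshots for every player. Here is a minimal sketch in Python of what that history-and-rewind bookkeeping could look like – the class and method names are hypothetical, and real servers usually interpolate between snapshots rather than picking the nearest one:

```python
import bisect
from dataclasses import dataclass


@dataclass
class Snapshot:
    timestamp: float  # server time, in seconds
    position: tuple   # (x, y) of the target player at that time


class LagCompensator:
    """Keeps a short history of a player's positions so the server can
    'rewind' to the instant a shot was fired and check it there."""

    def __init__(self, history_window=1.0):
        self.history = []                  # snapshots, oldest first
        self.history_window = history_window  # how far back we can rewind

    def record(self, snapshot):
        """Store a new snapshot and drop any older than the rewind window."""
        self.history.append(snapshot)
        cutoff = snapshot.timestamp - self.history_window
        while self.history and self.history[0].timestamp < cutoff:
            self.history.pop(0)

    def position_at(self, timestamp):
        """Return the recorded position nearest to the given timestamp."""
        if not self.history:
            return None
        times = [s.timestamp for s in self.history]
        i = bisect.bisect_left(times, timestamp)
        if i == 0:
            return self.history[0].position
        if i == len(self.history):
            return self.history[-1].position
        before, after = self.history[i - 1], self.history[i]
        # Pick whichever snapshot is nearer in time; a production server
        # would interpolate between the two instead.
        if timestamp - before.timestamp <= after.timestamp - timestamp:
            return before.position
        return after.position
```

To validate a shot, the server would call `position_at()` with the shot's time stamp and run its hit test against that rewound position instead of the current one.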
On the player side of things, client-side prediction anticipates and renders certain player (client) input to compensate for the time it takes for data packets to travel back and forth between player and server.
In short, if another player is walking in a straight line, your computer might assume that they are still walking in a straight line, displaying this on your screen, until it’s told otherwise by the server. This is client-side prediction.
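That "assume they're still walking in a straight line" behaviour is often called dead reckoning, and a minimal version can be sketched in a few lines of Python – the function name and 2D coordinates here are illustrative, not any particular engine's API:

```python
def extrapolate_position(last_position, velocity, time_since_update):
    """Dead reckoning: keep moving a remote player along their last known
    velocity until a fresh server update corrects the guess."""
    x, y = last_position
    vx, vy = velocity
    return (x + vx * time_since_update, y + vy * time_since_update)
```

The client would run this every frame for each remote player, then snap (or smoothly blend) to the authoritative position whenever the next server update arrives.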
Recent-history rewinding is an unfortunate but necessary reality of online games that use lag compensation: all player inputs are continuously sent to the server, and the server continuously sends data back to players about whether each input has been accepted or rejected.
Most of the time, basic inputs such as moving, crouching, jumping, or running are accepted as true by the server, which is why real-time games feel similar to playing single-player games.
An online game that uses lag compensation should feel like it’s playing normally, in real time, until player inputs disagree.
When player input intersects and clashes, this is when the server has to accept one player’s input as true and reject the others as false.
This is where access to fast broadband can help, as online game servers using lag compensation may act on a first-come, first-served basis during these clashes.
A lower-latency player's data will reach the server faster than a higher-latency player's, which means the lower-latency client is more likely to have the advantage of their input being received by the server first.
A player’s latency in an online game server is representative of the round-trip time for data to be sent to and received back from the server, also sometimes called ‘ping’.
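Ping is typically measured by stamping an outgoing packet and timing its echo. A simplified Python sketch, with the actual network send and receive left as placeholder callbacks (real games fold this into their existing packet traffic rather than sending dedicated pings every time):

```python
import time


def measure_ping(send_fn, recv_fn):
    """Round-trip time in milliseconds: note the clock before sending,
    wait for the echo, and measure the elapsed time."""
    sent_at = time.monotonic()
    send_fn(b"ping")
    recv_fn()  # blocks until the server's echo arrives
    return (time.monotonic() - sent_at) * 1000.0
```

Because it uses `time.monotonic()`, the measurement is unaffected by system clock adjustments mid-flight.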
When higher-latency players interact with lower-latency players, the lower-latency player can have their input noticeably impacted because of the time it takes for the higher-latency player’s input data to be sent to the server.
The higher the latency, the longer it takes for player input to reach a game server and for the response to come back.
At a certain point, game servers that use lag compensation will reject input from high-latency players (for instance at 1,000ms, or one full second).
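That cutoff can be expressed as a simple check on how far back the server would have to rewind for a given input. The sketch below uses the one-second figure from the example above; the exact threshold is an assumption and varies per game:

```python
MAX_REWIND_MS = 1000  # assumed cutoff: one full second of rewind


def should_accept(input_timestamp_ms, server_time_ms, max_rewind_ms=MAX_REWIND_MS):
    """Reject input that would require rewinding further back than the
    server keeps history for. Also rejects inputs stamped in the future,
    which indicate clock skew or tampering."""
    age_ms = server_time_ms - input_timestamp_ms
    return 0 <= age_ms <= max_rewind_ms
```

Anything outside the window is simply dropped, so an extremely laggy player's shots stop landing rather than teleporting targets a second into the past.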
It is worth noting that overall responsiveness – in other words, the time between a player's input and receiving feedback in the online game world – is also affected by a number of other factors, including client or server CPU load, client update rate, and server tick rate, to name a few.
Network latency isn't the only concern when it comes to preserving a real-time online experience, either, because latency can also be introduced by player equipment, such as monitors and keyboards, or by whether a player is using an Ethernet or wi-fi connection.
Without lag compensation, the real-time illusion of online gaming would be shattered for every player connected to the server – even though the system typically deals in mere milliseconds of recent-history data.