Technical solution to eliminate desync in single-player sessions

"
LogoOnPoE wrote:
Regarding making monster AI deterministic, I believe it already is. I noticed it by getting desynced while running into a room with mobs. Let's say you are outside a room with mobs. You run in, and then out, to lure them. But due to desync, you only run in on the client, not on the server. However, on the client the mobs do react to you running in, even if they stand still on the server, as witnessed when you try to attack them and get resynced into the room.
What you describe is essentially impossible. We don't trust the client, so if the character didn't move on the server, the character didn't move at all; we rely on the client for input, so if the player didn't issue a command on the client, the character didn't move on the server. What causes desync is primarily that the player's commands do not do the same thing on the client as they do on the server, and this is almost entirely a result of monsters being desynced (the only other cause is stun/freeze en route, which accounts for a distinct minority of desyncs). As I said earlier, if you pay attention to your desync teleports: every time you rubberband, a monster (or corpse) also rubberbands.

The whole point of a good simulation is that you don't notice it; it's so similar to the real thing that there's nothing to notice. It's very clear that the client has difficulty predicting the server gamestate; if it were good at predicting it, resynchronization wouldn't be such a big deal.
"
HellGauss wrote:
@Scrotie
You do not need to know the precise time at which input was entered; you need to know this time with respect to the beginning of the instance. This is not absolute time, but relative.
Irrelevant. Whether you are measuring precisely from a clock in Greenwich or measuring precisely from the creation of the instance, a timestamp is a timestamp: it still needs precision to be fully deterministic, and the process would still involve trusting the client. Actually, it's easier with the clock in Greenwich; given latency, how would the client and the server agree on when the instance was created?
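To illustrate the objection: with latency in the picture, the client can only estimate the server's "time since instance creation", typically via a round-trip measurement that assumes symmetric latency. A toy C++ sketch (hypothetical names, made-up numbers) of where the precision is lost:

#include <cstdio>

// Round-trip estimate: assume the one-way delay is rtt/2. That symmetry
// assumption is exactly where precision is lost.
double estimate_server_instance_time(double server_reported, double rtt) {
    return server_reported + rtt / 2.0;
}

int main() {
    // Toy numbers: 80 ms round trip, but actually 60 ms down, 20 ms up.
    double reported = 1.000;             // server: "instance time is 1.000 s"
    double estimate = estimate_server_instance_time(reported, 0.080);
    double actual   = reported + 0.060;  // the real one-way delay
    std::printf("estimate %.3f s vs actual %.3f s: off by %.0f ms\n",
                estimate, actual, (actual - estimate) * 1000.0);
}
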
"
monkuar wrote:
"
Rhys wrote:


2) Floating-point calculations will still cause desync. Suppose the client needs to perform a mathematical calculation, such as 37 / 13. The client determines the result to be 2.84615384. But when the server performs the same calculation, with the same values, the exact same bit-pattern inputs, it might instead return 2.84615385. The numbers are slightly different. Why would it do this? There are several reasons: Windows client vs. Linux servers. Different CPU architectures. Different optimizations compiling the servers vs. the client. Different system drivers. Whatever the cause, you now have a divergence point, where the simulation may or may not diverge, depending on how those values are used and rounded. In that example, rounding to an integer afterwards isn't a problem, but what if the two values were 7.499999 and 7.500001? One will round to 7 and the other will round to 8. Suddenly the tiny difference isn't so tiny, and so even legit players can end up desynced.
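A two-line illustration of the boundary effect Rhys describes, using the values from his example:

#include <cmath>
#include <cstdio>

int main() {
    // A 0.000002 disagreement, far below any gameplay-relevant scale...
    double on_client = 7.499999, on_server = 7.500001;
    // ...becomes a whole-unit disagreement once both sides round.
    std::printf("client: %.0f, server: %.0f\n",
                std::round(on_client), std::round(on_server));  // 7 vs 8
}
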


I'm nowhere near your development skill or whatnot. I'm just an amateur PHP coder, nothing special.

But... couldn't the client send over OS info (if Linux, or Windows, or XXX) and then the server calculate the floating-point numbers separately for each case? The server could have a different calculation for each branch, run it, and spit out the right number?

Just a thought I had, sorry in advance if it's stupid, rather say something than nothing! :)


That won't work; there is no deterministic way to determine the values of floating-point numbers when doing FP arithmetic, even if you know the client machine. The IEEE standard specifies how accurate the calculations are, but not what the results should be (i.e. the result should be between 7.0000001 and 7.0000010 for a value of 7, but the standard won't tell you what the actual number will be). It's not just OSes that control what the result of the calculation can be; it's also the CPU. Different CPUs will produce different results for FP calculations (note that this all depends on the compiler).

Long story short, FP calculations are completely nondeterministic in every sense of the word, and they are done this way for speed. The whole reason floating-point capability has been skyrocketing in the past two decades is that both GPUs and CPUs sacrifice accuracy for speed, and when FP arithmetic is used in the conventional way (like drawing a triangle on a screen), the human eye won't notice the difference between a triangle with a side of 35 pixels and one with a side of 34 pixels.

So it doesn't matter if you have machine X and machine Y with exactly the same specs and CPU: give them both the same FP calculation and they can still produce different results.
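Two concrete, reproducible instances of what deteego is gesturing at, in C++ (the particular values are illustrative, not from the thread): floating-point addition is not associative, so an optimizer that re-associates changes the result, and a fused multiply-add rounds once where separate operations round twice:

#include <cmath>
#include <cstdio>

int main() {
    // Re-association: an optimizer that rewrites (a+b)+c as a+(b+c)
    // changes the result bits, because FP addition is not associative.
    double a = 0.1, b = 0.2, c = 0.3;
    std::printf("(a+b)+c == a+(b+c)?  %s\n",
                ((a + b) + c) == (a + (b + c)) ? "yes" : "no");  // prints "no"

    // Contraction: fusing x*x+z into a single fma rounds once instead of
    // twice, so the bits differ again. Whether the compiler contracts the
    // plain expression below on its own is itself toolchain-dependent;
    // compile with -ffp-contract=off to force the two-rounding version.
    double x = 1.0 + std::ldexp(1.0, -27);     // 1 + 2^-27
    double z = -(1.0 + std::ldexp(1.0, -26));  // -(1 + 2^-26)
    std::printf("separate ops: %g, fused fma: %g\n",
                x * x + z, std::fma(x, x, z)); // 0 vs ~5.6e-17
}
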
"
how would the client and the server agree on when the instance was created?


Both the client and the server agree that the instance is created at t=0.
"
ScrotieMcB wrote:
"
LogoOnPoE wrote:
Regarding making monster AI deterministic, I believe it already is. I noticed it by getting desynced while running into a room with mobs. Let's say you are outside a room with mobs. You run in, and then out, to lure them. But due to desync, you only run in on the client, not on the server. However, on the client the mobs do react to you running in, even if they stand still on the server, as witnessed when you try to attack them and get resynced into the room.
What you describe is essentially impossible. We don't trust the client, so if the character didn't move on the server, the character didn't move at all; we rely on the client for input, so if the player didn't issue a command on the client, the character didn't move on the server. What causes desync is primarily that the player's commands do not do the same thing on the client as they do on the server, and this is almost entirely a result of monsters being desynced (the only other cause is stun/freeze en route, which accounts for a distinct minority of desyncs). As I said earlier, if you pay attention to your desync teleports: every time you rubberband, a monster (or corpse) also rubberbands.

The whole point of a good simulation is that you don't notice it; it's so similar to the real thing that there's nothing to notice. It's very clear that the client has difficulty predicting the server gamestate; if it were good at predicting it, resynchronization wouldn't be such a big deal.


Well, it's pretty easy to check whether I'm right or wrong. Not at home right now, but I will once I get home. Basically: play PoE and get to a room with mobs, with an open door. Stand outside the room so that the mobs don't react to you and don't move. Then pull out your network cable and run into the room. Did the mobs move? If they did, the client has their movement seed and can predict their actions; if not, it requires their response to be confirmed by the server.
"
LogoOnPoE wrote:


Scrotie, I did say same player input at the same time. No, I'm not forgetting that, and yes, I understand the risk of trusting client timestamps. I just don't think the impact of possible cheats would be that big, and I think the calculations for said cheats would be difficult enough, that it wouldn't be too severe. But then again, I don't race, and I prefer solo play, so I can see that this system might not (and given GGG's focus on races, would not) be acceptable for those who race, prefer party play, or care about ladders. However, this is a possible solution to desync, and I'd like its pros and cons understood before it is dismissed; as I said before, I feel there's some misunderstanding of Rhys's position.


The impact is fairly enormous, especially because GGG's deterministic emulation has to account for variance in latency, which in the real world can be massive. So if it trusts client timestamps within a certain range: if that range is too small, players can be flagged as cheating when it was just their internet acting up, and if the range is too large, it opens up a world of cheating possibilities, including letting players basically not die, by having the client claim the latency window is as high as possible (regardless of whether the client is actually experiencing delay).

Anyone who knows anything about security would tell you that this is a terrible idea, so, just as Scrotie said, drop it.
"
deteego wrote:

The impact is fairly enormous, especially because GGG's deterministic emulation has to account for variance in latency, which in the real world can be massive. So if it trusts client timestamps within a certain range: if that range is too small, players can be flagged as cheating when it was just their internet acting up, and if the range is too large, it opens up a world of cheating possibilities, including letting players basically not die, by having the client claim the latency window is as high as possible (regardless of whether the client is actually experiencing delay).

Anyone who knows anything about security would tell you that this is a terrible idea, so, just as Scrotie said, drop it.


You seem to misunderstand. In the proposed solution, latency has no effect on the deterministic simulation. Very high latency might delay the arrival of the next snapshot, but it has no effect on the deterministic simulation itself. As I said, I want people to understand the proposed solution before dismissing it, and your post suggests that not everyone does.
"
LogoOnPoE wrote:
"
deteego wrote:

The impact is fairly enormous, especially because GGG's deterministic emulation has to account for variance in latency, which in the real world can be massive. So if it trusts client timestamps within a certain range: if that range is too small, players can be flagged as cheating when it was just their internet acting up, and if the range is too large, it opens up a world of cheating possibilities, including letting players basically not die, by having the client claim the latency window is as high as possible (regardless of whether the client is actually experiencing delay).

Anyone who knows anything about security would tell you that this is a terrible idea, so, just as Scrotie said, drop it.


You seem to misunderstand. In the proposed solution, latency has no effect on the deterministic simulation. Very high latency might delay the arrival of the next snapshot, but it has no effect on the deterministic simulation itself. As I said, I want people to understand the proposed solution before dismissing it, and your post suggests that not everyone does.


No, it does have an impact on the eventual outcome; latency can change the actual outcome of what happens. And it's not just latency: the packet may never arrive at all, so what happens then? Is the server going to make an assumption about what happened?

I mean, what's going to happen if the snapshots vary between the server and the client due to latency? If the server thinks there is no latency, its determined outcome may be that the exile is still alive, because he moved to dodge Vaal's lightning beam, whereas the client's snapshot would have him killed (remember, the actual game logic is still decided by the server; we are assuming here that the client is more or less able to recreate what the server is doing, to prevent desync as much as possible). Anything that involves time will change the determined outcome, and time cannot be determined.

The sane solution, if the snapshots differ about what happened (and they WILL differ once you take latency into account), would be to say the server is correct and resync, and not just trust the client.
That's a lot of pages. What I don't understand in the OP is how this deterministic seed is supposed to be secure. If the server sends it to my client, then it reached my machine via a packet and is now sitting in RAM somewhere. You can read incoming packets, you can locate the seed in RAM, and you can trace back any code that references the seed. If anything, it just hands you a roadmap to everything.
"
deteego wrote:
No, it does have an impact on the eventual outcome; latency can change the actual outcome of what happens. And it's not just latency: the packet may never arrive at all, so what happens then? Is the server going to make an assumption about what happened?

I mean, what's going to happen if the snapshots vary between the server and the client due to latency? If the server thinks there is no latency, its determined outcome may be that the exile is still alive, because he moved to dodge Vaal's lightning beam, whereas the client's snapshot would have him killed (remember, the actual game logic is still decided by the server; we are assuming here that the client is more or less able to recreate what the server is doing, to prevent desync as much as possible). Anything that involves time will change the determined outcome, and time cannot be determined.

The sane solution, if the snapshots differ about what happened (and they WILL differ once you take latency into account), would be to say the server is correct and resync, and not just trust the client.


You misunderstand the proposed solution. Let's leave aside the case where the packet never arrives; I replied to that problem some pages ago, and it's a corner case.

Now, what happens normally:
There are two machines that run the game identically, based on player input and a shared random seed, which is constantly updated (by a different process on the server machine, or by a third machine; it doesn't matter).
The client machine captures player input and the time that has passed since the start of the simulation, and with that input runs the simulation for a determined amount of time. After that amount of time, it gathers the player actions, along with the results of those actions, into a snapshot and sends it to the server machine.
The server machine, upon receiving the snapshot, plays the player actions at identical times from the start of its own simulation, and compares the results of the actions in its simulation with the results of the actions in the client machine's simulation. Since the simulation is deterministic, and the seed, player input, and timing from the start of the simulation are identical, the results will match unless the client attempts to cheat.

Basically, the server simulation runs in delayed lockstep with the client simulation, and it does not influence the client simulation unless it detects an error in the client.
"
Kilgamesh wrote:
That's a lot of pages. What I don't understand in the OP is how this deterministic seed is supposed to be secure. If the server sends it to my client, then it reached my machine via a packet and is now sitting in RAM somewhere. You can read incoming packets, you can locate the seed in RAM, and you can trace back any code that references the seed. If anything, it just hands you a roadmap to everything.


The seed is not secure; however, it changes every short, predetermined interval, let's say every second, so by reading the seed you can only predict random rolls for one second of gameplay. And since you need to respond within a predefined, short amount of time, it would hopefully be difficult enough to calculate and evaluate possible beneficial outcomes that you couldn't gain a significant advantage. Also, please note that item drop seeds are not sent to the client; they are handled by the server.
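One way such a rotating seed could be derived, sketched in C++ (a hypothetical scheme, not GGG's): each second's seed comes from mixing a server-held master secret with the interval number, so a seed read out of client RAM predicts only that one interval. splitmix64 is a real, well-known mixing function, but a cryptographic hash would be needed for actual security.

#include <cstdint>
#include <cstdio>

// splitmix64: a standard 64-bit mixing function, here standing in for a
// proper keyed cryptographic hash.
std::uint64_t splitmix64(std::uint64_t x) {
    x += 0x9E3779B97F4A7C15ULL;
    x = (x ^ (x >> 30)) * 0xBF58476D1CE4E5B9ULL;
    x = (x ^ (x >> 27)) * 0x94D049BB133111EBULL;
    return x ^ (x >> 31);
}

// The server keeps `master` secret and sends the client only the seed for
// the current one-second bucket; knowing one seed does not reveal the next.
std::uint64_t seed_for(std::uint64_t master, std::uint64_t second_bucket) {
    return splitmix64(master ^ splitmix64(second_bucket));
}

int main() {
    const std::uint64_t master = 0xDEADBEEFCAFEF00DULL;  // never leaves the server
    for (std::uint64_t s = 0; s < 3; ++s)
        std::printf("second %llu -> seed %016llx\n",
                    static_cast<unsigned long long>(s),
                    static_cast<unsigned long long>(seed_for(master, s)));
}
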
