Desync and why it's fixable

"
mehpwned wrote:
I'd say the core of desync isn't necessarily a client<->server architectural issue. There will always be an authoritative game state in multiplayer games, but desync isn't this bad in every game.

Even 200+ ms of ping does not account for the 3 seconds of desync that happens. It also seems to happen more during peak hours. It looks more like a physics/data-processing issue and less like a latency-related thing.

There may just be a bottleneck somewhere in their server's engine. It's very hard to multi-thread physics-related events in game engines, since they're mainly atomic operations. But you can always get more processing power.

The problem I see is that the server is choking on frames.

I do realize there is no complete solution for desync. But I do believe the amount and extent to which we desync is fixable.

Does anyone know what hardware they are running?



Where do you see the server choking on frames?

All I can say definitively is that I see low-frequency, low-bandwidth output; that doesn't really say anything about how the server processes things.

Prediction accounts for the 3 seconds of desync that happens. The reality is not that it's 3 seconds of desync; it's 3 seconds to a resync. You're pretty much always desynced, but the state continually diverges and the client keeps trying to correct for prediction fucking up.
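
To put that in code terms, here's a toy loop (all the numbers are invented, and this is obviously nothing like GGG's actual implementation) showing what I mean by "always desynced, continually converging": the client's predicted state drifts between snapshots, and each authoritative update just starts the next divergence.

import random

server_pos = 0.0
client_pos = 0.0
SNAPSHOT_EVERY = 10  # hypothetical: server sends authoritative state every 10 ticks

for tick in range(1, 31):
    velocity = 1.0
    server_pos += velocity                                # true state on the server
    client_pos += velocity + random.uniform(-0.05, 0.05)  # predicted state, slightly off

    if tick % SNAPSHOT_EVERY == 0:
        # "Resync": the client snaps to the authoritative state, i.e. a rubberband.
        print(f"tick {tick}: error before resync = {abs(client_pos - server_pos):.3f}")
        client_pos = server_pos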
"
mehpwned wrote:
I'd say the core of desync isn't necessarily a client<->server architectural issue. There will always be an authoritative game state in multiplayer games, but desync isn't this bad in every game.

Even 200+ ms of ping does not account for the 3 seconds of desync that happens. It also seems to happen more during peak hours. It looks more like a physics/data-processing issue and less like a latency-related thing.

There may just be a bottleneck somewhere in their server's engine. It's very hard to multi-thread physics-related events in game engines, since they're mainly atomic operations. But you can always get more processing power.

The problem I see is that the server is choking on frames.

I do realize there is no complete solution for desync. But I do believe the amount and extent to which we desync is fixable.

Does anyone know what hardware they are running?
To answer your question, it's easy to trace the communication in the US to a server farm owned by SoftLayer; I presume GGG has some kind of arrangement with them. Probably something from this list. However, a custom deal of some kind can't be ruled out, and I can't draw conclusions about the European servers.

That said, I think your logic is off track. I agree that it's not so much a latency thing... but why would the breakdown be on the server's side? If it were, the core gamestate would be inaccurate, not the simulation on the client. Instead, what is going on is that small discrepancies in the client simulation create a butterfly effect which goes unnoticed at first (since the initial discrepancy is small) but eventually becomes big trouble. This suggestion aims to fix that.
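
If you want to see how quickly "small" can turn into "huge", here's a quick illustration (the logistic map is just a stand-in for any non-linear update rule; it has nothing to do with PoE's actual physics): two states that start one millionth apart track each other for a while, then go completely separate ways.

a, b = 0.500000, 0.500001   # "server" and "client" states, off by one millionth

for step in range(1, 41):
    a = 3.9 * a * (1 - a)   # non-linear update, stand-in for a physics tick
    b = 3.9 * b * (1 - b)
    if step % 10 == 0:
        print(f"step {step}: divergence = {abs(a - b):.6f}")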
"
ScrotieMcB wrote:

That said, I think your logic is off track. I agree that it's not so much a latency thing... but why would the breakdown be on the server's side? If it were, the core gamestate would be inaccurate, not the simulation on the client. Instead, what is going on is that small discrepancies in the client simulation create a butterfly effect which goes unnoticed at first (since the initial discrepancy is small) but eventually becomes big trouble. This suggestion aims to fix that.


I would hardly call missing/failing at discontinuities "small discrepancies".
"
I would hardly call missing/failing at discontinuities "small discrepancies".
Would you disagree that the proper term for the situation is "butterfly effect"? Because "relatively small discrepancies" is virtually in the definition.

Or are you being so nitpicky that you're upset I left out the word "relatively"?

In any case, GGG likely doesn't have the bandwidth to resync every client continuously. Nor would it be a good idea to do so even if they could; you'd have all kinds of minor rubberbands happening very frequently, and it's more prudent, from a pure gameplay-experience perspective, to reserve rubberbanding for more drastic desync.
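
As a rough sketch of that policy (the threshold and positions are made up purely for illustration): tolerate small divergence, and only snap when the error gets drastic.

SNAP_THRESHOLD = 5.0  # hypothetical: rubberband only past this much error

def correct(client_pos: float, server_pos: float) -> float:
    error = abs(client_pos - server_pos)
    if error > SNAP_THRESHOLD:
        return server_pos   # drastic desync: hard rubberband to the authoritative state
    return client_pos       # minor desync: tolerate it rather than jitter constantly

print(correct(10.2, 10.5))  # small error, no visible correction -> 10.2
print(correct(10.2, 20.0))  # large error, snap -> 20.0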
"
ScrotieMcB wrote:
"
I would hardly call missing/failing at discontinuities "small discrepancies".
Would you disagree that the proper term for the situation is "butterfly effect"? Because "relatively small discrepancies" is virtually in the definition.

Or are you being so nitpicky that you're upset I left out the word "relatively"?


I'm familiar with computational dynamics and geometry (I've written numerous physics engines), so yes, I may be anal about some things.

It's hard for me to completely classify the situation as a butterfly effect. It somewhat applies, sure, in the sense that divergence will cause a small change to magnify.

Mechanically, though, these are not small discrepancies to me: you are missing state changes like reflections, projections, and so on (far more than I could list here, including virtually any non-linear function of state), and that causes huge divergence.

Small discrepancies to me would be something like not-quite-deterministic floating point (which would be a concern for the proposed single-player model, though I've not exactly kept up to speed, but less so for less predictive models and multiplayer).
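
To make the distinction concrete, here's a toy discontinuity (invented numbers, nothing engine-specific): a trigger that fires only when an entity crosses a line. A 0.002 disagreement in velocity doesn't produce a 0.002 error; it decides whether the trigger fires at all.

def step(pos: float, vel: float) -> float:
    pos += vel
    if pos >= 10.0:   # trigger line: think a knockback, teleport or reflection
        pos = 0.0     # the discontinuity: state jumps rather than drifting
    return pos

client_pos = step(9.0, 0.999)   # 9.999: trigger not hit
server_pos = step(9.0, 1.001)   # 10.001: trigger fires, snaps to 0.0

print(abs(client_pos - server_pos))   # ~10 units of divergence from a 0.002 input gap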

"
ScrotieMcB wrote:

In any case, GGG likely doesn't have the bandwidth to resync every client continuously. Nor would it be a good idea to do so even if they could; you'd have all kinds of minor rubberbands happening very frequently, and it's more prudent, from a pure gameplay-experience perspective, to reserve rubberbanding for more drastic desync.


You'll have to excuse me for disagreeing with everything after the first sentence.
"
"
ScrotieMcB wrote:
In any case, GGG likely doesn't have the bandwidth to resync every client continuously. Nor would it be a good idea to do so even if they could; you'd have all kinds of minor rubberbands happening very frequently, and it's more prudent, from a pure gameplay-experience perspective, to reserve rubberbanding for more drastic desync.
You'll have to excuse me for disagreeing with everything after the first sentence.
I used to feel the same way. But, although I've received no dev response on this particular topic, I have had some PM conversations, and Rhys convinced me that this is true: there is such a thing as resyncing too often.

There is an important distinction between desync prevention and desync correction; correction is resync. By correction, I mean rubberbanding; it's the only way to instantly correct the situation. So are many small rubberbands really better than a few large ones? Should monsters be constantly doing these small rubberbands, so that their movement looks like they're under strobe lights, or should we wait until we have a significant divergence before breaking out the infamous desync teleports? Both are immersion-breaking, but which is the lesser of the two evils?

Personally, I can respect GGG's decision that it's best to hold off on rubberbanding until things get fairly hairy, and to wait for a less combat-intensive moment to invisibly, gradually resynchronize.

Whichever you choose, you have to admit it's not really a fix; it's more of a transference. You've sacrificed in one area for an improvement in another.

That's why what's really important is preventing desync in the first place, by minimizing those "small" divergences which will butterfly-effect into huge problems. That means solid client prediction, which is what this suggestion is all about.
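
For what "invisibly, gradually resynchronize" could look like (the blend factor here is purely illustrative, not GGG's actual approach): instead of snapping, ease the rendered position toward the authoritative one a little each tick.

def smooth_correct(client_pos: float, server_pos: float, blend: float = 0.1) -> float:
    # Close a fraction of the remaining error each tick: converges with no visible jump.
    return client_pos + (server_pos - client_pos) * blend

pos = 0.0
for tick in range(1, 21):
    pos = smooth_correct(pos, 10.0)   # authoritative position is 10.0
    if tick % 5 == 0:
        print(f"tick {tick}: rendered pos = {pos:.2f}")   # creeps toward 10.0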
"
ScrotieMcB wrote:


There is an important distinction between desync prevention and desync correction; correction is resync. By correction, I mean rubberbanding; it's the only way to instantly correct the situation. So are many small rubberbands really better than a few large ones? Should monsters be constantly doing these small rubberbands, so that their movement looks like they're under strobe lights, or should we wait until we have a significant divergence before breaking out the infamous desync teleports? Both are immersion-breaking, but which is the lesser of the two evils?

Personally, I can respect GGG's decision that it's best to hold off on rubberbanding until things get fairly hairy, and to wait for a less combat-intensive moment to invisibly, gradually resynchronize.

Whichever you choose, you have to admit it's not really a fix; it's more of a transference. You've sacrificed in one area for an improvement in another.

That's why what's really important is preventing desync in the first place, by minimizing those "small" divergences which will butterfly-effect into huge problems. That means solid client prediction, which is what this suggestion is all about.


I'll agree that for single-player you could pretty much get around the issue.

I'm mostly concerned about multiplayer, though, so my thoughts are pretty much regarding that.

So the thing is, prediction isn't really correction; it is guessing. You can guess really well for simple things and get it right most of the time (say, something like an idealized unbounded a + b*t with no future external factors), but you can also fail horribly, and the longer you try to guess, the more you fuck it up.

The point I'm trying to make is that you can't predict unknown discontinuities; you will inevitably fail.

So the distinction for me has to be made here: there are different forms of desync. They are all error, yes, but they have different properties.

Typical lag, for instance, doesn't exactly lie about what happened; it is simply history.

Prediction is almost never provably correct (excepting the most trivial situations), and thus not exactly wholly suitable for correction.

Prediction is still good and useful, though, but reliance on it should always be minimized unless you can make it entirely consistent (as I guess the proposed single-player method tries to do).
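
Here's the a + b*t point as a toy example (the turnaround time and speeds are invented): linear extrapolation, i.e. dead reckoning, is a fine guess while the target keeps doing what it was doing, and it fails harder the longer you extrapolate past an unknown discontinuity.

last_pos, last_vel = 5.0, 2.0   # last state the client actually received

def predict(t: float) -> float:
    return last_pos + last_vel * t   # a + b*t, no future external factors

def actual(t: float) -> float:
    # What the server really did: the monster reversed direction at t = 1.
    return 5.0 + 2.0 * t if t <= 1.0 else 7.0 - 2.0 * (t - 1.0)

for t in (0.5, 1.0, 2.0, 3.0):
    print(f"t={t}: predicted {predict(t):.1f}, actual {actual(t):.1f}")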
"
ScrotieMcB wrote:
I used to feel the same way. But, although I've received no dev response on this particular topic, I have had some PM conversations, and Rhys convinced me that this is true: there is such a thing as resyncing too often.


I'm confused about how Rhys convinced you that there is such a thing as resyncing too often without stating all kinds of other conditions and restrictions along with it.

Either that is incredibly deceptive (since, you know, we've played high-bandwidth, low-latency games with very little or no prediction, and they work quite well), or it's missing a lot of information about cost and policy.
"
"
ScrotieMcB wrote:
I used to feel the same way. But, although I've received no dev response on this particular topic, I have had some PM conversations, and Rhys convinced me that this is true: there is such a thing as resyncing too often.
I'm confused about how Rhys convinced you that there is such a thing as resyncing too often without stating all kinds of other conditions and restrictions along with it.
Took very few words on his part; I've gone on far more about it than he has.
