POE WILL BE THE NEW VANGUARD IF THEY DONT CHANGE SHIT UP
" Neither the server nor the client are ever completely deterministic, that's a myth computer scientists like to hypothesize about. GGG knew better than to try to make a synchronous client-server system, and it's futile to attempt to retrofit deterministic behavior on a system that wasn't designed that way from the start. When desync occurs, it's not because either client or sever failed to make valid predictions, in an open-loop system they'll eventually just drift apart, and by fiat in PoE, the server's version always prevails. Last edited by RogueMage#7621 on Feb 14, 2014, 4:14:24 PM
" I don't even... Technically, in isolation, sure, prediction CAN'T fail since it's only executing what it's supposed to. There is absolutely no way you can tell me that in any sort of practical sense the prediction doesn't fail though, this also holds for pretty much every other game ever that has tried to predict things similarly (even with higher frequency updates and predicting for much shorter periods). When people say prediction fails, they are saying that the result of prediction on the client isn't the same as the simulation on the server (which is clearly true). You are completely out of it if you think only the lack of floating point determinism is to blame for this and it is merely drift. The prediction is JUST that, a prediction that doesn't and cannot possibly take into account all variables, known and unknown and always return a unique, correct answer. |
" so in short you say that due to technical reasons there are things i should not do, ever, even if there is no apparent reason not to? like using a skill to the full, allowed by the game mechanics extent? so to dumb my own gameplay just because 'it is supposed to desync'? and it is MY FAULT to playing the game and expecting it to work as advertised? is this for real? if people know (And they DO know) that fast-movement skills cause considerable, on demand, desync why ggg introduced leapers and avian corpse-eaters into the game that are 'i desync you to death!' mobs? why knowing that 'doing that hurts' they are doing MORE of that? where is the logic? the only explanation is that they simply refuse to admit problem exists and create the game around mystical Netcode That Works But Sadly Does Not Exist.. |
" That's right, the game is full of dysfunctional gem and passive tree combinations that can produce dangerous or undesirable results. Rather than impose spell-timers and hard limits on things like attack and movement rates, GGG allows us to stack things up linearly to volatile extremes. GGG doesn't have the staff to test all the corner cases we figure out how to exploit, they effectively rely on us to flush out the broken combos, and some of the worst they fix. It's up to us to determine the limits of what's useful and what's hazardous, how far you can pump a skill before it backfires on you. PoE is raw with jagged edges, if you want something more polished you know where to turn. |
Yeah look, I may hate desync, but I love mechanics/dynamics.
The more naive proposals for dealing with desync, the ones that amount to dumbing down the mechanics, really don't help. Firstly, because the system and its fundamentals don't actually change when you do this; it's an admission of defeat, and forcing simplistic mechanics is really no better than managing complex ones, where at least you have a choice to a degree. Secondly, I believe mechanical stress like this is actually a good thing when you're trying to address robustness and improve things. The more of that space you have explored, the better you know the limits and the possible solutions or improvements; at the very least it SHOULD aid your understanding of the problems.
" It's not just floating point discrepancies that cause drift, it's the fact that the game is running open-loop on asynchronous networked servers and clients. Non-deterministic open-loop simulations will always drift out of sync over time, simply because there's no feedback loop to keep them locked in sync. It's not that the predictions "fail" on either client or server, they just diverge non-deterministically over time, up until the server forces the clients to accept its version of the simulation. Real-time tracking systems don't rely on perfect precision to make remote controlled systems work reliably, they use closed-loop feedback mechanisms to keep clients actively in sync. The open-loop behavior of these systems can be crude and over-reactive, just as long as they remain stable and predictable under closed-loop control. PoE's client is well within the ballpark of acceptable open-loop behavior, it just needs a continuous feedback reference signal from the server and a tracking mechanism to keep it locked in sync. Last edited by RogueMage#7621 on Feb 14, 2014, 5:32:32 PM
If GGG can't solve the problem the smart way (better netcode, better pathing, better game mechanics), then they should at least try to solve it the dumb way (more servers, or even fully server-side gameplay where the client does nothing until the server approves it, like World of Tanks for example; see the sketch below).
Either way, I haven't seen any real effort from GGG to even try to solve the desync problems. They've been working on it, sure, but for years now. How long can it take to get decent netcode or just a couple more servers?
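For illustration, a stripped-down version of that fully server-authoritative pattern might look like the Python below; the function names and queue-based transport are invented for the example, and a real game would pipeline many requests rather than block on each one.

```python
import queue

requests, confirmations = queue.Queue(), queue.Queue()

def client_request(action):
    requests.put(action)                 # the client only asks; nothing is shown yet

def server_validate():
    action = requests.get()
    # the server checks the action against its own authoritative state
    confirmations.put(("approved", action))

def client_apply():
    status, action = confirmations.get()
    if status == "approved":
        print("client now applies:", action)   # only now does the client's view change

client_request("move to (10, 4)")
server_validate()
client_apply()
```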
" Are we talking about the same thing? " That's the only thing I don't like about GGG. They've flagged some of my posts as hateful, and although most of them were there were some that had no hateful remarks in them, yet they let trolls and flamers fill their forums. |
" I'm not in any sort of disagreement about non-deterministic simulations drifting out of sync. I'm not even arguing whether their model is theoretically capable of low enough error to be tolerable, though I will question the parameters and perhaps some of their methods in error measurement and correction (relating to your mention of the continuous feedback and tracking), besides the prediction itself. I do make some distinctions in causes of divergence though and will refer to it as failure in that respect, even if it is all simply error. To me there is a big difference in getting 0.001 or 0.00102 vs applying an impulse with a factor of -1.0 vs 1.0 (say, in the case of a "mispredicted" reflection or projection). Yes, both situations are divergent, but one instantly blows up. Now, make no mistake, I know that minor differences will also cause these significantly divergent situations, but it is definitely rarer than the prediction itself being wrong due to stale or misrepresented state caused by latency, infrequent updates or poor error estimation. What troubles me most is that these obviously heavily divergent effects are rather common. |
" First, you're just speculating that some predictions are so divergent that it "instantly blows up". I haven't seen any cases where the client outright malfunctions and fails catastrophically. Second, clients do not try to predict or match what the server is doing, each machine runs its own independent open-loop simulation of the game, receiving both local and remote inputs that vary in latency. They diverge not because of "failed" predictions, but because minor discrepancies in asynchronous open-loop systems inevitably accumulate and diverge over time. This is not due to flaws or "mispredictions", it is because a non-deterministic simulation cannot be precisely replicated by an open-loop system, no matter how precisely it is calibrated. Remote synchronization isn't a mysterious problem, it's been tackled and solved for decades. It doesn't require scrupulous precision and prescient prediction, it just needs a continuous and stable closed-loop feedback mechanism to enable the clients to reliably track the server. Unfortunately, GGG chose to implement a crude fallback mechanism, which sporadically resets the clients when they drift too far, and forces them to abruptly resync to the server's simulation state. It works, but it's a drastic solution that should really only be used in cases of excessive lag or packet loss. Last edited by RogueMage#7621 on Feb 14, 2014, 6:28:22 PM