Reasons Why Desync Happens (Thoroughly Explained, with Cited Commentary on Fixes & Focuses)
Further below is a small excerpt of a conversation between me and someone else, prompted by my statement: "If the software were written well enough, Sync Errors would never happen even with sub-par hardware."
There is also significant information that I quoted and critiqued (reworded to be more developer-friendly) which elaborates on Desync and on what GGG should focus on for a possible fix. That information can be found under the Significant Commentary On Fixes & Focuses heading.
===================================================
Synopsis of Desync:
Condensed Text
Perhaps this can shine some light on why Desync happens for those who don't know:
‘Sync Errors’ occur in multi-player games because they are attempting to run an identical simulation on multiple machines. That means that a centralized server is not running the simulation; instead, with enforced latency and a queue of all user inputs, multiple clients are running an identical simulation all at the same time. A lot of games do this because it removes the overhead of server-side simulation and upkeep, and it also makes implementing instant replay pretty easy. Ever notice how you can't get a terminal sync error (one that crashes the game entirely) in games that have dedicated servers? You can just reconnect and get right back into the action. The downside is that if any client's simulation no longer matches up with the others, the simulation is undefined and cannot continue. "So which version of reality is the right one?"

Here are a few reasons that a game can go out of sync (a minimal sketch of the first one follows this synopsis):
- Random number generation that is not properly deterministic
- Floating point representation and math that is not properly standardized
- An error in the transport of input information between the clients
- ANY case where doing the same thing twice can produce two different results (critical hits?)

More often than not, you'll notice there is virtually nothing you (the players) can do about this. It is the job of the game developers to ensure sync errors do not occur. Unfortunately, this is a hard problem to solve! Upgrading your internet connection and computer can help, but in the end, if the software were written well enough, Sync Errors would never happen even with sub-par hardware.

However, we can list the things which can make Desync happen more or less often:

Your computer can't keep up with the global simulation: It can be because your computer is too slow, or because your ping to the Server is high. In either case, you can't exchange data fast enough with the Server to keep up with the current state.

Your connection bandwidth is insufficient: Some games exchange far too much data to keep the clients in sync. If you can't send or receive enough to keep up with all the data necessary, you will Desync.

Your connection's quality is poor and you lose a lot of Packets: If you play on WiFi with bad range, you are likely to lose Packets. You will lose information the Server sent, or the Server won't get your changes. If the game doesn't handle this correctly, it can lead to Desync. Note that lost Packets can happen even if you have a great connection; loss can occur anywhere between you and the Server.
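To make the determinism point concrete, here is a minimal Python sketch (a toy of my own, not anything from PoE's code) of two machines running the "same" simulation from the same input queue. With a shared, seeded random number generator they stay identical; a single unseeded random call makes them drift apart and never recover:

import random

def run_simulation(inputs, seed, buggy=False):
    rng = random.Random(seed)          # deterministic: same seed, same rolls on every machine
    state = 0.0
    for cmd in inputs:                 # the agreed-upon queue of player inputs
        roll = random.random() if buggy else rng.random()   # buggy path uses a non-shared RNG
        state += cmd * roll            # e.g. a damage roll applied to the game state
    return state

inputs = [1, 2, 3, 4]                                        # identical input queue on both "machines"
print(run_simulation(inputs, 42) == run_simulation(inputs, 42))  # True: still in sync
print(run_simulation(inputs, 42, buggy=True) == run_simulation(inputs, 42, buggy=True))  # almost certainly False: desync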
Significant Commentary On Fixes & Focuses:
First up, ScrotieMcB's proposal to GGG. Additional bits and pieces of exchanged information, etc. can be found in the commentaries.
Condensed Text
Some Exchanged Feedback On ScrotieMcB's Proposal to GGG:
Condensed Text
Some Exchanged Feedback On My Synopsis of Desync:
Condensed Text
I hope this helps anyone wondering about Desync, and possibly GGG too.
HeavyMetalGear
When game developers ignore the criticism that would improve their game, the game fails. Just because a game receives a great amount of praise vs. only a small amount of criticism does not mean to call it a day and make a foolish, misplaced assumption that it is perfect. (me)
Last edited by HeavyMetalGear#2712 on Jul 8, 2013, 3:46:57 PM
Yet you miss the single most important factor in PoE's desync.
You cannot predict reliably for external, effectively random, discontinuous events. The client and the server do not have access to the same data at the same time. PoE does a LOT of prediction and it fails horribly because of solver/integration divergence.
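To illustrate that with a toy Python sketch (an invented example, not PoE's actual netcode): the client keeps extrapolating movement it cannot yet know has been interrupted, so by the time word of the event arrives the two views already disagree.

speed = 1.0                                # units moved per tick while not stunned

server_pos = client_pos = 0.0
for tick in range(10):
    stunned = (tick == 3)                  # a random, discontinuous event only the server knows about
    server_pos += 0.0 if stunned else speed
    client_pos += speed                    # client prediction: "it keeps moving", since word of the
                                           # stun will not reach the client until several ticks later
print(server_pos, client_pos)              # 9.0 vs. 10.0: the two views have already diverged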
Packet loss
"I'm programmed to say something that is kind and uplifting at this point, but there is apparently an error that is working in my favor."
Can anyone confirm what I heard about packets that involve movement and position being TCP? That may have a lot to do with it too.
I haven't worked on any online game, but I thought doing those with UDP was pretty much mandatory to avoid stuff like this.
TCP/IP has more overhead (more latency and bandwidth) but is guaranteed to be reliable. UDP is not reliable by default/design, and is typically used when low latency and low bandwidth are bigger concerns than guaranteed delivery.
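To spell out that tradeoff, here is a minimal Python sketch using generic sockets (the hosts and ports are placeholders, and nothing here is taken from PoE):

import socket

# Reliable command channel: TCP guarantees delivery and ordering,
# at the cost of handshakes, ACKs and retransmission delays.
tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
tcp.connect(("example.com", 80))                       # placeholder host, not a game server
tcp.sendall(b"move_to 120 45\n")

# Low-latency state channel: each UDP datagram stands alone; if one is
# lost or arrives out of order, the next position update simply supersedes it.
udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp.sendto(b"pos 120.4 45.1", ("example.com", 9999))   # placeholder endpoint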
Every time Onyxia uses Deep Breath PoE desyncs. Didn't you know that?
IGN - PlutoChthon, Talvathir
What GGG Really Should Be Focusing On With Desync
1. One-way UDP for temporary things, two-way TCP for commands and permanent changes

Right now Path of Exile uses TCP for everything. Read this: UDP vs. TCP. That author goes a little too far, since TCP would still be smart for certain things (player mouse clicks and keyboard presses, inventory management), and in today's world we should assume all routers are already using TCP in some form at all times (cells sharing WiFi)... but the enthusiasm is good. The point would be to use TCP's more bulky, failsafe delivery for Client commands (move to x,y; attack Monster06) and permanent changes (player max life/mana/ES)... but stream the data which changes continuously (current life/mana/ES, monster/player/projectile positioning, maybe animation progress) over UDP. After a few seconds this temporary data is useless anyway, so using TCP's bulky guaranteed delivery is pointless; by the time a lost packet is resent, it's too old to matter. (A rough sketch of this split, combined with the packet-size point below, follows the header examples.)

2. True packet information entropy (both brevity and compression)

Right now I doubt GGG has taken efforts to make packets compact, and is instead focusing on making them readable by humans, which greatly increases bandwidth consumption. This isn't exactly non-standard for the industry; for example, the standard HTTP header* that comes attached to web pages you view looks something like this (which would be shrunk about 15% with lossless compression):
Example HTTP header
GET /tutorials/other/top-20-mysql-best-practices/ HTTP/1.1
Host: net.tutsplus.com
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.1.5) Gecko/20091102 Firefox/3.5.5 (.NET CLR 3.5.30729)
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 300
Connection: keep-alive
Cookie: PHPSESSID=r2t5uvjq435r4q7ib3vtdjq120
Pragma: no-cache
Cache-Control: no-cache
Brief HTTP header
GT /tutorials/other/top-20-mysql-best-practices/ H/1.1
Ho: net.tutsplus.com
UA: Mo5.0 (Win; U; WinNT 6.1; en-US; rv:1.9.1.5) Gko/20091102 FF/3.5.5 (.NET CLR 3.5.30729)
At: t/html,a/xhtml+xml,a/xml;q=0.9,*/*;q=0.8
AL: en-us,en;q=0.5
AE: gz,def
AC: ISO-8859-1,utf-8;q=0.7,*;q=0.7
KA: 300
Cn: k-alv
Ck: PHPSESSID=r2t5uvjq435r4q7ib3vtdjq120
Pr: no-c
CC: no

Reducing the size of packets using these methods would help with bandwidth issues (both for GGG and for the players who have them) and would also help ease the packet flow issues between TCP and UDP that were mentioned by the author of the UDP vs. TCP article.

* Although I've run packet capture on PoE before (to determine that PoE is 100% TCP), I haven't decrypted the packet contents, so I'm using this as a blind example. It's very hard to tell what GGG is doing on this front.
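To make points 1 and 2 concrete, here is a rough Python sketch (my own illustration; the field layout is invented and is not PoE's actual protocol) of streaming a compact, fixed-layout position update over UDP, while commands would stay on TCP as described above:

import socket
import struct

# <HHffB = entity id (2 bytes), sequence no. (2), x, y (4+4), current life % (1)
POSITION_UPDATE = struct.Struct("<HHffB")

def send_position(sock, addr, entity_id, seq, x, y, life_pct):
    packet = POSITION_UPDATE.pack(entity_id, seq, x, y, life_pct)
    sock.sendto(packet, addr)        # fire and forget; a newer update simply supersedes it

udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_position(udp, ("127.0.0.1", 9999), 12398, 7, 120.4, 45.1, 83)   # placeholder endpoint
print(POSITION_UPDATE.size, "bytes per update")   # 13 bytes, versus dozens as human-readable text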
3. Smart sync rates

Players in town need sync information a lot less often than players in the middle of combat with rhoas. The amount of anticipated combat stress on a character should guide how often synchronization packets are sent out, allowing more sync attempts in heated situations at the expense of situations where they are not needed, and therefore better performance with the same total server bandwidth. (A toy sketch of such a heuristic follows.)
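The heuristic and numbers below are invented for illustration, not anything GGG has described; it just shows the shape of the idea in Python:

def sync_interval_ms(nearby_hostiles, recently_hit, in_town):
    if in_town:
        return 500        # almost nothing to reconcile
    if recently_hit or nearby_hostiles >= 10:
        return 50         # heated combat: sync as often as bandwidth allows
    if nearby_hostiles > 0:
        return 100
    return 250            # exploring empty areas

print(sync_interval_ms(nearby_hostiles=14, recently_hit=True, in_town=False))  # 50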
When Stephen Colbert was killed by HYDRA's Project Insight in 2014, the comedy world lost a hero. Since his life model decoy isn't up to the task, please do not mistake my performance as political discussion. I'm just doing what Steve would have wanted.
Last edited by ScrotieMcB#2697 on Aug 10, 2013, 12:36:01 AM
Truly you are wise beyond our understanding.
Good lord people, stop making it more complicated than it really is.
MrMrMr nailed it. Desync cannot be "solved" without a trusted client, something GGG would be 150000% foolish to do.

Here's a concrete example: Assume that at time T, client and server are 100% in sync. At that exact time, you attack a mob that is running past you. The client calculates (correctly) that you attack fast enough to hit that mob, and that you're close enough to hit the mob. So it does not try to issue any movement commands; it plays an attack animation and sends "player attacked mob #12398" to the server. This information gets to the server at T+100ms. By that time, the mob has moved *just* far enough that it's out of range of your melee attack. If you were holding Shift down, you just miss. If you weren't holding Shift down, the server interprets this as wanting to chase down the mob and starts moving you towards it. Either way, the simulation is now out of sync (de-synced). This is why some people ALWAYS attack with Shift down -- without it, desync can turn what YOU think is "attack that mob" into "chase that mob into the next room and die". And this is only one situation that causes desync; there are plenty more (stun!).

I HATE posts insinuating that desync is just something that GGG can magically fix with better coding. They can mitigate it somewhat, but the magnitude of coding wizardry that needs to happen behind the scenes to do this without compromising simulation integrity is not to be underestimated. HeavyMetalGear, I realize you mean well. But please don't make statements like this:
"If the software were written well enough, Sync Errors would never happen even with sub-par hardware."
Scrotie -- as usual -- is bang-on.

p.s. In before some smartypants mentions that I am technically incorrect, in that the problem could be solved with sufficient memory and computing power -- "all" they'd have to do is keep the last second or two of simulation 'slices' (let's say one per 10ms), receive commands like "player attacked mob #12398 at time T", back up the simulation to time T, apply the command at that point, and then very quickly re-run the sim to catch it back up to the current time. Even then, that would only work for single-player.
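A tiny Python sketch of the timeline above (the numbers are invented for illustration; this is not PoE's actual server logic). The client validates the attack against the world as it sees it at time T, while the server validates the same command against the world at T plus the network latency:

melee_range = 10.0
mob_speed = 25.0              # units per second, running past the player
latency = 0.100               # the command takes 100 ms to reach the server

def mob_in_range(distance_at_t, elapsed):
    # Where the mob is by the time each side evaluates the attack command.
    return distance_at_t + mob_speed * elapsed <= melee_range

distance_at_t = 9.0                           # in range at the moment the client decides to swing
print(mob_in_range(distance_at_t, 0.0))       # True  -> client plays the attack animation
print(mob_in_range(distance_at_t, latency))   # False -> server sees an out-of-range attack instead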
Have modem, will travel!
Silas' Gear & Gems: http://www.pathofexile.com/forum/view-thread/426367
Last edited by SilasOfBorg#5058 on Jun 15, 2013, 11:02:15 AM
I have always wondered why Diablo 3 has absolutely no desync. Especially with the latest patch, the screen is always full of monsters, and there is not an inch of desync, ever. Why not you, PoE? You deserve it more.