Desync and why it's fixable

"
ScrotieMcB wrote:
It would mean the server designates which player each monster targets; for all decisions beyond that, monster behavior would be deterministic.


Does that include the initial aggro? If so, we still have the delay we have now.

If not, targeting alone might not be enough; you might have to resync monster (and potentially character) positions on the client, because monster actions depend on their position/distance from the target. In my previous example, the client that was out of sync would stay out of sync: it would still show Brutus attacking its player, but Brutus's location would be off.
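One way to picture the fix this implies is to bundle a position snapshot with each target assignment, so the client resyncs the monster before starting its deterministic attack behaviour. A minimal Python sketch (the message shape and all names are my own invention, not anything GGG has described):

```python
from dataclasses import dataclass

@dataclass
class TargetAssignment:
    """Hypothetical server -> client message: which player a monster
    attacks, plus a position snapshot so the client can correct drift.
    (Illustrative only -- not GGG's actual protocol.)"""
    monster_id: int
    target_player_id: int
    monster_x: float
    monster_y: float

def apply_assignment(client_monsters, msg):
    """Snap the client's copy of the monster to the server's position
    before starting the otherwise-deterministic attack behaviour."""
    m = client_monsters[msg.monster_id]
    m["target"] = msg.target_player_id
    m["x"], m["y"] = msg.monster_x, msg.monster_y

# Client thinks Brutus (id 7) is at (3.0, 4.0); the server disagrees.
monsters = {7: {"target": None, "x": 3.0, "y": 4.0}}
apply_assignment(monsters, TargetAssignment(7, 1, 3.5, 4.2))
assert monsters[7] == {"target": 1, "x": 3.5, "y": 4.2}
```

The point of the snapshot is exactly the Brutus case above: without it, the client would show the right target at the wrong location.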
I don't understand why people are so obsessed with the details... determinism, rubberbanding (which is exactly what PoE does, just at low frequency, vaguely speaking) and so on.

You have a stream of discontinuous state, right. Make it a nice binary sequence. Every bit is a frame representing a variable at a fixed time.

Now, you only sample at certain frames (imagine these are somewhat random sampling intervals). Wtf do you intend to do about not sampling the frames you missed?

You don't/can't REALLY do shit. You can't interpolate (and for the prediction proponent, extrapolate) for this, because the variable is quantized to 0 or 1. There goes pretty much everything.

This is without even adding delay; the longer you delay, well... it really doesn't take a genius to understand the divergence, the accumulating error, and the resulting accumulating cost of delay.

Also, the server is the authority and you are absolutely bound to the delay caused by requiring an internet connection in some way or another, you just can't get around that.

You are simply trying to minimize an error, caused by a combination of prediction and delay (notice how I include prediction here). If you minimize prediction and you minimize delay and you maximize sampling frequency I'm pretty sure you're converging to 0.
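The sampling argument can be made concrete. Here is a toy Python model (my own illustration of the argument, not PoE's netcode) of a client that only receives the true 0/1 state every few frames and holds the last received value in between; the disagreement rate grows with the sampling interval and vanishes as the interval shrinks to every frame:

```python
import random

def resync_error(frames, interval, seed=0):
    """Fraction of frames where a client that only samples every
    `interval` frames (and holds the last sampled bit) disagrees
    with the true 0/1 state stream."""
    rng = random.Random(seed)
    state = [rng.randint(0, 1) for _ in range(frames)]
    held, wrong = state[0], 0
    for i, s in enumerate(state):
        if i % interval == 0:
            held = s          # a sample arrives: resync
        elif held != s:
            wrong += 1        # between samples we can only guess
    return wrong / frames

# Error grows with the sampling interval; at interval 1 it is zero,
# which is the "converging to 0" claim above.
assert resync_error(10_000, 1) == 0.0
assert resync_error(10_000, 2) < resync_error(10_000, 4)
```

Interpolation never enters the picture: between samples the variable is quantized, so holding the last value is about all the client can do.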

"
symban wrote:

@Scrotie:
I thought the client already had mob movement prediction code in its simulation, because otherwise we would only have desync from the player's position being off in reaction to mob/player engagement.

But just as we have player position desync, we also have mobs appearing to come at you on your client while they stay in position on the server. This generally happens between rooms, around corners, or near obstacles, and it's the main reason people find themselves dead two rooms away: the client predicts mob movement incorrectly, and when the player clicks on a mob that appears nearby on his client, his character on the server side starts moving toward that mob two rooms away. When it resyncs, you find yourself two rooms away.

At least that's what I thought was happening. If it's not, do you know how the above situation occurs? I can't see how two pieces of information sent by the server a small time interval apart can contradict each other.


That was my understanding initially, but it seems it's not quite true. I tested it during the discussion in the other thread, by standing outside the room with mobs, pulling the network cable, and then running into the room. The mobs didn't move, which means they need to receive the attack command from the server.

I think the situation you describe, and which I noticed as well, happens when several mobs are given the command to attack you and have to walk through a narrow passage (like a door). The client then simulates monster movement toward you and, due to slight monster position desync or different monster pathing calculations, the monsters on the client manage to get through the passage, while on the server one or more are blocked by each other and stay in the room. If you then attack one of the ones that stayed in the room, you get resynced into the room.
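The doorway case shows why tiny position errors snowball: who passes through a contested passage is a discrete decision, so a sub-unit desync can flip the outcome entirely. A deliberately tiny Python illustration (a toy model, not the game's actual pathing):

```python
def door_entry_order(dist_a, dist_b):
    """Toy doorway contention: two monsters head for a one-monster-wide
    passage; whoever is closer enters first and momentarily blocks the
    other.  Returns the order of entry."""
    return ["A", "B"] if dist_a <= dist_b else ["B", "A"]

# Client and server disagree about monster B's position by 0.2 units...
server_order = door_entry_order(dist_a=5.0, dist_b=5.1)
client_order = door_entry_order(dist_a=5.0, dist_b=4.9)
# ...so they disagree about who blocks whom, and the two simulations
# diverge from that point on.
assert server_order == ["A", "B"]
assert client_order == ["B", "A"]
assert server_order != client_order
```

A continuous error (0.2 units) has become a discrete one (a different monster stuck in the room), which is exactly the kind of divergence a periodic position resync can no longer smooth over.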
"
ScrotieMcB wrote:
Foiling a predictive hack may be difficult, but I feel that the benefit of such a hack is minimal for the malicious user, and that the overall gain in desync reduction outweighs the cost in hackability. (On the other hand, using a seed for terrain is lazy, and GGG should change that system to one which feeds users map data bit by bit over time, making maphack impossible.)


ExFuckingActly. Gameplay > everything else. Give us a proper game and then be concerned with cheating; no one wants to cheat in a dead game anyway (not saying the game is dead, but desync sure destroys a lot of the fun).
"
iamstryker wrote:
If it was easily fixable then it wouldn't still be a problem.


I agree. The question is whether it's technically difficult in general, or just technically difficult for these devs. Seeing as almost every other online game doesn't have the constant desync/lag that PoE has, I think it seriously calls the devs' competence into question. That this was a conscious decision from the beginning doesn't help their case either.

I don't care if some people want to maphack, that doesn't impact my experience at all. However desync seriously affects my enjoyment of the game, and I don't want to quit playing. I want the devs to fix this shit.
@LogoOnPoE: To be honest I'm not really sure how to tackle the multiplayer thing. Maybe the suggestion would work just fine without modifications (compared to the current method), maybe the player-targeting thing, maybe some other modification. Or maybe the suggestion would only apply to single-player. I think the advantages in single-player desync reduction are significant enough to justify adding it to the game. I don't think hashing out every detail of applying the suggestion to multiplayer is required, even if it might be nice.
"
Also, the server is the authority and you are absolutely bound to the delay caused by requiring an internet connection in some way or another, you just can't get around that.
This doesn't always apply. Your data needs are: inputs of all players, and knowledge of how monsters will react. In a multiplayer setting, yes, to get the inputs of other players you will have to wait; this is why Logon's inquiries regarding multiplayer raise difficult questions. However, in a single player setting, you don't really need the server to feed you how monsters will react, because monsters are predictable, and you obviously have the inputs of the player immediately. Thus you can free yourself from using a sampling model.
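The single-player point is the crux: if monster behaviour is a pure function of a shared seed and the player's own inputs, there is nothing left to sample. A toy Python sketch of that idea (my illustration, with a made-up monster "AI", not PoE's engine):

```python
import random

def simulate(seed, player_inputs):
    """Toy deterministic single-player simulation: monster behaviour is
    driven only by a seeded RNG and the player's inputs, so any machine
    that has both reproduces the exact same end state -- no position
    sampling needs to be streamed.  (Illustrative only.)"""
    rng = random.Random(seed)
    monster_x, player_x = 10.0, 0.0
    for move in player_inputs:
        player_x += move
        # 'AI': step toward the player, with a seeded random stride
        direction = 1.0 if player_x > monster_x else -1.0
        monster_x += direction * (0.5 + 0.5 * rng.random())
    return monster_x, player_x

inputs = [1, 1, 0, -1, 2]
# 'Server' and 'client' agree exactly given the same seed and inputs.
assert simulate(42, inputs) == simulate(42, inputs)
```

In multiplayer this breaks down precisely where the post says it does: the other players' inputs are data you cannot have until the server relays them.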
Last edited by ScrotieMcB on Nov 27, 2013, 11:11:48 AM
Coding is about the details; on-paper solutions like this don't prove anything.

If GGG wants to fix desync, the professional way would be to just copy a big competitor's solution :p
(Go copy Dota 2, imo. They have multiplayer, crits, evasion, etc., and no cheats for those rolls: the server rolls all the eva/crit. Best reactivity of all the online games I've played. The only drawback is that the screen freezes when you have a big lag, but that isn't a problem imo.)
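The "server rolls all the eva/crit" idea is simple to sketch: the server owns the RNG, resolves the roll, and the client only receives the outcome, so a hacked client has nothing to tamper with. A Python sketch (made-up numbers and names; not either game's actual combat formula):

```python
import random

def resolve_hit(server_rng, crit_chance, evade_chance):
    """Server-authoritative combat roll in the style the post describes:
    the server rolls evasion and crit and sends the client only the
    outcome, so the client can't influence or predict the result."""
    if server_rng.random() < evade_chance:
        return {"hit": False, "crit": False}
    return {"hit": True, "crit": server_rng.random() < crit_chance}

server_rng = random.Random()  # lives only on the server
outcome = resolve_hit(server_rng, crit_chance=0.3, evade_chance=0.25)
# The client just receives `outcome` and plays the matching animation.
assert outcome["hit"] in (True, False)
assert not (outcome["crit"] and not outcome["hit"])  # no crit on a miss
```

The cost is the one the post names: the client must wait a round trip for each outcome, which is why a lag spike freezes the screen instead of desyncing it.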
Chris: "Path of Exile’s economy is the most important element of the game to us".
http://www.pathofexile.com/forum/view-thread/55102
Too bad they don't see how good the ARPG element of their game is...
GGG has said over and over that they COULD get desync to mirror other games in the genre, but they would basically have to dumb down the mechanics of this game to D3's level to hide desync as effectively. That, and of course it would make it easier to cheat, but the MAIN point here is that GGG doesn't want to dumb down their game for an easy fix. They decided to take the path less traveled. For better or worse, this is just how the game is going to play, so either accept it or move on.
"
derbefrier wrote:
GGG has said over and over they COULD get desync to mirror other games in the genre but they would have to basically dumb down the mechanics ... so either accept it or move on.


Whether that's just what they want you to hear, or what you've come to believe from reading, there are ways to fix it. I'm not saying any more or any less. You can then argue with me based on what you've read, and on whatever you may or may not know about the technical side of things... if you wish.
We don't typically know the costs involved with these "solutions". It's not simply a matter of "would it work?". If they double the amount of communication between the server and client, they'll go out of business. It's that close to the line. They've decided that where we're at is the best they can afford. Let's just be happy they aren't shutting the game down.
