Technical solution to eliminate desync in single-player sessions

"
GGG runs a real-time game simulation on its own server because they want to ensure security against cheats and hacks and chose not to trust the client in that regard.


They are not secure against cheaters, as I proved in this thread with several examples. Furthermore, this does not explain why, without a real-time simulation on the server, cheaters would spread everywhere, as you seem to have sarcastically claimed. Also, I do not understand how the words 'trusting the client' relate to 'real-time simulation on the server'.

"
They made an architectural decision based on their own engineering priorities and business model.


I agree, they have done a great job and made a great game, but that does not mean it is perfect. And I think that 10 threads per week about desync is an issue that 'should' be addressed. The current architecture cannot solve this problem unless GGG buys 3-4 servers for each continent.



Roma timezone (Italy)
Now, I'm no computer programmer, engineer, or mathematician...

Still, I've enjoyed reading through this whole thread.

What I don't understand, or sorry, what PISSES ME OFF is why people have to be so fcking rude and keep writing stupid posts comparing their e-peens. When somebody actually CARES and wants to provide an idea for FREE, you should treat them with respect.

It's a forum, for god's sake, and people want to discuss an important aspect of PoE. I believe it's great that Rhys has given some answers and that some people are explaining their ideas, however wacky they might sound to you.

Friggin keyboard warriors...
This is slightly off-topic, but I've been trying to do some testing on my own (and on some volunteers who didn't know they were volunteers), and I have not turned up any reliable evidence:

Based on everything that has been divulged and discussed in this thread, how does Elemental Equilibrium work in relation to client-server and/or multiple client-server interactions?

Currently I can't figure out how the server can push packets fast enough in a group of 6 people with just one person running EE and, let's say, 100 attacks per second. Is there massive packet dropping/collision?

As a follow-up, I then REALLY can't wrap my mind around how multiple EE users on a single target, or on multiple targets at the same time, all using different fire/cold/lightning combinations and all attacking 100 times per second each, would work. How can the server resolve all the incoming packet info and redistribute it to the clients for verification?
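
For what it's worth, a common way engines handle that kind of event rate is batching: the server doesn't send one packet per hit, it accumulates events and flushes one summary packet per tick. A minimal sketch, assuming a 20 Hz tick rate (illustrative Python, not GGG's actual netcode):

# Illustrative only: an assumed 20 Hz tick, not GGG's real protocol.
TICK_RATE = 20

class TickBatcher:
    def __init__(self):
        self.pending = []  # hit/EE events accumulated during this tick

    def on_hit(self, attacker, target, element):
        # 600 attacks/s from a six-player party are just list appends here.
        self.pending.append((attacker, target, element))

    def flush(self):
        # One packet per tick summarizes everything since the last flush,
        # so ~600 events/s collapse into TICK_RATE packets/s per client.
        packet, self.pending = {"events": self.pending}, []
        return packet

batcher = TickBatcher()
for i in range(30):  # simulate 30 hits landing within a single tick
    batcher.on_hit(attacker=i % 6, target=0, element=i % 3)
print(len(batcher.flush()["events"]), "events in one tick packet")

Under a scheme like that there is no packet collision to speak of: the cost of many EE triggers is bytes per tick, not packets per hit.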

I love the concept of Elemental Equilibrium, but I don't fully grasp whether my build is actually benefiting from it in the way I think it is.

I don't mean to add a layer of complexity to your discussion, but it has been bugging me since I first started using it.
"
HellGauss wrote:
"
GGG runs a real-time game simulation on its own server because they want to ensure security against cheats and hacks and chose not to trust the client in that regard.


They are not secure against cheaters, as I proved in this thread with several examples.

Perhaps by your academic notions of "proof", but in the real world, software engineers work with commercial development tools to build asynchronous systems running on non-deterministic hardware. Your claims about how networked games should operate in a "cristalline a-temporal universe" have not been shown to work reliably in the unpredictable world we actually live in.
"
RogueMage wrote:

Perhaps by your academic notions of "proof", but in the real world, software engineers work with commercial development tools to build asynchronous systems running on non-deterministic hardware. Your claims about how networked games should operate in a "cristalline a-temporal universe" have not been shown to work reliably in the unpredictable world we actually live in.


1) I've proven with some clear examples that the current system can be 'cheated' with an automated game assistant.

2) So is this 2001 paper about determinism in AoE fake?
http://www.gamasutra.com/view/feature/3094/1500_archers_on_a_288_network_.php

Edit: I didn't read your post carefully; point 2 does not actually apply. However, how can some games implement replays if the game is not fully defined by its initial state and its commands? Do I have to actually play chess to represent a chess game? Or to check in real time that the moves are legit?
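
To make that concrete, here is a minimal sketch of command-stream replay (hypothetical names, no real engine's code): if the simulation is deterministic, a seed plus the ordered command list reproduces the whole game, exactly like a score sheet reproduces a chess game:

import random

def simulate(seed, commands):
    rng = random.Random(seed)          # every random roll flows from one seed
    state = {"hp": 100, "x": 0}
    for cmd in commands:
        if cmd == "move":
            state["x"] += 1
        elif cmd == "attack":
            state["hp"] -= rng.randint(1, 6)   # same seed -> same damage rolls
    return state

commands = ["move", "attack", "move", "attack"]
live = simulate(seed=42, commands=commands)
replay = simulate(seed=42, commands=commands)  # re-run from the recorded log
assert live == replay                          # bit-identical outcome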
Roma timezone (Italy)
Last edited by HellGauss#6525 on Nov 20, 2013, 5:37:48 PM
"
Shagsbeard wrote:
This thread has gotten quite silly.

I can't be bothered to check... but has anyone compared GGG to Nazi Germany yet?


No? Just comparisons with the US Patriot Act.


On the topic: desync will always be a problem because of the double-check system, which I really prefer to other systems. I died too many times in Diablo out of nowhere thanks to their system, for example. As far as I understand it, it's a rounding problem with numbers, and GGG has to test different methods of rounding the numbers to remove desync as much as possible for us players.
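
To illustrate the rounding point with a toy example (mine, not GGG's actual numbers): floating-point accumulation can drift, while fixed-point integers cannot, which is one reason deterministic engines often avoid floats:

pos = 0.0
for _ in range(10):
    pos += 0.1                 # ten 0.1-unit movement ticks
print(pos == 1.0)              # False: pos is 0.9999999999999999

# Fixed-point alternative: integer milli-units cannot drift, whatever
# the CPU or compiler does with floats.
pos_fp = 0
for _ in range(10):
    pos_fp += 100              # 0.1 units = exactly 100 milli-units
print(pos_fp == 1000)          # True

Different hardware and compiler optimizations can disagree about the last bits of the float version; they can never disagree about the integer one.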

I am quite interested in how objects in a world can create desync by default. Or is that even possible?


And I have to say I was able to adopt a playstyle that lets me play safe 99% of the time, and the 1% only happens if I play sloppy. But this is of course not an option for the majority, I understand. I GET IT! :)
Why you should try Hardcore http://www.pathofexile.com/forum/view-thread/209310/page/1
@tipptoe

Great example.

Let's analyze it from a single-player perspective:

Current implementation:
it is actually a mess. The client does not know what happened and must ask the server how much damage was dealt. Some of the server's answers come back with 5 seconds of delay.

Deterministic implementation:
the client has all the inputs needed to perform its own computation and to display what happens. It sends its commands to the server, which checks that everything is fine, even if with a few seconds of delay.

Multiplayer:
you cannot actually avoid a little bit of desync. However, with proper design, and using the GGPO p2p framework [ http://ggpo.net/ ], the deterministic logic can be extended to multiplayer. This would lead to a 'desync' of at most the maximum lag in the party, not the 5 seconds of the PoE server. Again, the server could check the integrity of all the gameplay with a few seconds of delay.
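
To sketch what that would look like, here is a stripped-down rollback loop in the GGPO style (my simplification of the technique, with made-up names; not PoE or GGPO source): each peer runs the same deterministic step, predicts missing remote inputs, and rewinds plus re-simulates when a late input proves the guess wrong.

def step(state, inputs):
    # One deterministic frame: identical inputs keep all peers in lockstep.
    return state + sum(inputs)

snapshots = {0: 0}     # frame -> saved state, kept so we can rewind
predictions = {}       # frame -> the remote input we guessed

def advance(frame, local_input, guessed_remote=0):
    # Render immediately, using a guess for the missing remote input.
    predictions[frame] = guessed_remote
    snapshots[frame + 1] = step(snapshots[frame], (local_input, guessed_remote))

def on_remote_input(frame, local_input, actual_remote):
    # The real input arrives one network-lag later. Wrong guess? Roll back.
    if predictions[frame] != actual_remote:
        snapshots[frame + 1] = step(snapshots[frame], (local_input, actual_remote))
        # ...then re-simulate every frame after 'frame' the same way.

advance(0, local_input=1)                           # predict the peer idled
on_remote_input(0, local_input=1, actual_remote=2)  # they attacked: rewind
print(snapshots[1])                                 # 3, the corrected state

The visible error is bounded by how far ahead of the slowest peer you have simulated, which is why the 'desync' under rollback tops out at the worst lag in the party.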
Roma timezone (Italy)
"
HellGauss wrote:

1) I've proven with some clear examples that the current system can be 'cheated' with an automated game assistant.

2) So is this 2001 paper about determinism in AoE fake?
http://www.gamasutra.com/view/feature/3094/1500_archers_on_a_288_network_.php

1) I'm referring to network security, not controller input "cheats".

2) That's what you consider relevant technology? A turn-based RTS whose simulators "must always have the exact same input, and can really only run as fast as the slowest machine can process the communications, render the turn, and send out new commands". There's nothing more to debate; you're simply out of your depth and presuming to speak authoritatively about engineering issues outside of your field of academic expertise.
Better if guys could use speedhacks, as long as I won't have desyncs... pff. Anyway, it's a solo game and I don't care about them so much.
"
ScrotieMcB wrote:
"
Mysterial wrote:
Partly right, but the actual problem with those kinds of attacks is that the game doesn't couple them with a locational update, which is what pretty much every action game currently available does. In fact, pretty much all movement of any kind, for all entities, is not synchronized at all unless the server detects something so horrible that it can't possibly be close. (Hence the "I got snapped into another room" issues.)
I know you might disagree with this, but the GGG philosophy on this is that short "teleports" break immersion. If the client simulation is known to be divergent from the server gamestate, but is still within a reasonable degree of tolerance, the GGG policy is to avoid automatic resynching on the premise that a teleport is a jarring experience, and the client simulation is close enough to allow the risk of not resynching until/unless the situation diverges even further.

To be honest, I kind of agree with that. It does break immersion, and although I notice a lot of minor desyncs when I'm really paying attention, the game corrects them without needing to teleport me; and if I really cared, I'd have /oos on a macro.

The main problem I see with this is that, instead of giving the client this information and letting it decide what to do with it (perhaps avoiding the teleport), the server keeps the information to itself. There is a difference between giving the client information and giving the player information, and I see no reason not to perform the former.
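
As a toy model of that tolerance policy (my reading of the description above, with an invented threshold; this is not GGG code):

import math

SNAP_THRESHOLD = 3.0   # invented tolerance, in arbitrary distance units

def maybe_resync(client_pos, server_pos):
    error = math.dist(client_pos, server_pos)
    if error > SNAP_THRESHOLD:
        return server_pos, True    # hard snap: the jarring teleport
    # Within tolerance: keep the client's smooth, locally-predicted position.
    return client_pos, False

pos, snapped = maybe_resync((10.0, 4.0), (10.5, 4.2))
print(pos, snapped)                # (10.0, 4.0) False: small drift tolerated

Handing the client the error value costs nothing; the only open question is what the client, or the player, gets to do with it.
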
I'm honestly surprised that no one responded with shock and anger. In any case, let me elaborate on this situation, because it's something of a paradigm shift for me since the last time I talked at length about the desync problem, and I feel the need to clarify past inaccuracies. Specifically, this regrettable quote:
"
ScrotieMcB wrote:
The only way to improve sync is to reduce latency, allowing the client to receive the full game state more often.
What my former position was ignorant of was the difference between prediction and resynchronization. These are completely separate functions.

Imagine if the client had a near-perfect stream of very current gamestate data from the server, such that no information was ever more than two ping-trips old. This would be the "ideal" situation according to my previous quote, but let's toss in a curveball: the client doesn't predict anything whatsoever. The result would actually be borderline unplayable: monsters would continuously make short-distance desync teleports (as they are resynched with the new data), players would be resynched the same way (because they'd try to attack monsters who are in slightly different locations), and the whole thing would take on a choppy animation style instead of smooth, flowing combat. Essentially, it wouldn't be a game without desync; it would be a game with constant desync which is constantly getting fixed every fifth of a second. Not no rubberbanding, but all rubberband, all the time.

Resynchronization is not desync prevention; it is desync correction. Proper client prediction of the gamestate is desync prevention. Thus, the primary cause of desync is not netcode per se, but predictive code on the part of the client.

Thus, the focus of correction is on how to correct and when to do so (and the answer isn't necessarily "as soon as any deviation is detected"), and the focus of prediction is on how to put yourself in a good situation so you never even have to ask yourself such questions.

Read this next paragraph. Seriously.

The one idea of qwave's which is genuinely good is the idea of making predictions deterministic through use of a shared random seed, ensuring that both client and server pull the same random numbers in the same order. However, applying these shared random numbers to combat calculations is unnecessary; in most cases, over the duration of a skill animation, the client will have time to receive damage/accuracy numbers from the server. Instead, this should be applied to any random processes in monster decision-making trees, ensuring that monster skill use and overall behavior is deterministic. If a Rhoa has a random chance to charge you, both server and client should have that random number available in advance, thanks to the shared seed, so the Rhoa's charge begins on the client just as quickly as a character's left-click skill would; the animation should not have to wait a full round trip for the server to initiate the charge.

Making monster behavior deterministic, yet still pseudorandom, would not be very exploitable by hackers, but it would prevent desync by keeping the client simulation a consistent one ping-time ahead of the server's authoritative copy of the gamestate. Monsters would be less laggy and more fluid, and combat would feel less disjointed.
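
A minimal sketch of that shared-seed idea, with hypothetical names and probabilities (nothing here is actual PoE code):

import random

CHARGE_CHANCE = 0.3    # assumed per-decision probability, purely for show

class RhoaBrain:
    def __init__(self, shared_seed):
        # The server hands this seed to the client at spawn; from then on
        # both sides pull the same numbers in the same order.
        self.rng = random.Random(shared_seed)

    def decide(self):
        return "charge" if self.rng.random() < CHARGE_CHANCE else "walk"

server_brain = RhoaBrain(shared_seed=987654321)
client_brain = RhoaBrain(shared_seed=987654321)

for _ in range(5):     # every decision matches, with zero added latency
    assert server_brain.decide() == client_brain.decide()
print("client and server agree on all five decisions")

Because the stream is seeded per monster and consumed in lockstep, the client can start the charge animation the instant its own simulation reaches that decision, a full round trip before a server-initiated charge could be displayed.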

In terms of everything else qwave has suggested — swapping prediction and master roles between client and server, byzantine validation schemes, etc. — there isn't much useful to pluck out of it; maybe there is, but I haven't found it yet, and I really do listen.

I sincerely hope GGG reads this post (especially that one paragraph), because unlike most of the netcode-centric ideas to somehow improve desync, this is a rare example of a quality suggestion which can actually prevent it, not just correct it.
When Stephen Colbert was killed by HYDRA's Project Insight in 2014, the comedy world lost a hero. Since his life model decoy isn't up to the task, please do not mistake my performance as political discussion. I'm just doing what Steve would have wanted.
Last edited by ScrotieMcB#2697 on Nov 20, 2013, 6:57:35 PM
