Originally Posted by Ch13f121
If done right, prediction can level the playing field between ping times.
Remember how people used to gripe because Gears of War gave you an edge based on how close you were to the host box? GoW doesn't have prediction; it uses client-side hit detection.
So people with near-zero pings could shoot and hit exactly what they were aiming at, while people with higher pings couldn't and had to lead their targets. It pretty much ruined the game. I think they fixed it in GoW 2 or 3.
Timegate's Section 8 had this problem as well. You could be right in front of a guy and would have to lead him if your ping was higher than his.
Source actually rewinds and replays a certain number of frames to determine whether a bullet hits or not. It's not client vs. client, it's client vs. server, and if the server says you hit, you hit. Centralized hit detection at the server is a lot fairer for everyone. It's not perfect, and given the craziness of network connections I don't think you'll ever have rock-solid netcode, but you can try to make it as fair as possible.
EDIT: If you have 150+ ping on a server then you really shouldn't be on that server in Source. Things get really...weird.
What you are describing is interpolation. The advantage you're referring to in the client-vs-server case comes down to interpolation and tick rate, i.e. how many times per second the server simulates the world and processes the commands clients have sent; that's where the server determines who did what first (which is another issue with Source, but save that for another time).
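Just to pin down what a "tick" is, here's a toy fixed-tick server loop in C++. None of it is Source's actual code; the 64-tick number and the names are just assumptions for illustration:

Code:
// Tiny sketch of a fixed-tick server loop (numbers assumed, not Source's).
// Inputs are queued as they arrive and processed in order once per tick,
// which is where "who did what first" actually gets decided.
#include <cstdio>
#include <queue>

struct UserCmd { int playerId; float move; };   // one tick's worth of input

int main()
{
    const float tickInterval = 1.0f / 64.0f;    // e.g. a 64-tick server
    std::queue<UserCmd> incoming;               // commands received since last tick
    incoming.push({1, 1.0f});
    incoming.push({2, -1.0f});

    float serverTime = 0.0f;
    for (int tick = 0; tick < 3; ++tick)        // run a few ticks for illustration
    {
        // Apply whatever arrived, in arrival order (in a real server new
        // commands would keep coming in between ticks).
        while (!incoming.empty()) {
            UserCmd cmd = incoming.front(); incoming.pop();
            std::printf("tick %d (t=%.3f): player %d moves %.1f\n",
                        tick, serverTime, cmd.playerId, cmd.move);
        }
        serverTime += tickInterval;             // advance the simulation one tick
    }
    return 0;
}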
Entity interpolation causes a constant view "lag" of 100 milliseconds by default (cl_interp 0.1), even if you're playing on a listenserver (server and client on the same machine). This doesn't mean you have to lead your aiming when shooting at other players since the server-side lag compensation knows about client entity interpolation and corrects this error.
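For illustration, here's a minimal sketch of what that entity interpolation boils down to: the client buffers snapshots and renders remote players about cl_interp seconds in the past, blending between the two snapshots that straddle that time. The structs, numbers and snapshot spacing here are made up, not Valve's:

Code:
// Minimal entity-interpolation sketch, not Valve's actual code.
#include <cstdio>
#include <deque>

struct Snapshot { float time; float x, y; };    // one remote player's recorded position

float Lerp(float a, float b, float t) { return a + (b - a) * t; }

// Pick the two snapshots that straddle (now - interpDelay) and blend them.
bool InterpolatePosition(const std::deque<Snapshot>& buf, float now,
                         float interpDelay, float* outX, float* outY)
{
    float renderTime = now - interpDelay;       // e.g. now - 0.1s for cl_interp 0.1
    for (size_t i = 1; i < buf.size(); ++i) {
        const Snapshot& from = buf[i - 1];
        const Snapshot& to   = buf[i];
        if (from.time <= renderTime && renderTime <= to.time) {
            float t = (renderTime - from.time) / (to.time - from.time);
            *outX = Lerp(from.x, to.x, t);
            *outY = Lerp(from.y, to.y, t);
            return true;
        }
    }
    return false;   // not enough snapshots buffered yet
}

int main()
{
    // Assume snapshots arrive every 50 ms while the target strafes right.
    std::deque<Snapshot> buf = { {0.00f, 0, 0}, {0.05f, 5, 0}, {0.10f, 10, 0} };
    float x, y;
    if (InterpolatePosition(buf, /*now=*/0.125f, /*interpDelay=*/0.1f, &x, &y))
        std::printf("rendered position: %.1f %.1f\n", x, y);   // 2.5 0.0, i.e. lagging behind
    return 0;
}

That built-in ~100 ms of "being behind" is exactly the error the server-side lag compensation accounts for when it checks your shots.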
Prediction is the notion of the client predicting the effects of the local player's actions without waiting for the server to confirm them. An entity's predicted state is tested against server commands as they arrive until either a match or a mis-match is detected.
In the vast majority of cases the client's prediction is confirmed by the server and it continues happily as if there was no latency. If there is a mis-match, which is rare if the prediction code is written correctly, then the client goes back and re-simulates all of the commands it ran with bad data. Depending on the severity of the error this can cause a noticeable hitch in the player's position and state, and possibly the state of the world too.
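Here's a rough sketch of that predict / confirm / re-simulate loop. The toy 1D player and MoveStep() are my assumptions, not Source's actual movement code, but the shape of it (run commands immediately, keep them pending, replay them from the server's authoritative state when an update arrives) is what the paragraph above describes:

Code:
// Rough client-side prediction sketch with rollback, assuming a toy 1D player.
#include <cstdio>
#include <vector>
#include <cmath>

struct UserCmd   { int sequence; float move; };   // one tick of input
struct PlayerPos { float x; };

// Shared movement code: it has to behave identically on client and server,
// otherwise predictions constantly mis-match.
PlayerPos MoveStep(PlayerPos p, const UserCmd& cmd) { p.x += cmd.move; return p; }

struct PredictedClient {
    PlayerPos predicted {0};
    std::vector<UserCmd> pending;     // commands the server hasn't confirmed yet

    // Run the command locally right away instead of waiting a full round trip.
    void RunCommand(const UserCmd& cmd) {
        predicted = MoveStep(predicted, cmd);
        pending.push_back(cmd);
    }

    // Server reports the authoritative state after processing command ackSequence.
    void OnServerUpdate(int ackSequence, PlayerPos authoritative) {
        // Drop everything the server has already processed.
        while (!pending.empty() && pending.front().sequence <= ackSequence)
            pending.erase(pending.begin());

        // Re-simulate the still-unconfirmed commands on top of the server's state.
        PlayerPos resim = authoritative;
        for (const UserCmd& cmd : pending)
            resim = MoveStep(resim, cmd);

        // If this differs from what we showed the player, we have to snap to it:
        // that correction is the hitch / rubber band you see when prediction fails.
        if (std::fabs(resim.x - predicted.x) > 0.001f)
            std::printf("prediction error, correcting %.2f -> %.2f\n",
                        predicted.x, resim.x);
        predicted = resim;
    }
};

int main()
{
    PredictedClient client;
    client.RunCommand({1, 1.0f});
    client.RunCommand({2, 1.0f});
    // Server disagreed about command 1 (say we were actually blocked by a wall).
    client.OnServerUpdate(1, PlayerPos{0.5f});
    std::printf("final predicted x: %.2f\n", client.predicted.x);   // 1.50
    return 0;
}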
The bolded part describes perfectly what the YouTube video in my other post showcases.
Prediction, in a sense, is a system of checks and balances. You send an input, the server verifies it and sends back a packet; if there's a mismatch, the client has to correct the error and re-simulate its inputs. That's why in some cases you get rubber banding even with decent pings.
Then on top of that you have lag compensation, which ties in closely with packets and prediction.
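To make that concrete, here's a back-of-the-envelope sketch of lag compensation: when a shot arrives, the server rewinds the target to roughly where the shooter saw it, i.e. about (shooter latency + interpolation delay) in the past, and runs the hit test there. The names, numbers and the nearest-entry lookup are mine, not Source's:

Code:
// Back-of-the-envelope lag compensation sketch, not Source's actual code.
#include <cstdio>
#include <cmath>
#include <deque>

struct HistoryEntry { float time; float x, y; };   // recorded once per server tick

// Find the target's recorded position closest to the rewound timestamp.
HistoryEntry PositionAt(const std::deque<HistoryEntry>& history, float t)
{
    HistoryEntry best = history.front();
    for (const HistoryEntry& h : history)
        if (std::fabs(h.time - t) < std::fabs(best.time - t))
            best = h;
    return best;
}

bool ShotHits(const std::deque<HistoryEntry>& targetHistory,
              float serverTime, float shooterLatency, float interpDelay,
              float aimX, float aimY, float hitRadius)
{
    // Rewind to the world the shooter was actually looking at when they fired.
    float rewoundTime = serverTime - shooterLatency - interpDelay;
    HistoryEntry then = PositionAt(targetHistory, rewoundTime);
    float dx = aimX - then.x, dy = aimY - then.y;
    return dx * dx + dy * dy <= hitRadius * hitRadius;
}

int main()
{
    // Target has been strafing right at 10 units/s, one history entry per 15 ms tick.
    std::deque<HistoryEntry> history;
    for (int i = 0; i <= 20; ++i)
        history.push_back({i * 0.015f, i * 0.15f, 0.0f});

    // Shooter with 60 ms latency and 100 ms interp aims where they SAW the target.
    bool hit = ShotHits(history, /*serverTime=*/0.30f, /*shooterLatency=*/0.06f,
                        /*interpDelay=*/0.1f, /*aimX=*/1.35f, /*aimY=*/0.0f,
                        /*hitRadius=*/0.2f);
    std::printf(hit ? "hit\n" : "miss\n");   // hit, even though the target is at x = 3.0 "now"
    return 0;
}

That rewind is also why you sometimes die after you've already ducked behind cover: on the shooter's rewound timeline you were still exposed.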
At the end of the day, though, I agree with you: no netcode will ever be perfect, at least for now. Where we disagree is that the Source engine, IMO, is far from "good" or even acceptable. I play CS:GO competitively and deal with its crappy phenomena day in, day out, so it's quite frustrating at times.
https://developer.valvesoftware.com/wiki/Interpolation