BitFlipper wrote:

*snip*

That is only true if the game follows the old-style implementation, where it waits for a round trip to the server before displaying the result of a local action. Modern games are good at hiding latency. Here is an example:

Let's say there is 200 ms of round-trip latency to the game server. Let's also say two players are facing each other with shotguns, and whoever shoots first kills the other. Now if player A presses the fire button 50 ms before player B, then player A should win, right? What really happens is that on both screens it looks like both players fired their guns, because each local game responds immediately to the input (or as fast as the local system allows). Then, as far as the server is concerned, player A fired first, so player B loses. So even though player B sees their own gun fire, they still lose as if they hadn't fired at all.
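
Roughly, the split looks like this. Purely a hypothetical sketch (the function names, message format, and timestamp handling are made up; a real server would use its own clock or lag compensation rather than trusting client timestamps), but it shows the idea: the client reacts to the button press immediately, and the server alone decides who actually died.

```python
# Hypothetical sketch of the shotgun duel; names, timings, and the message
# format are invented for illustration, not taken from any real engine.

def client_on_fire_pressed(player, press_time_ms, outgoing):
    """Client side: respond to the input immediately (prediction), then report it."""
    print(f"{player}: muzzle flash and sound played locally at {press_time_ms} ms")
    outgoing.append({"player": player, "fire_time_ms": press_time_ms})

def server_resolve_duel(fire_events):
    """Server side: the arbiter. The earliest fire event wins; the rest die."""
    first = min(fire_events, key=lambda e: e["fire_time_ms"])
    dead = [e["player"] for e in fire_events if e is not first]
    return first["player"], dead

# Both players see their own gun fire, but A pressed 50 ms before B.
events = []
client_on_fire_pressed("A", 1000, events)
client_on_fire_pressed("B", 1050, events)

winner, losers = server_resolve_duel(events)
print(f"Server verdict: {winner} wins, {losers} lose despite seeing their own shot")
```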

What you are describing is just prediction by the local client as to what might happen, but to maintain a consistent universe there must still be one arbiter, the server ... most likely a single authoritative instance of the networked game.
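
Something like the following, purely as an illustration of "one arbiter" (the structure and names are hypothetical): the server holds the only state that counts, applies inputs in the order it decides is correct, and the clients snap to whatever it broadcasts, even when their local prediction showed something else.

```python
# Hypothetical sketch of an authoritative server: one official game state;
# client predictions are cosmetic until this state confirms or overrides them.

class FakeClient:
    def __init__(self, name):
        self.name = name
    def receive(self, official_state):
        # Discard the local prediction and adopt the server's verdict.
        print(f"client {self.name} now shows: {official_state}")

def server_tick(state, pending_fire_events, clients):
    # Apply fire events in the order the server considers correct...
    for event in sorted(pending_fire_events, key=lambda e: e["fire_time_ms"]):
        shooter = event["player"]
        if state[shooter] != "alive":
            continue  # dead players can't shoot, whatever their client predicted
        for target in state:
            if target != shooter:
                state[target] = "dead"
    # ...then broadcast the one official outcome to everybody.
    for client in clients:
        client.receive(dict(state))

state = {"A": "alive", "B": "alive"}
events = [{"player": "B", "fire_time_ms": 1050}, {"player": "A", "fire_time_ms": 1000}]
server_tick(state, events, [FakeClient("A"), FakeClient("B")])
# Both clients end up showing A alive and B dead, regardless of what each
# predicted locally.
```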