Originally Posted by Z Overlord
Well 30 fps on a 60Hz LCD looks laggy because it's not perfectly synced right? (30/60). That's why movies are interpolated so it doesn't look laggy. Games can't do that or else there'd be input lag.
Please correct me if I'm wrong and explain to me this confusion
It's kinda correct. The problem with fps is that it is never stable. You see, first of all, using the term "FPS" is misleading, because what gives you playability is not "how many frames you have in a second" but "what kind of delay you have between frames".
Think about it for just a second: if we have 60 frames during one second, one assumes they will be evenly spaced and, as such, we will get one frame every ~16 milliseconds (ms from now on). The problem is that it is rarely the case: you could very well have (talking about ms between frames) 5-5-5-5-5-5-5-5...........-200. You'd still have 60 frames in that same second, but with a huge pause in the middle of the scene...and there your playability goes in the bin.
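To make that concrete, here's a minimal sketch (made-up numbers, hypothetical helper names) comparing two frame-time traces that both average "60 FPS" but feel completely different to play:

```python
# Two traces of per-frame delays, in milliseconds. Both sum to ~1000 ms,
# so both report "60 FPS", but one has a 200+ ms stall in it.
even_trace  = [1000 / 60] * 60          # one frame every ~16.7 ms
spiky_trace = [13.5] * 59 + [203.5]     # 59 fast frames, then a long stall

def fps(frame_times_ms):
    """Average frames per second over the whole trace."""
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

def worst_frame(frame_times_ms):
    """Longest gap between two frames: what you actually feel as a stutter."""
    return max(frame_times_ms)

print(fps(even_trace), worst_frame(even_trace))    # ~60 FPS, worst gap ~16.7 ms
print(fps(spiky_trace), worst_frame(spiky_trace))  # ~60 FPS, worst gap 203.5 ms
```

Same FPS counter reading, wildly different experience, which is why frame-time graphs tell you more than an FPS number.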
So, this is the key difference between movies and games. Movies normally run at 24fps (which gets doubled for other reasons) and the rate is 100% stable, so unless you go looking for it you'll find it totally smooth. A game's playability, instead, depends on its engine: Crysis, to name one, is quite playable at low framerates, whereas plenty of other titles aren't even at 60-70+ fps. And this is why.
I hope I was clear enough. If I wasn't, I'll try to put it in other words.