Originally Posted by MrBungle
it goes like this. the phosphors on a CRT start fading the instant the electron beam passes over them, so even a still image has to be redrawn at a certain minimum refresh rate or the screen visibly flickers. LCD pixels hold their state until they receive a new instruction, so a still image on an LCD looks the same at 1Hz as it does at 75Hz.
refresh rate only becomes an issue on an LCD once there's motion on screen, like in a video game, and at that point the issue is the same as on a CRT.
film runs at 24 fps. television video is transmitted at 30 fps, but each frame is shown twice, so the set is effectively refreshing at 60 Hz. you might conclude from this that 60 fps is plenty for any video game. it isn't, because film naturally blurs anything that moves too fast for its frame rate to capture, while a video game renders every frame as a perfectly sharp image. pause a DVD while a person runs across the screen and the person will appear blurred. pause a video game while a character runs across the screen and you'll see a crisp image. some newer games add motion blur, but this isn't the norm yet.
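to put rough numbers on that (my own made-up example, the screen width and speed are just assumptions, not from anywhere official): an object crossing the screen in half a second jumps a lot further between frames at 24 fps than at 60, and a game draws each of those positions razor-sharp instead of smearing them the way film does.

# rough sketch, made-up numbers just to illustrate the point
screen_width_px = 1920            # assumed display width
crossing_time_s = 0.5             # object crosses the whole screen in half a second
speed_px_per_s = screen_width_px / crossing_time_s   # 3840 px/s

for fps in (24, 30, 60, 85):
    jump = speed_px_per_s / fps   # how far the object moves between two frames
    print(f"{fps:>3} fps -> {jump:6.1f} px jump per frame")

# film smears that whole jump across each exposure, so it reads as blur;
# a game draws the object sharp at every position, so fast motion turns
# into visible hops unless the frame rate is high enough.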
therefore, if you have fast-moving action in a video game, your eyes might notice some jerkiness at 60Hz, whether you're on a CRT or an LCD monitor. most studies conclude that your eyes won't notice much improvement above roughly 72 fps, because an image briefly persists on the retina. the human eye has its limits.
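just for the frame-time arithmetic (my sketch, and it assumes the game's fps matches the refresh rate):

# frame time at a few refresh rates, assuming fps == refresh rate
for hz in (60, 72, 75, 85):
    print(f"{hz:>2} Hz -> {1000.0 / hz:5.2f} ms per frame")

# 60 Hz is ~16.7 ms between frames, 85 Hz is ~11.8 ms; the gap your eye
# has to bridge during fast motion shrinks as the rate goes up, which is
# why 85 can still look smoother even if ~72 is supposedly the limit.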
again, sorry for no links, but i've had this discussion a million times. i'm sick of looking it all up.
It's all nice on paper...but when I can see the difference...THEN what?
I get headaches at 75Hz...that's over 72...
I used to play on a CRT at 85Hz... and I could tell the difference between 70 FPS and 90 (even if it's capped at 85), because 90 is a lot smoother.
Sure, it's nice in theory... but to a trained eye, one that's used to a computer screen, I'm sure it's a whole different story.
PS - Cog, you're my idol. Please, become a teacher or something, and mentor young pretty girls to be like you?