If you want to test your monitor and your eyes, open Counter-Strike: Source and type fps_max 20 in the console.
Play for 5 minutes.
Then increase in jumps of 20 fps.

At around 100-120 fps you should no longer see as big a difference as you did in the first few bumps.

60 fps is the minimum for FPS games. 90 will be smoother, and 100-120 is about the maximum smoothness you'll get unless you grab a 240Hz+ monitor, and even then you won't notice much more. You will notice a difference, but it won't be like going from 20 to 200 fps.
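One way to see why the gains taper off: the time each frame spends on screen shrinks by less and less as the framerate climbs. A quick sketch of that arithmetic (plain frame-time math in Python, nothing game-specific):
Code:
# Frame time at each framerate step. The absolute gain per step
# shrinks as fps rises, which is why 20 -> 40 fps feels huge
# while 100 -> 120 fps feels subtle.
for fps in (20, 40, 60, 80, 100, 120, 200):
    print(f"{fps:3d} fps = {1000 / fps:5.1f} ms per frame")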
 
You don't need a 120Hz monitor to notice the difference in gameplay from 60fps to 100fps. I play CS 1.6 and the input lag is very noticeable at 60fps, even though my screen has a 75Hz @ 640x480 refresh rate and my eyes only notice 37. As someone above said, at 100fps you'll have a smoother experience (mostly in FPS games).
 
Quote:
Originally Posted by Sony Oengui

You don't need a 120Hz monitor to notice the difference in gameplay from 60fps to 100fps. I play CS 1.6 and the input lag is very noticeable at 60fps, even though my screen has a 75Hz @ 640x480 refresh rate and my eyes only notice 37. As someone above said, at 100fps you'll have a smoother experience (mostly in FPS games).
Input lag has nothing to do with framerate; it has to do with v-sync. With v-sync disabled you will get the same input lag at 60 fps as you do at 100 fps. The only reason the game is jumping to 100 fps and you are getting reduced input lag is that you disabled v-sync.

Disable v-sync and cap the framerate to 60 fps if the game allows it. Anything beyond 60 fps on a 60Hz monitor is wasted resources; all it does is make the video card work harder without any visual difference. Anyone who claims to see a visual difference between 60 fps and 100 fps on a 60Hz monitor is just plain wrong, because it's not physically possible. That's like saying a 60 gig drive holds 100 gigs of data... (don't start talking about compression, you know what I mean).

Since you have a 75Hz monitor, 75 fps should be the cap. If you can't cap it and you get input lag, then you either have to waste resources to reduce input lag by disabling v-sync, or keep v-sync enabled and live with the input lag.
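For what it's worth, the cap itself is conceptually simple: render a frame, then wait out the rest of the frame budget. A minimal Python sketch of the idea, with a hypothetical render_frame() standing in for the game's actual work (games implement this internally, e.g. fps_max in Source):
Code:
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms per frame

def render_frame():
    pass  # placeholder for the game's actual simulation + rendering

for _ in range(600):  # ~10 seconds at 60 fps
    start = time.perf_counter()
    render_frame()
    # Sleep away whatever is left of the budget, so the GPU isn't
    # burning power on frames a 60Hz monitor can never display.
    leftover = FRAME_BUDGET - (time.perf_counter() - start)
    if leftover > 0:
        time.sleep(leftover)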
 
You guys need to separate input lag from the actual monitor display. If your monitor is 60Hz with v-sync, more than 60 fps won't make the display any smoother; without v-sync you actually get tearing.

Running without v-sync, however, will make your mouse movement feel more responsive, with less input lag.
 
Discussion starter · #25 ·
Quote:
Originally Posted by Phantom_Dave

Input lag has nothing to do with framerate; it has to do with v-sync. With v-sync disabled you will get the same input lag at 60 fps as you do at 100 fps. The only reason the game is jumping to 100 fps and you are getting reduced input lag is that you disabled v-sync.
Disable v-sync and cap the framerate to 60 fps if the game allows it. Anything beyond 60 fps on a 60Hz monitor is wasted resources; all it does is make the video card work harder without any visual difference. Anyone who claims to see a visual difference between 60 fps and 100 fps on a 60Hz monitor is just plain wrong, because it's not physically possible. That's like saying a 60 gig drive holds 100 gigs of data... (don't start talking about compression, you know what I mean).
Since you have a 75Hz monitor, 75 fps should be the cap. If you can't cap it and you get input lag, then you either have to waste resources to reduce input lag by disabling v-sync, or keep v-sync enabled and live with the input lag.
Wow, OK, thanks, that really explains a lot. I didn't know what input lag was before, and what you just explained is exactly what is happening to me. So I guess my problem is input lag.
But how can I keep my FPS at 60 and not get input lag? I'd have to disable v-sync, but that will make my FPS go all over the place and stress my GPU and CPU.
Quote:
Originally Posted by Riou

You guys need to separate input lag from the actual monitor display. If your monitor is 60Hz with v-sync, more than 60 fps won't make the display any smoother; without v-sync you actually get tearing.
Running without v-sync, however, will make your mouse movement feel more responsive, with less input lag.
True, thanks.
 
Probably yes

Sent from my Nexus S 4G using Tapatalk
 
For Nvidia, set triple buffering to On and set frames rendered ahead to 0 or 1. Test it out; it works for most FPS games I play where I need v-sync because of the lack of an FPS limiter.
 
Quote:
Originally Posted by Aparition

For Nvidia, set triple buffering to On and set frames rendered ahead to 0 or 1. Test it out; it works for most FPS games I play where I need v-sync because of the lack of an FPS limiter.
Same; I force v-sync on in the Nvidia panel and set triple buffering, and I've never noticed any input lag.
 
Discussion starter · #29 ·
Quote:
Originally Posted by Aparition

For Nvidia, set triple buffering to On and set frames rendered ahead to 0 or 1. Test it out; it works for most FPS games I play where I need v-sync because of the lack of an FPS limiter.
Quote:
Originally Posted by Xeio

Same; I force v-sync on in the Nvidia panel and set triple buffering, and I've never noticed any input lag.
I have an AMD card. I went into CCC and enabled triple buffering, but I don't think it works for DirectX, because it's listed under OpenGL Settings. Anyway, I'll run BF3 and test it out.
 
I get horrible input lag in CS:S at 60Hz with v-sync, but at 120Hz with v-sync there is very minimal input lag. I'm talking microscopic. It still feels a little off, but it's definitely not the same kind of off that I experience at 60Hz with v-sync.
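That matches the arithmetic: with v-sync, a finished frame can wait up to one full refresh before being shown, so the worst-case added delay is one refresh interval. A rough Python sketch of just that term (ignoring the game's own frame time and any extra buffering):
Code:
# One refresh interval is the most v-sync should hold a finished frame.
delay_60 = 1000 / 60    # ~16.7 ms at 60Hz
delay_120 = 1000 / 120  # ~8.3 ms at 120Hz
print(f"60Hz worst case:  {delay_60:.1f} ms")
print(f"120Hz worst case: {delay_120:.1f} ms")
print(f"difference:       {delay_60 - delay_120:.1f} ms")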
 
Not sure about AMD cards, as I have only used Nvidia cards for the past few generations. Here is how to force triple buffering.

http://www.tweakguides.com/NVFORCE_9.html

See the 2nd page of that guide for an easier method of forcing triple buffering in games using RivaTuner. Do note that triple buffering uses a lot of VRAM.
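To put a rough number on the VRAM cost: each buffer is one full-resolution color surface, and triple buffering keeps three of them alive. A back-of-the-envelope Python sketch, assuming 32-bit color and no anti-aliasing:
Code:
# Rough framebuffer cost: width x height x 4 bytes (32-bit color).
width, height, bytes_per_pixel = 1920, 1080, 4
one_buffer_mb = width * height * bytes_per_pixel / 1024**2
print(f"one buffer:      {one_buffer_mb:.1f} MB")      # ~7.9 MB
print(f"triple buffered: {3 * one_buffer_mb:.1f} MB")  # ~23.7 MB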
 
I think how many "frames" a person sees (in quotes because the human eye doesn't really see frames) depends on the person's eyes. I think more frames are important in FPS games because, say you're turning: that in itself is a fast-paced on-screen movement, and the higher the framerate, the more game information you see during the turn. I personally usually use v-sync because I play a lot on my 37" Vizio and I'd rather not have noticeable tearing, although for some games I turn it off. For example, I turn it off in BF3, mostly because I'm usually at the front and the action is tense enough that I don't notice tearing a great deal. For games like "Dear Esther", though, I turn it on, because the game has a much slower pace than BF3.
 
I have a 60Hz monitor, and I always turn v-sync off. At 60 FPS there is always input lag, but when it's over 75 there is none. And screen tearing doesn't bother me, so I don't need to waste any more money on a 120Hz monitor.
 
Quote:
Originally Posted by drbaltazar

Humans can't detect past 60 hertz, so it is very likely the culprit is something deeper, like .NET or other software. I have been using .NET 4 since it was available, and MS found bugs because someone's system reported them. I thought those reports went straight to the trash, but it looks like they are reverse-engineering bugs nowadays, since with the two fixes everything is way smoother. On average, though, I recommend a clean install, and don't use tweaks MS doesn't speak of. If they don't mention a server tweak for W7, odds are you would lose out. If you can't get info about an XP tweak on W7, don't apply it. Ask on the channel Q&A show with Gov Maraj; these guys eat technology for breakfast.
The eye doesn't work that way, man. It doesn't see in fps.
Sent from my Galaxy Nexus using Tapatalk
 
Quote:
Originally Posted by mwayne5

I think how many "frames" a person sees (in quotes because the human eye doesn't really see frames) depends on the person's eyes. I think more frames are important in FPS games because, say you're turning: that in itself is a fast-paced on-screen movement, and the higher the framerate, the more game information you see during the turn. I personally usually use v-sync because I play a lot on my 37" Vizio and I'd rather not have noticeable tearing, although for some games I turn it off. For example, I turn it off in BF3, mostly because I'm usually at the front and the action is tense enough that I don't notice tearing a great deal. For games like "Dear Esther", though, I turn it on, because the game has a much slower pace than BF3.
I would say the same thing, but the human eye technically sends electrical impulses to the brain. So wouldn't it really be sending single images at a time (frames), just at an extremely high rate that the brain interpolates into fluid movement?
 
Quote:
Originally Posted by PappaSmurfsHarem

I would say the same thing, but the human eye technically sends electrical impulses to the brain. So wouldn't it really be sending single images at a time (frames), just at an extremely high rate that the brain interpolates into fluid movement?
Yes and no; the brain also interprets images and creates "frames" which we understand. Ever think you saw something, or fail to "see" something directly in front of you? The eyes create those electrical impulses, which are more of a stream of data.

What the whole FPS argument really comes down to is whether the human eye can tell a difference beyond what a 60Hz monitor outputs, give or take.
 
Quote:
Originally Posted by Fantasy

Wow, OK, thanks, that really explains a lot. I didn't know what input lag was before, and what you just explained is exactly what is happening to me. So I guess my problem is input lag.
But how can I keep my FPS at 60 and not get input lag? I'd have to disable v-sync, but that will make my FPS go all over the place and stress my GPU and CPU.
True, thanks.
The only ways I know of are what people have already posted here, and some games will let you set a maximum fps. Beyond that I have no idea; it's never been a problem for me.

As for the debates over how much the human eye can actually see, take a look at this thread (someone posted this in another thread here).

It seems that nobody really knows what the limit of human vision is, but it appears to go well beyond 120Hz if you want to look at it from that perspective. Even I can still see a little bit of choppiness at 120Hz in games. If the human eye truly couldn't detect anything past 120Hz, it would look fluid, correct?

This is from a comment in that thread, but it covers a lot. You'll likely not read it all; I didn't.


Edit: Here's the source for the comment with the missing media:
http://www.100fps.com/how_many_frames_can_humans_see.htm
Quote:
How many frames per second can the human eye see?

This is a tricky question, and much of the confusion about it comes from the fact that it is NOT the same question as:
How many frames per second do I need to make motion look fluid?

And it's not the same as:
How many frames per second make the movie stop flickering?

And it's not the same as:
What is the shortest frame a human eye would notice?

Test 1: Smoothness of motion

Imagine yourself watching a movie of unbelievably slow-moving fog. You don't see edges or sharp borders. Now play the movie at 10fps. It will look fluid. Why? Because the difference from one frame to the next is very small. The extreme case would be a totally unmoving wall: then 1 fps would equal 1000 fps.

Now take your hand and move it slowly in front of your face. Then move it faster until it's blurry. How many frames per second do you see? It must be few, because you see only a blurred hand without being able to distinguish every change per millisecond; but it must be many, because you see a fluid motion without any interruption or jump. So this is the eye's trick in both examples: blurring simulates fluidity, sharpness simulates stuttering. (It's similar to "rotation simulates gravity".)

Motion blur example 1: capture from a live performance of The Corrs' "What Can I Do" at MTV Unplugged.

Motion blur example 2: capture from "Basic Instinct", where you see a woman plunging an ice pick into a man's body while sitting on him.

The fact is that the human eye perceives typical cinema film motion as fluid at about 18fps because of its blurring.

If you could see your moving hand very clearly and crisply, your eye would need to take more snapshots of it to make it look fluid. If you had a movie with 50 very sharp and crisp images per second, your eye would make out lots of detail from time to time and you would have the feeling that the movie was stuttering.

Also 25fps, but without motion blur: footage from the BBC's story about Ed Gein, the murderer whose case inspired Hitchcock to make "Psycho" and Jonathan Demme to make "Silence of the Lambs". The music is from CNN's "Marketmakers" (0.52 MB).

Just think of modern games: have you ever played Quake at 18fps? There is no motion blur in those games, so you need many more frames per second.

However, you do see the spots and the dirt of single frames in a cinema film, don't you? And those movies are played at 24fps. So there is a difference between seeing motion as fluid and seeing that something (dirt) is there at all. Read on.

Test 2: Sensitivity to darkness

Imagine you are looking at a shining white wall. Now the wall turns totally black for 1/25th of a second. Would you notice? You surely would. 1/50th of a second? Maybe, but it's harder. 1/100th of a second? Very difficult. Think of 100Hz TV sets: they are called flicker-free because at flicker rates of 100 times per second you stop noticing the blackness of the screen, even though the screen isn't shining all the time but pulsating 100 times per second. Brightness eats darkness.

Take "Test 1: Smoothness of motion" again. You have a fluid film at 24 fps. The film roll has to roll through the projector, and to keep you from seeing it roll, the picture has to be blacked out while the film advances. You would have to blacken the screen 24 times per second, but 24 black moments are too visible. Thus you would have smooth motion but flicker.
The solution is to show each frame 3 times and make the screen black 3 times per frame. This makes the black moments shorter and more frequent: "triple the refresh rate". So you see about 72fps in the cinema, where every 3 consecutive frames are the same. Strange solution? A solution from an analog world, and an example of how "brightness eats darkness".

Test 3: Sensitivity to brightness

Let's do the opposite of the "Sensitivity to darkness" test and talk about how sensitive the eye is to brightness.

Imagine yourself in a very dark room. You have been there for hours and it's totally black. Now a light flashes right in front of you, say as bright as the sun. Would you see it if it lasted only 1/25th of a second? You surely would. 1/100th of a second? Yes. 1/200th of a second? Yes. Tests with air force pilots have shown that they could identify a plane in a picture flashed for only 1/220th of a second.

And that is identifying. So it's pretty safe to say that recognizing that SOME light was there is possible at 1/300th of a second. Now if you take into consideration that you have two eyes with different angles and different areas of sensitivity (you probably know that you see TV flicker best not when you look directly at the screen, but out of the corner of your eye), and that you can move/rotate/shake your head and eyes to a different position, you would probably need flashes as short as 1/500th of a second to make sure nobody sees them in any case.

Now, what happens if I flash you for 1/500th of a second, once per second, for 365 days, directly into your eye? Would you feel something strange? Would it feel different than without it? Would you notice that something is wrong?

So we should add a safety margin to make sure nobody sees ANYTHING, even unconsciously, and feels comfortable about it.

Maybe the industry didn't add enough of a safety factor to CDs, and that's why many people still feel that analog is sometimes better. It's like being in a room full of neon lights: you just know that something isn't right.

The reason for the results of Test 2 and Test 3 is afterimages. Bright light creates an afterimage in the eye, the same way you see light in your eye seconds AFTER the doctor has shined a light into it. This afterlight makes it possible to see what was there seconds ago. The brightness of the cinema screen produces such afterimages and thus helps the movie appear flicker-free.

So the question "How many frames do I need to make the movie flicker-free?" (i.e., to not see the blackness between the frames: about 70-100 fps) doesn't answer the question "How short can a bright image be and still be seen?" (the air force question), and that doesn't answer the question "How short can a (not bright) image be and still be seen?"

So the conclusion is: to make movies/virtual reality perfect, you'd have to know what you want. To have a perfect illusion of everything that can flash, blink, and move, you shouldn't go below 500 fps.

Think of this, too:

If your screen refreshes at 85Hz and your game runs at 50Hz (=50fps), are you sure you don't need to synchronize them? Are you sure you don't need to play at a multiple of 85 to enjoy synchronized refresh updates? The game may run better at 85fps than at 100fps. Maybe even a TFT display would be better: it displays only about 40fps, but progressively.

Even though single eye cells (rods and cones) may have limitations due to their chemical reaction times and their distance from the brain, you cannot be sure how they interact, complement one another, or synchronize. If one cell can perceive 10fps, two cells may be able to perceive 20fps by complementing one another. So don't confuse "the human eye" with "the cell".

Some eye cells react only when a stimulus is moving. Some react when it's moving from A to B, some when it's moving from D to Z. This may complicate frame-based simulation of reality.

Motion of your body can alter how you perceive. Do you get headaches after watching three movies in the cinema in a row? Maybe that's because you didn't move along with the filmed motion. This is the reason for front-seat passengers' queasiness (somebody else moved the car) and for seasickness (the sea moved the ship suddenly). Maybe this is why 3D gaming glasses will never work perfectly. And this has nothing to do with frame rates.

Looking straight ahead (with the center of your eyes) is not the same as seeing with the sides of your eyes. The sides are more sensitive to brightness and to flickering.
Sensitivity to blue is different from sensitivity to green: you see green best, even when it's dark, e.g. leaves in a forest at night. So "blue frames per second" may differ from "green frames per second".

Do you like to play Quake? Do you think "more is better"? Maybe that's why you think 200fps is better than 180fps.
Do you think movement in 3D games stutters? Maybe your mouse scans motion with too few dpi (dots per inch) or fps (frames per second).
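Two of the numbers in that wall of text are easy to check yourself. The cinema's "triple the refresh rate" trick is just multiplication; in Python:
Code:
film_fps = 24          # distinct images per second on the reel
flashes_per_frame = 3  # each image is flashed 3 times
print(film_fps * flashes_per_frame, "flashes per second")  # 72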
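And the 85Hz-vs-50fps point can be made concrete too: when the game isn't synchronized to the screen, each new frame waits for the next refresh, so some frames stay up for one refresh and some for two. A small Python sketch of that uneven cadence (assuming a new frame exactly every 1/50th of a second):
Code:
# 50 new frames/s on an 85Hz screen: each frame appears at the first
# refresh after it's ready, so frame lifetimes alternate between
# 1 and 2 refreshes; that uneven rhythm is what reads as stutter.
refresh_hz, game_fps = 85, 50
shown_at = [int(i * refresh_hz / game_fps) + 1 for i in range(game_fps + 1)]
durations = [b - a for a, b in zip(shown_at, shown_at[1:])]
print(durations)  # a mix of 1s and 2s, not a steady beat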
 
Discussion starter · #39 ·
Yeah, people really piss me off when they say the human eye can't see beyond 37 FPS or 60 FPS or whatever FPS. That is just not true. It's an excuse console players use for not having to play most games at 60 FPS.
 
Quote:
Originally Posted by Onions

I thought the eye cannot see anything over like 36 or 37 fps... I personally don't notice a difference between 40 fps and 120 fps...
Same here... I've always wondered what's so great about 100fps+, which most people want to achieve. It's either for their e-peen or my eyes just suck. I do think I have good eyesight, but anything at 40-50 fps is great for me, since beyond that I just don't see any improvement in anything.

To the post above: I really don't see any difference, yet I do try to stay at 60+ fps since everyone wants to be there.
 