
Monitors and FPS - Page 4

post #31 of 34
Important note: I was talking about VSYNC ON, not VSYNC OFF, when I was comparing the frame-repeat patterns 2,2,2,2,2,2 versus 1,3,2,1,3,2. Original post now edited.
Quote:
Originally Posted by Mark Rejhon 
2. It sometimes stutters more for VSYNC ON. Instead of repeating frames 2,2,2,2,2,2x each, it can repeat in a sequence like 3,1,2,3,1,2x. More judder.
This is the corrected line, pointing out VSYNC ON.
Quote:
Originally Posted by fateswarm View Post

In my practical viewing experience, higher Hz is always better -- not because it won't judder (it will), but because I think the fragmentation will be done in more pieces.
Correct. If the framerate is higher.
That's also what I was saying -- more tearlines, but each tearline has a smaller offset.

You have one error: you meant "higher framerate", not "higher Hz", because the number of fragments is proportional to framerate, NOT to Hz (if you're counting fragments the same way as tearlines). There are always X tearlines per second at a specific X frames per second. Occasionally a tearline is hidden offscreen (past the bottom edge of the screen, i.e. it occurred in the blanking interval by chance), but as a rule of thumb, there are never more tearlines per second than frames per second.

An important consideration: people DO NOT notice the fragments themselves. But they DO notice the tearlines between frame fragments. Higher Hz means tearlines are shown for shorter time periods (e.g. 1/120 second instead of 1/60 second). Higher framerate means the tearlines (for horizontal motion) have smaller horizontal offsets.
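To put rough numbers on both effects, here's a minimal back-of-envelope sketch in Python (the 2000 px/sec pan speed is just an illustrative assumption, not a measurement):
Code:
# Tearline visibility scales with 1/Hz; tear offset scales with 1/framerate.

def tearline_duration_sec(refresh_hz):
    # A tearline persists until the next refresh repaints that scanline.
    return 1.0 / refresh_hz

def tear_offset_px(motion_px_per_sec, framerate_fps):
    # Horizontal offset at a tearline = distance travelled in one frametime.
    return motion_px_per_sec / framerate_fps

for hz in (60, 120):
    print(f"{hz} Hz: tearline visible for ~{tearline_duration_sec(hz) * 1000:.1f} ms")

for fps in (60, 120, 300):
    print(f"{fps} fps @ 2000 px/sec pan: ~{tear_offset_px(2000, fps):.0f} px offset")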
Quote:
To explain it even more practically: out of sync on a 60 Hz display might, on a certain occasion, look like the image is cut into 2 pieces, with the top part far from what the bottom part is doing.
Yes, and there's a tearline in the middle.
Quote:
But at higher Hz, the fragmented pieces will be more numerous and it will look smoother, even if, of course, still not perfect.
Not higher Hz.
Higher fps.
Quote:
Hence I firmly believe the higher the Hz the better at any FPS, and hence the reason why nobody vsynced with CRTs.
Actually, I still VSYNC'd on CRTs for solo gaming, since it looked much better. I remember reading posts on FidoNet and Usenet in the 1990s from people who still hated tearing in GLQuake (the 3dfx Voodoo days), and the VSYNC ON-vs-OFF wars were already happening back then.
Quote:
edit: Of course, all this applies strictly without vsync.
I agree.
Quote:
edit: Also I think you have it wrong about duplication there. On vsync, it might duplicate 60 FPS on 120 Hz
No, I wasn't -- I was talking specifically about VSYNC ON when discussing full-frame repeats... and you agree I was correct here.
Quote:
but when it gets to not using vsync, it does not sync at all!
...And I agree, with one correction to one of your errors:
Quote:
And with higher Hz, the more 'fragments' there are, and hence the higher the chance of a smoother image.
Number of fragments is proportional to frames per second, not to Hz. VSYNC OFF at 180fps@60Hz still produces 180 frame fragments per second (an average of three spliced-together frames per refresh), and 180fps@120Hz still produces 180 frame fragments per second (an average of 1.5 spliced-together frames per refresh). (Note that some frame fragments overlap adjacent refreshes -- the same frame can cover the bottom part of one refresh and the top part of the next. I count this as one fragment, since it's associated with one frame and one tearline -- the defect "thing" that the human eye sees.)
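As a quick sanity check on the arithmetic, a minimal sketch (nothing measured here, just the proportionality argument from the paragraph above):
Code:
# Fragments (= tearlines) per second track framerate; only the average
# number of splices per refresh depends on the refresh rate.
def fragments_per_refresh(fps, hz):
    return fps / hz

for fps, hz in [(180, 60), (180, 120)]:
    print(f"{fps} fps @ {hz} Hz: {fragments_per_refresh(fps, hz):.1f} fragments/refresh, "
          f"{fps} tearlines/second either way")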

Note that tearlines can also recur in the same location across multiple frames. For example, if you fps_max at 120, you might get a nearly-stationary or slowly-moving tearline position, making the tearline extremely noticeable and annoying: 120 frames per second at 120 Hz can result in 120 tearlines per second that land in approximately the same position in each refresh. Since fps_max is not perfectly synchronized with the refresh rate (which may actually be more like 119.9Hz or 120.1Hz), the tearline actually "rolls" upwards or downwards. If you use Adaptive VSYNC, the driver works to keep the tearline off the edge of the screen, trying to keep it synchronized with the blanking interval (like VSYNC ON); if a frame takes longer to render, the next tearline will likely land in the visible frame area (like VSYNC OFF).
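Here's a rough sketch of why the tearline "rolls" -- the roll rate is simply the beat between the frame cap and the true refresh rate (the 119.9 Hz figure is just the example from the paragraph above):
Code:
# The tearline's vertical position is the scanout phase at frame-present time.
# With fps_max not perfectly locked to the panel, that phase drifts each frame.

def tearline_positions(fps, hz_actual, n_frames, height_px=1080):
    positions = []
    for i in range(n_frames):
        t = i / fps                    # when frame i is presented
        phase = (t * hz_actual) % 1.0  # fraction of the way down the scanout
        positions.append(phase * height_px)
    return positions

# fps_max 120 on an actual 119.9 Hz panel: beat = 0.1 screen-heights/sec,
# so the tearline takes ~10 seconds to roll through the entire screen.
print([f"{p:.1f}" for p in tearline_positions(120, 119.9, 4)])
print(f"roll rate: {abs(120 - 119.9):.1f} screens/sec")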

.........

Summary:
-- People DO NOT notice fragments directly. People DO notice the tearlines. Let's focus on those.
-- The number of tearlines isn't dependent on refresh rate. It's dependent on frame rate.
-- The number of tearlines equals the number of frames. Higher framerate means more tearlines (but each is harder to see).
-- Higher Hz means tearlines are shown for shorter time periods (e.g. 1/120 second instead of 1/60 second). That's why they're harder to see at higher Hz -- they're shown more briefly, even if the framerate is the same (e.g. tearing is less visible at 60fps@120Hz than at 60fps@60Hz).
-- Higher framerate means smaller horizontal offsets at each tearline (for horizontal motion). Once your framerate is sufficiently high, they become harder to see, and beyond a certain (insane) framerate, they stop being noticeable altogether.

If there are further disagreements with this, I will begin creating a Blur Busters Blog entry with graphics that illustrate the principles.
Edited by mdrejhon - 6/17/13 at 6:56am
post #32 of 34
Quote:
Originally Posted by mdrejhon View Post

Skylit
[snip]
Very interesting analysis about computer mice.

I wonder if there is a mouse analysis program to test mouse accuracy.
It'd say: "Please move the mouse from left to right very slowly but as consistently as possible, for 5 seconds." Then the software would analyze the mouse movement consistency (jerkiness in position updates). You'd do maybe 10 or 20 runs at various settings, and it'd generate graphs showing how much the mouse fluctuated, and which DPI settings ended up being the most accurate. The error averages would make specific DPI settings stand out.

One thing I want to see with a very good (proper) 500Hz or 1000Hz mouse: whether an increased-sensitivity mode (4x higher DPI), when mathematically divided by 4, still yields accuracy similar to the original one-quarter DPI setting. Increased DPI amplifies errors, but if it's done properly -- good mouse, good math, drivers, game logic -- the error amplification is only proportional to the DPI increase, so you retain the original physical error (rather than worsening the signal-to-noise ratio). In that case, the extra DPI is far more likely to be pure bonus (frosting).
In really bad situations (bad mice, bad math, etc.), the errors amplify faster than the DPI (e.g. errors triple when DPI doubles).

Are there any mouse-movement analysis programs like this that I could try out?
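(Illustrating the quoted idea: a minimal hypothetical sketch of the analysis half of such a program -- the sample format and the jerkiness metric are both made up:)
Code:
import statistics

# Hypothetical analysis pass: given (timestamp_sec, dx_counts) samples captured
# during a "move slowly and steadily" run, score how jerky the tracking was.
def jerkiness(samples):
    velocities = []
    for (t0, _), (t1, dx1) in zip(samples, samples[1:]):
        dt = t1 - t0
        if dt > 0:
            velocities.append(dx1 / dt)  # counts/sec between reports
    # Coefficient of variation: higher = less consistent tracking.
    return statistics.stdev(velocities) / abs(statistics.mean(velocities))

# Fake run: perfectly steady 1000 counts/sec at 500 Hz polling scores 0.0.
run = [(i / 500.0, 2) for i in range(2500)]
print(f"jerkiness score: {jerkiness(run):.3f}")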


About a program: there isn't really anything out there.

There is a program that measures IPS tracking speed, but it's subject to inconsistency and limited by a ton of possible variables. What's more, one could trick the program if they wanted.

This is already done to an extent. Current hardware is rather limited in terms of pixel-by-pixel coverage. One of the major designs (used across 3 companies; 2 exited and transferred/licensed/sold the IP) has been updated and in circulation since 2003. DPI was increased by cutting and multiplying the native matrix.**

In recent times, what's perceived as "smoothing" has become semi-noticeable ("semi" because it really depends on hardware combinations and latency, as we were discussing above). It might not be smoothing entirely, per some design choices, but the sensors aren't as raw as older designs of said architecture. The same design used to have another algorithm, called angle snapping, which became widely disliked among gaming crowds. If given the choice, I do think I would choose the snapping version for raw response, but maybe I'm a little hardcore. :)

This also happens at the MCU level with a variety of mice, where, let's say, a 6400 DPI mouse can be divided down to 800: basic interpolation.
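For instance, here's a minimal sketch of the kind of divide-down an MCU might do (this is a generic remainder-carrying divider, not any vendor's actual firmware):
Code:
# Divide 6400 CPI deltas down to 800 CPI (factor of 8) without losing
# sub-count motion: carry the remainder into the next report.
class CountDivider:
    def __init__(self, factor):
        self.factor = factor
        self.remainder = 0

    def step(self, delta_counts):
        out, self.remainder = divmod(delta_counts + self.remainder, self.factor)
        return out

div = CountDivider(8)
raw = [5, 5, 5, 5, 5, 5, 5, 5]            # 40 counts at 6400 CPI
out = [div.step(d) for d in raw]
print(out, "->", sum(out), "counts at 800 CPI (40 / 8 = 5, no motion lost)")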

What's become popular talk among gaming circles is "native" resolution. I feel I've played a significant role in educating others about this concept, and with that we collectively discovered (with friends who play at a high/professional level) that interpolating resolutions creates a delayed, but not obviously delayed, feel. It just doesn't feel "right", if that makes sense. I have a theory that this could be altered or perceived differently if a fast ARM Cortex controller were used.

I might be confusing you, as there are both native array and native resolution counts inherent to the read-only memory of a specific design. Technically, expanding beyond the default array shouldn't be considered native, but the delay issue with MCU recalculation comes into play. "Native counts" is widely understood among engineers in this industry regardless.

With that said, all of this is also technically meaningless. I mean, I'm positive such technology can be perceived in another light, rather than what's been established for so long. Delay vs CPI, for example.

** https://www.youtube.com/watch?v=lc7JVjcPzL0 -- Logitech engineer explaining many concepts for their "science wins" campaign.

There's a lot more he could have covered -- or at least, I wanted him to cover stuff I actually have no knowledge of. :(
Edited by Skylit - 6/17/13 at 7:28am
post #33 of 34
Quote:
Originally Posted by dihartnell View Post

I'm no expert, but yeah, that's how I understood it. The monitor refresh limits the number of frames per second that are actually displayed.
Not whole frames, no.

However, with VSYNC OFF, fragments of each frame are displayed. For example, at 180 frames per second, fragments from three different frames can be 'spliced' together in the same refresh, with two or three tearlines displayed during that refresh. (There can be more than one tearline per refresh if two frames render on the GPU in less time than one monitor refresh.) That's how 180 frames can partially display themselves on a 60Hz monitor -- fragments of frames -- with a "tearline" (a skew artifact during fast movement) dividing each frame fragment.
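To make the splicing concrete, here's a toy scanout model (idealized: perfectly uniform 180 fps frametimes, zero blanking interval) showing which frame feeds each scanline of one 60 Hz refresh:
Code:
# Which rendered frame supplies each scanline of a single refresh?
FPS, HZ, HEIGHT = 180, 60, 1080

current = None
for y in range(HEIGHT):
    # Newest frame finished by the time scanline y is scanned out:
    frame = (y * FPS) // (HEIGHT * HZ)
    if frame != current:
        if current is not None:
            print(f"tearline at scanline {y}: frame {current} -> frame {frame}")
        current = frame
# Prints tearlines at scanlines 360 and 720 -- three fragments per refresh.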
Edited by mdrejhon - 6/17/13 at 6:46am
post #34 of 34
Quote:
Originally Posted by Skylit View Post

About a program: there isn't really anything out there.

There is a program that measures IPS tracking speed, but it's subject to inconsistency and limited by a ton of possible variables. What's more, one could trick the program if they wanted.

This is already done to an extent. Current hardware is rather limited in terms of pixel-by-pixel coverage. One of the major designs (used across 3 companies; 2 exited and transferred/licensed/sold the IP) has been updated and in circulation since 2003. DPI was increased by cutting and multiplying the native matrix.**
Fascinating analysis. You certainly know far more about mouse internals than I do, and have a better understanding of how accuracy can degrade. It appears there are direct hardware-based pros and cons to raising the mouse DPI. Obviously, it depends on all sorts of variables (hardware, software, framerate used, VSYNC ON versus VSYNC OFF, capped or uncapped framerate, etc.), but it seems to be an open question which variables dominate in which situations...

(e.g. in which situations a far lower polling rate -- 125 Hz, or maybe 250 Hz -- results in a better-connected feel than 500 Hz or 1000 Hz)
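One toy way to frame that open question numerically (a pure assumption -- this noise model doesn't describe any real sensor's behavior):
Code:
import random

# If sensor error (in counts) grows only in proportion to DPI, dividing the
# counts back down recovers low-DPI accuracy; if it grows faster, it doesn't.
random.seed(1)
path = [i * 0.001 for i in range(1000)]   # true positions, in inches

def mean_error_inches(dpi, noise_counts):
    errs = [abs(round(x * dpi + random.gauss(0, noise_counts)) / dpi - x)
            for x in path]
    return sum(errs) / len(errs)

print(f"800 DPI, 0.5-count noise:        {mean_error_inches(800, 0.5):.6f} in")
print(f"3200 DPI, 2-count noise (good):  {mean_error_inches(3200, 2.0):.6f} in")
print(f"3200 DPI, 8-count noise (bad):   {mean_error_inches(3200, 8.0):.6f} in")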
Edited by mdrejhon - 6/17/13 at 6:53am