So, concerning input lag caused by NVIDIA driver settings: three things I wanted to ask about, to find out whether there is any consensus on them:

1. Scaling options

The option "Display - No Scaling" is said to result in the least possible input lag, but why is that, and is it true regardless of the type of monitor and connection (VGA/DVI/HDMI) one is using? Or are there monitors with inferior scaling, or even without any inherent scaling function? Can letting certain monitors do the scaling take more time than letting the GPU do it? What about using an EDID hack, i.e. removing the connector pins responsible for timing and scaling communication?
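
To put the numbers in perspective, here is a quick back-of-the-envelope calculation (the buffered-frame counts are hypothetical examples, not measurements of any specific monitor): a scaler that buffers a full frame before output adds one refresh period of lag.

```python
def scaler_latency_ms(refresh_hz, buffered_frames):
    # Each full frame a scaler buffers before output adds one
    # refresh period of delay on top of the scaling itself.
    return buffered_frames * 1000.0 / refresh_hz

# Hypothetical worst case: a monitor scaler buffering one full frame at 60 Hz
print(scaler_latency_ms(60, 1))   # ~16.7 ms
# The same one-frame buffer at 144 Hz costs far less
print(scaler_latency_ms(144, 1))  # ~6.9 ms
```

So whether monitor-side scaling is "slower" than GPU scaling mostly comes down to whether the scaler processes lines on the fly or buffers whole frames, which varies between monitors.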

2. Maximum pre-rendered frames

NVIDIA at some point removed the option to set the pre-render maximum to 0 frames, stating that "0" is illogical, as at least 1 frame always has to be pre-rendered by the CPU. Yet people noticed an increase in input lag with the setting at "1". Is it possible that, since 1 frame always has to be pre-rendered for the GPU to be able to work at all, the driver setting essentially refers to "additional" pre-rendering? As in, setting it to 1 means that, in addition to the necessary frame, another one is pre-rendered by the CPU?
Also, are there known registry entries to force this option back to "0"? Or alternatively, is there any way to find out the pre-rendering behaviour of specific DirectX games? Seeing as the application's own settings can also be used, and some games expose cvars to customize pre-rendering, this could be another way to force the lowest possible amount. How was pre-rendering handled in OpenGL games?
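
As a toy illustration of why each queued frame matters (this is a simplified model I made up, not NVIDIA's actual pipeline): when the game is GPU-bound, every frame the CPU is allowed to queue ahead sits in the render queue for one GPU frame-time before it reaches the screen, so input sampled at submission ages by roughly queue depth times frame time.

```python
from collections import deque

def average_queue_latency(max_prerendered, gpu_frame_ms, frames=200):
    """Toy model: the CPU submits frames instantly into a render-ahead
    queue of the given depth; the GPU needs gpu_frame_ms per frame.
    Returns the mean submit-to-complete latency (ms) in steady state."""
    queue = deque()           # submission timestamps of in-flight frames
    cpu = gpu_free = 0.0
    latencies = []
    for _ in range(frames):
        if len(queue) >= max_prerendered:
            s = queue.popleft()                     # GPU finishes oldest frame
            gpu_free = max(gpu_free, s) + gpu_frame_ms
            latencies.append(gpu_free - s)
            cpu = max(cpu, gpu_free)                # CPU unblocks
        queue.append(cpu)                           # submit the next frame
    steady = latencies[len(latencies) // 2:]        # skip warm-up frames
    return sum(steady) / len(steady)

print(average_queue_latency(1, 10.0))  # ~10 ms: one frame in flight
print(average_queue_latency(3, 10.0))  # ~30 ms: two extra queued frames
```

Under this model, a queue depth of 1 is the floor (that one frame always has to exist), which would line up with the idea that "1" in the driver is already the minimum and anything above it is the "additional" pre-rendering.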

3. Framerate limit

Most games offer cvars to cap the framerate at specific values, but more than a few have been shown to produce inconsistencies, frame jumping, or stuttering. Would it be more sensible to use the graphics driver's own frame limiter (via NVIDIA Inspector, for instance) to cap the framerate?
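
For reference, here is a minimal sketch of what a well-behaved limiter has to do (a hypothetical implementation, not how NVIDIA Inspector's limiter or any particular game's cvar actually works): pace against absolute deadlines so overshoots don't accumulate, and spin the last couple of milliseconds because plain sleeps routinely wake up late, which is one plausible source of the stutter seen with naive in-game caps.

```python
import time

def run_frame_limited(target_fps, frames, work=lambda: None):
    """Run `work` once per frame, paced against absolute deadlines
    so small sleep overshoots don't drift into stutter."""
    frame_time = 1.0 / target_fps
    deadline = time.perf_counter()
    stamps = []
    for _ in range(frames):
        work()
        deadline += frame_time
        # Coarse sleep, then spin the last ~2 ms for precise pacing.
        while (remaining := deadline - time.perf_counter()) > 0:
            if remaining > 0.002:
                time.sleep(remaining - 0.002)
        stamps.append(time.perf_counter())  # "presentation" time of this frame
    return stamps
```

Whether a driver-level limiter beats an in-game one would then come down to where in the pipeline the wait happens, since limiting before input sampling and limiting after it have different lag implications.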