I think I fit my entire question into the title, but if not:
When a game or app is set to run at a lower resolution than native, how do I know whether DirectX will pick the custom, overclocked resolution over the normal one?

I have Vsync disabled everywhere and all that. I have a Qnix 2710, a GTX 780, and a heavily modded Skyrim install, and I want to take better advantage of my monitor's overclock because I'm getting sick of the opposite problem (low FPS) at 1440p. I have Vsync disabled in the *.ini file and I set the resolution there too. I'm using the NVIDIA control panel to create custom resolutions, but those don't replace the standard 60 Hz resolutions, which are still listed as well. I just don't trust DirectX to pick the overclocked resolution if Windows isn't already running it, and running Windows at a lower resolution is a silly workaround.
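For what it's worth, the only way I've found to see which modes actually get exposed to applications is to dump them with a little Win32 program. This is just a rough sketch, and the 1280x720 @ 96 Hz figures are only an example of what my custom mode might look like, not anything pulled from the game:

Code:
// Rough sketch: list the display modes Windows exposes to applications.
// If the NVCP custom resolution (e.g. 1280x720 @ 96 Hz, example numbers)
// shows up here, a Direct3D game can at least enumerate and request it;
// if only 60 Hz appears at that size, there is no overclocked mode to pick.
// Build (example): cl /EHsc listmodes.cpp user32.lib
#include <windows.h>
#include <cstdio>

int main() {
    DEVMODE dm = {};
    dm.dmSize = sizeof(dm);
    for (DWORD i = 0; EnumDisplaySettings(nullptr, i, &dm); ++i) {
        // Filter to 720p to keep the output short; drop the check to see every mode.
        if (dm.dmPelsWidth == 1280 && dm.dmPelsHeight == 720) {
            std::printf("%lux%lu @ %lu Hz\n",
                        dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency);
        }
    }
    return 0;
}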

I thought scaling from 1440p down to 720p would be pretty straightforward and crisp (but blocky, yes), since it's a literal four-to-one pixel mapping (each 720p pixel covers an exact 2x2 block of physical pixels), but it is blurry like any other non-native resolution on any other monitor. I assume non-hardware scaling would destroy my FPS, since the game is going to be even more CPU-limited at that resolution to begin with; is that right? In that case I should just run 1080p, but my initial question still stands, since I intend to do pretty much everything else at 1440p.

edit: The game launcher, where normal people set the resolution, says 720p should be letterboxed, but it is not. The .ini file the launcher modifies gets reset whenever I use it, which breaks my mods, so I keep it read-only; I just opened it up and typed the resolution in by hand (relevant section below). There is nothing about refresh rate in there. Should the game be using the refresh rate I already have set for the native resolution?
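For reference, this is roughly what the [Display] section ends up looking like after my hand edit. Setting names are from the stock SkyrimPrefs.ini as best I can tell, and iPresentInterval is the vsync toggle the modding guides say to add (it may actually belong in Skyrim.ini rather than this file), so treat the exact keys as approximate:

Code:
[Display]
; resolution typed in by hand
iSize W=1280
iSize H=720
bFull Screen=1
; vsync off, per the usual modding guides (may belong in Skyrim.ini instead)
iPresentInterval=0
; note: there is no refresh-rate key anywhere in this section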