Originally Posted by EmoPopsicle
Originally Posted by xandypx
I'm not sure where you are seeing a setting for interlaced vs. progressive.
Using a DVI>VGA adapter, you have an analog signal, so it's important to make sure that you have configured the monitor itself (in its menu) for the same resolution, 1440x900, or sometimes "native" works. Does your monitor not have a digital input?
I'm seeing it in the Nvidia control panel -> screen resolution -> create custom resolution.
The weird thing is, in the Nvidia control panel, it says that the native resolution is 1024x768 when my monitor is telling me that it's 1440x900. Could that pose a problem? When I try 1024x768 the screen is too zoomed in and blurry.
And yeah, my monitor doesn't have a digital input. It's old as hell :[
Yea, the resolution of the signal sent from the graphics card should match that of the monitor. If it doesn't, the monitor either has to scale the signal to its native resolution to display it, or produce borders around the picture it displays.
Is the monitor a widescreen? If Windows is saying 1024x768, it's seeing, and sending, a 4:3 aspect ratio signal that a 1440x900 (16:10 aspect ratio) screen really has to stretch to fit. I can see where you are getting the distortion.
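Just to put numbers on that stretch (the resolutions are from this thread, the rest is plain arithmetic):

```python
# Quick arithmetic on why a 4:3 signal looks wrong on a 16:10 panel.
signal_w, signal_h = 1024, 768   # what the card is sending (4:3)
panel_w, panel_h = 1440, 900     # the panel's native pixel grid (16:10)

h_scale = panel_w / signal_w     # ~1.406: horizontal stretch factor
v_scale = panel_h / signal_h     # ~1.172: vertical stretch factor

print(f"horizontal x{h_scale:.3f}, vertical x{v_scale:.3f}")
print(f"geometry: drawn at {h_scale / v_scale:.0%} of correct width")
# ~120% -> everything comes out about 20% too wide, and since neither
# factor is an integer, the monitor's scaler also has to interpolate,
# which is exactly the blur you're describing.
```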
Have you tried just setting the screen's resolution in Windows' display options, rather than the Nvidia control panel? Windows shouldn't balk at setting whatever resolution you want. If you don't see 1440x900, uncheck the box to "hide resolutions the monitor cannot display".
The other possible solution is... Is there a "driver" available for your monitor? Although usually listed as a driver, it's not really a driver, but rather DDC/EDID information that Windows uses to determine the display's capabilities. With that .inf file loaded on your computer, possibly even Nvidia won't balk at the resolution you're trying to set.
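For the curious, Windows caches the EDID block it reads from the monitor in the registry, and that block is where the "native resolution" figure ultimately comes from. Here's a rough Python sketch that digs it out; the registry layout shown is the usual one but can vary per machine, so treat this as illustrative, not a supported tool:

```python
# Rough sketch (Windows-only): read the EDID blocks Windows has cached in
# the registry and pull the native resolution out of the first detailed
# timing descriptor. Illustrative only; error handling is minimal.
import winreg

def native_resolution(edid):
    # The first 18-byte detailed timing descriptor starts at byte 54 of
    # the base EDID block and normally describes the native mode.
    d = edid[54:72]
    h_active = d[2] | ((d[4] & 0xF0) << 4)  # horizontal active pixels
    v_active = d[5] | ((d[7] & 0xF0) << 4)  # vertical active lines
    return h_active, v_active

def cached_edids():
    base = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, base) as display:
        for i in range(winreg.QueryInfoKey(display)[0]):
            model = winreg.EnumKey(display, i)
            with winreg.OpenKey(display, model) as mkey:
                for j in range(winreg.QueryInfoKey(mkey)[0]):
                    instance = winreg.EnumKey(mkey, j)
                    try:
                        params = winreg.OpenKey(
                            mkey, instance + r"\Device Parameters")
                        edid, _ = winreg.QueryValueEx(params, "EDID")
                        yield model, edid
                    except OSError:
                        continue  # instance with no cached EDID

for model, edid in cached_edids():
    w, h = native_resolution(edid)
    print(f"{model}: native mode {w}x{h}")
```

If that reports 1024x768 for this panel, it would be consistent with Windows having a generic or stale EDID for the analog connection, which is exactly the situation a monitor .inf override is meant to fix.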
EDIT: Fixed... my spelling/typing is atrocious.