
1 - 15 of 15 Posts

·
Registered
Joined
·
607 Posts
Discussion Starter · #1 ·
In sum:

- Just bought a 2nd ASUS VS238H-P monitor
- Original one uses DVI-D to connect
- Sold out of DVI cables at work, so grabbed a DVI>HDMI instead
- Plugged into my 670 and the pictures look completely different, like night and day (even after resetting monitor settings)
- DVI looks much better to me
- Verified it was the HDMI cable by trying an HDMI>HDMI instead (still crap picture) and then a DVI>DVI cable (fixed everything)

Craziest part of this? While playing LoL on the monitor using DVI, I get a consistent 60fps. Same monitor using HDMI, ~30fps. Same test conditions.

What gives? I know my 670 can handle dual monitors no problem, and I only notice problems when using HDMI (which I verified again by using HDMI on the original monitor, which gave the same ****ty quality). I will be buying a DVI cable tomorrow to fix my problem, but now I am curious as to why this is happening. So, any thoughts, OCN?
 

·
Registered
Joined
·
1,663 Posts
Probably the cable is bad quality or doesn't have enough bandwidth.

You should try an HDMI cable from Monoprice instead; amazing cable quality for a low price.
 

·
Network Architect
Joined
·
2,706 Posts
Since HDMI and DVI use IDENTICAL video signals, with the exception being audio on HDMI, my only guess would be to check the underscan and overscan settings, but it almost sounds as if one of your ports is faulty...
 

·
Network Architect
Joined
·
2,706 Posts
Quote:
Originally Posted by Germanian View Post

Probably the cable is bad quality or doesn't have enough bandwidth.
A cable that costs 99 cents is just as good as one that costs 200 bucks. It's digital; as long as the bits get there, it should look the same.
 

·
Registered
Joined
·
779 Posts
Well, dual-link DVI has more "bandwidth" I think, since you need it to run a 120Hz monitor... but I don't think the OP's model is 120Hz, right? So, shoot, I'm not sure... make sure the overscan and scaling settings are correct. I know I have to do that for my HDMI-connected monitor sometimes, but never for my DVI- or DisplayPort-connected monitors.
 

·
Network Architect
Joined
·
2,706 Posts
Quote:
Originally Posted by flash2021 View Post

Well, dual-link DVI has more "bandwidth" I think, since you need it to run a 120Hz monitor... but I don't think the OP's model is 120Hz, right? So, shoot, I'm not sure... make sure the overscan and scaling settings are correct. I know I have to do that for my HDMI-connected monitor sometimes, but never for my DVI- or DisplayPort-connected monitors.
Yes, dual-link DVI does have more bandwidth, but AFAIK HDMI has very high bandwidth to begin with, and since it can easily run your run-of-the-mill 120Hz HDTV, it shouldn't be an issue.

Pulled from Wikipedia:

"Dual-link DVI

To support display devices requiring higher video bandwidth, there is provision for a dual DVI link. A dual link doubles the number of TMDS pairs, effectively doubling video bandwidth at a given pixel clock frequency."

HDMI 1.4 has a maximum bandwidth of 10.2 Gb/s if you include overhead...

And yes, HDMI devices have a tendency to not scale correctly without adjusting.
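
For the curious, here's a rough back-of-the-envelope check (a Python sketch assuming the standard CEA-861 timing for 1080p60; exact blanking numbers vary by mode):

Code:
# Back-of-the-envelope bandwidth check for 1080p60 over single-link DVI/HDMI.
# Assumes CEA-861 timing (148.5 MHz pixel clock); treat the numbers as rough.

pixel_clock = 148.5e6      # Hz for 1920x1080 @ 60Hz, blanking included
bits_per_pixel = 24        # 8 bits per channel, RGB
tmds_overhead = 10 / 8     # TMDS 8b/10b encoding overhead

video_rate = pixel_clock * bits_per_pixel    # ~3.56 Gb/s of video data
wire_rate = video_rate * tmds_overhead       # ~4.46 Gb/s on the wire

print(f"video data rate: {video_rate / 1e9:.2f} Gb/s")
print(f"TMDS wire rate:  {wire_rate / 1e9:.2f} Gb/s")

# Single-link DVI allows up to a 165 MHz pixel clock (~3.96 Gb/s of video
# data), and HDMI 1.4 tops out at 10.2 Gb/s on the wire, so a 60Hz 1080p
# monitor is comfortably within the limits of either cable.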
 

·
Registered
Joined
·
607 Posts
Discussion Starter · #7 ·
The monitor can only support 60Hz, so no problem there. The cable was a Monster HDMI cable (no, I would never pay retail; Best Buy discount), so it's not lacking in quality. I doubt it's the port, because I use the DVI out on my 670 and plug in through HDMI; I tried reversing it and still got the same problems. I'll check the overscan when I get home, but I doubt that would mess up the color palette as well.
 

·
Registered
Joined
·
1,018 Posts
The first thought I get when reading about something like this happening on an Nvidia card is the sad default RGB setting in the Nvidia control panel. By default, Nvidia uses the limited RGB range over HDMI, which suits most TVs at their default settings. When you connect a monitor instead, the result is a completely washed-out image with no deep colors and poor contrast. The remedy? Change the color format from RGB to YCbCr444. This is a really common problem and seems like the number one reason why some people complain about the image quality of Nvidia cards nowadays.
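
To illustrate why limited range looks so bad on a monitor, here's a quick Python sketch of the standard 16-235 remap (the driver does this in hardware, of course):

Code:
# Why limited-range RGB looks washed out on a PC monitor.
# "Limited" (TV) range squeezes 8-bit video into 16-235 instead of 0-255.

def full_to_limited(level: int) -> int:
    """Remap a full-range value (0-255) into limited range (16-235)."""
    return round(16 + level * (235 - 16) / 255)

print(full_to_limited(0))    # 16  -> pure black arrives as dark grey
print(full_to_limited(255))  # 235 -> pure white arrives as light grey

# A monitor expecting full-range input shows level 16 as grey rather than
# black, so contrast collapses and everything looks faded. Switching the
# output to YCbCr444 (or full-range RGB, where the driver exposes it)
# removes the mismatch.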
 

·
Not new to Overclock.net
Joined
·
78,992 Posts
Quote:
Originally Posted by specopsFI View Post

The first thought I get when reading about something like this happening on an Nvidia card is the sad default RGB setting in the Nvidia control panel. By default, Nvidia uses the limited RGB range over HDMI, which suits most TVs at their default settings. When you connect a monitor instead, the result is a completely washed-out image with no deep colors and poor contrast. The remedy? Change the color format from RGB to YCbCr444. This is a really common problem and seems like the number one reason why some people complain about the image quality of Nvidia cards nowadays.
How do you change to YCbCr444?
 

·
Not new to Overclock.net
Joined
·
78,992 Posts
Quote:
Originally Posted by specopsFI View Post

Edited to the previous post as a screen capture.
This must be a feature of some of the newer drivers. I don't have it and I'm still using 285.62
 

·
Banned
Joined
·
4,324 Posts
It is a well-known issue that graphics cards sometimes run games at 24Hz instead of 60Hz over HDMI. I had the same problem with my old HD 4850. I'm surprised this problem still exists with GTX cards.
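
If you want to double-check what refresh rate Windows is actually driving the display at, here's a quick sketch (Python with ctypes, Windows-only; it uses a truncated DEVMODE that covers just the display fields, which the API tolerates because dmSize is passed in):

Code:
# Query the current display mode via the Win32 EnumDisplaySettingsW API.
# Windows-only; run with the suspect monitor set as the primary display.

import ctypes

class DEVMODE(ctypes.Structure):
    # Truncated DEVMODEW: only the members up to dmDisplayFrequency.
    _fields_ = [
        ("dmDeviceName", ctypes.c_wchar * 32),
        ("dmSpecVersion", ctypes.c_ushort),
        ("dmDriverVersion", ctypes.c_ushort),
        ("dmSize", ctypes.c_ushort),
        ("dmDriverExtra", ctypes.c_ushort),
        ("dmFields", ctypes.c_ulong),
        ("dmPositionX", ctypes.c_long),
        ("dmPositionY", ctypes.c_long),
        ("dmDisplayOrientation", ctypes.c_ulong),
        ("dmDisplayFixedOutput", ctypes.c_ulong),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", ctypes.c_wchar * 32),
        ("dmLogPixels", ctypes.c_ushort),
        ("dmBitsPerPel", ctypes.c_ulong),
        ("dmPelsWidth", ctypes.c_ulong),
        ("dmPelsHeight", ctypes.c_ulong),
        ("dmDisplayFlags", ctypes.c_ulong),
        ("dmDisplayFrequency", ctypes.c_ulong),
    ]

ENUM_CURRENT_SETTINGS = -1

dm = DEVMODE()
dm.dmSize = ctypes.sizeof(DEVMODE)

# None = the primary display; pass a device name for a specific monitor.
ctypes.windll.user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS,
                                          ctypes.byref(dm))

print(f"{dm.dmPelsWidth}x{dm.dmPelsHeight} @ {dm.dmDisplayFrequency} Hz")

If that prints 24 Hz while the monitor claims 60, the driver picked a TV mode over HDMI and it needs to be fixed in the control panel.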
 

·
Registered
Joined
·
1,018 Posts
Quote:
Originally Posted by TwoCables View Post

This must be a feature of some of the newer drivers. I don't have it and I'm still using 285.62
I've been on and off the Nvidia driver loop, so I can't say for sure. But it's been there for as long as I can personally remember. This of course only applies to HDMI, since a pure DVI connection doesn't support anything other than the full RGB format.
 

·
Not new to Overclock.net
Joined
·
78,992 Posts
Quote:
Originally Posted by specopsFI View Post

I've been on and off the Nvidia driver loop, so I can't say for sure. But it's been there for as long as I can personally remember. This of course only applies to HDMI, since a pure DVI connection doesn't support anything other than the full RGB format.
Oh, of course. So then this is great advice for HDMI users!
 

·
Registered
Joined
·
607 Posts
Discussion Starter · #15 ·
Quote:
Originally Posted by specopsFI View Post

The first thought I get when reading about something like this happening on an Nvidia card is the sad default RGB setting in the Nvidia control panel. By default, Nvidia uses the limited RGB range over HDMI, which suits most TVs at their default settings. When you connect a monitor instead, the result is a completely washed-out image with no deep colors and poor contrast. The remedy? Change the color format from RGB to YCbCr444. This is a really common problem and seems like the number one reason why some people complain about the image quality of Nvidia cards nowadays.
Wow, thanks, that fixed the color problem instantly. I'm still getting crappy FPS on the HDMI monitor, so I just swapped the cable out for a DVI one I picked up on the way home from class; that solved everything and saved me like $3 since I can return this other one. Thanks again for the help, that explained the color problem perfectly.
 