Aorus AD27QD - 10bit Question - Overclock.net - An Overclocking Community

post #1 of 8 (permalink) Old 05-12-2019, 11:34 PM - Thread Starter
New to Overclock.net
 
NEK4TE
Join Date: Apr 2014
Location: I am Canadian!
Posts: 569
Rep: 11 (Unique: 8)
Aorus AD27QD - 10bit Question

Hello,

I recently purchased this monitor; however, I am confused when it comes to 10-bit.
What exactly does this do / how does it work? (Is it important?)
I did some Google research; however, it's still not completely clear to me.

Hopefully somebody will be able to explain it better (thanks!)

Also, in Windows 10 / AMD Radeon Settings, when the monitor is set to 60 Hz (in Windows), I can see Color Depth set to 10 bpc (I can even change it to 12); however, if I change the monitor to 144 Hz (in Windows), I can only set it to 8 bpc.
Is something wrong here, or is it supposed to be like this?

60 Hz (Windows): [screenshot]

144 Hz (Windows): [screenshot]

This was changed automatically to 8 bpc after I changed the refresh rate, and I am unable to set it back to 10 bpc while on 144 Hz.

Please advise, thank you!
post #2 of 8 (permalink) Old 05-12-2019, 11:35 PM - Thread Starter
New to Overclock.net
 
NEK4TE
Join Date: Apr 2014
Location: I am Canadian!
Posts: 569
Rep: 11 (Unique: 8)
Just to add: I just found out that even if I set Windows 10 to 120 Hz, I am able to set it to 10 bpc.



Please advise, thank you!
post #3 of 8 (permalink) Old 05-13-2019, 10:14 AM
New to Overclock.net
 
QuantumPion
Join Date: Jun 2013
Posts: 90
Rep: 3 (Unique: 2)
From what I understand, 10-bit color basically means HDR color mode. HDR color is only supported up to 120 Hz by the DisplayPort standard, or something like that.
post #4 of 8 (permalink) Old 05-13-2019, 04:29 PM - Thread Starter
New to Overclock.net
 
NEK4TE
Join Date: Apr 2014
Location: I am Canadian!
Posts: 569
Rep: 11 (Unique: 8)
Thanks for responding.

This is confusing for me as well. So, if a game is played at 144 Hz (and the game supports HDR), it cannot run in 10-bit color mode? Is that false advertising, or am I confused 200%?

Thanks
post #5 of 8 (permalink) Old 05-14-2019, 07:02 AM
Overclocker
 
JackCY
Join Date: Jun 2014
Posts: 9,278
Rep: 309 (Unique: 225)
Quote: Originally Posted by QuantumPion
From what I understand, 10-bit color basically means HDR color mode. HDR color is only supported up to 120 Hz by the DisplayPort standard, or something like that.
LOL no, completely unrelated.

Quote: Originally Posted by NEK4TE
Thanks for responding.

This is confusing for me as well. So, if a game is played at 144 Hz (and the game supports HDR), it cannot run in 10-bit color mode? Is that false advertising, or am I confused 200%?

Thanks
8-bit = 256 levels per color channel.
10-bit = 1024 levels per color channel.

10-bit uses more bandwidth, and only certain applications support 10-bit output. Unless you have a professional GPU driver with 10-bit OpenGL support etc., you're not going to get, say, Photoshop to show 10-bit; you need the Quadro/FirePro driver for that. On consumer cards you can get 10-bit, or should be able to, via 3D APIs, for example in some video players, and in games if they care to support 10-bit.

1440p 144 Hz 8-bit = 14.08 Gb/s
1440p 144 Hz 10-bit = 17.60 Gb/s (25% more)

If it doesn't have DP 1.3+, it will be limited to 17.28 Gb/s and may not allow, or offer in the EDID by default, 10-bit 1440p 144 Hz.
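
As a rough sanity check on those figures, here is a minimal Python sketch of the arithmetic, assuming a ~10% blanking overhead for reduced-blanking timings and the 17.28 Gb/s DP 1.2 payload figure above; exact timings vary per monitor, so the numbers are approximate:

[CODE]
# Rough DisplayPort bandwidth estimate for 2560x1440 -- a sketch, not a spec calculator.
# Assumptions: ~10% extra pixels for blanking, DP 1.2 payload of 17.28 Gb/s (4 lanes, HBR2).

DP12_PAYLOAD_GBPS = 17.28   # effective DP 1.2 data rate quoted above
BLANKING_OVERHEAD = 1.10    # assumed overhead for reduced-blanking timings

def required_gbps(width, height, refresh_hz, bits_per_channel):
    """Approximate link bandwidth for full RGB / 4:4:4 output."""
    pixels_per_second = width * height * refresh_hz * BLANKING_OVERHEAD
    bits_per_pixel = 3 * bits_per_channel   # three channels, no chroma subsampling
    return pixels_per_second * bits_per_pixel / 1e9

print(f"Levels per channel: 8-bit = {2**8}, 10-bit = {2**10}")
for hz in (60, 120, 144):
    for bpc in (8, 10):
        need = required_gbps(2560, 1440, hz, bpc)
        verdict = "fits DP 1.2" if need <= DP12_PAYLOAD_GBPS else "exceeds DP 1.2"
        print(f"1440p {hz} Hz {bpc} bpc: ~{need:.2f} Gb/s ({verdict})")
[/CODE]

Run as-is, it shows 10-bit 1440p at 144 Hz landing just above the DP 1.2 limit while 120 Hz still fits, which matches what the driver lets you pick.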

HDR mode is different again: by default it uses 10-bit, which means most monitors advertised as HDR never reach their maximum refresh rate in HDR mode, at least not without sacrificing chroma quality. Some are fine, but not all. Even in SDR mode some monitors, especially G-Sync ones, have to sacrifice chroma quality to achieve the maximum advertised refresh rate; this is more of an issue with 4K monitors, though.

I don't think it's false advertising; you just need to read carefully and check what you're buying.
post #6 of 8 (permalink) Old 05-14-2019, 08:36 PM - Thread Starter
New to Overclock.net
 
NEK4TE
Join Date: Apr 2014
Location: I am Canadian!
Posts: 569
Rep: 11 (Unique: 8)
Thank you for your time and providing this useful information.

This clears it all up.

Thanks again!
post #7 of 8 (permalink) Old 05-21-2019, 05:46 AM
New to Overclock.net
 
QuantumPion
Join Date: Jun 2013
Posts: 90
Rep: 3 (Unique: 2)
Quote: Originally Posted by JackCY
LOL no, completely unrelated.

8-bit = 256 levels per color channel.
10-bit = 1024 levels per color channel.

10-bit uses more bandwidth, and only certain applications support 10-bit output. Unless you have a professional GPU driver with 10-bit OpenGL support etc., you're not going to get, say, Photoshop to show 10-bit; you need the Quadro/FirePro driver for that. On consumer cards you can get 10-bit, or should be able to, via 3D APIs, for example in some video players, and in games if they care to support 10-bit.

1440p 144 Hz 8-bit = 14.08 Gb/s
1440p 144 Hz 10-bit = 17.60 Gb/s (25% more)

If it doesn't have DP 1.3+, it will be limited to 17.28 Gb/s and may not allow, or offer in the EDID by default, 10-bit 1440p 144 Hz.

HDR mode is different again: by default it uses 10-bit, which means most monitors advertised as HDR never reach their maximum refresh rate in HDR mode, at least not without sacrificing chroma quality. Some are fine, but not all. Even in SDR mode some monitors, especially G-Sync ones, have to sacrifice chroma quality to achieve the maximum advertised refresh rate; this is more of an issue with 4K monitors, though.

I don't think it's false advertising; you just need to read carefully and check what you're buying.
Can HDR mode run in 8-bit color? How does it compare to HDR mode in 10-bit color with 4:2:2 chroma subsampling?

Also, I think my original point is still mostly correct, although I phrased it wrong: 10-bit color's maximum refresh rate is limited by the DP standard used, and 10-bit color is what HDR mode uses.
post #8 of 8 (permalink) Old 05-30-2019, 04:34 PM
Overclocker
 
JackCY
Join Date: Jun 2014
Posts: 9,278
Rep: 309 (Unique: 225)
I think the HDR standard specifies 10-bit as the minimum. Technically you can do HDR at any bit depth (and suffer from even more banding below 10-bit), but what you can buy now is mostly standardized 10-bit HDR.

https://en.wikipedia.org/wiki/High-d...ideo#Standards

4:2:2 will look bad if you focus on the chroma resolution difference that it introduces.

Yes.
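
To put rough numbers on the 4:2:2 trade-off, here is a back-of-the-envelope Python sketch; the samples-per-pixel ratios are the standard YCbCr subsampling ratios, and the comparison is only illustrative:

[CODE]
# Average bits per pixel for common YCbCr chroma formats -- a back-of-the-envelope sketch.
# Samples per 4-pixel group: 4:4:4 = 12, 4:2:2 = 8, 4:2:0 = 6.
SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def avg_bits_per_pixel(bits_per_channel, chroma):
    return bits_per_channel * SAMPLES_PER_PIXEL[chroma]

for chroma in ("4:4:4", "4:2:2", "4:2:0"):
    print(f"10 bpc {chroma}: {avg_bits_per_pixel(10, chroma):.0f} bits/pixel")
print(f" 8 bpc 4:4:4: {avg_bits_per_pixel(8, '4:4:4'):.0f} bits/pixel")
[/CODE]

At 10 bpc, 4:2:2 comes out to 20 bits per pixel versus 30 for full 4:4:4, which is roughly how some monitors fit HDR at their top refresh rate, at the cost of chroma resolution.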