How can a GPU not support 10-bit mode for certain programs? If it works and Windows recognizes it, and your screen recognizes it... how is it that it then won't work in a program that supports it? Isn't it a standard? How can HDR work for some things that can use it but not others, on the same hardware? As someone who has been using HDR for a while but doesn't know anything about Photoshop, why was this ever a thing?
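(For anyone else wondering what "10-bit" actually means at the pixel level: each color channel gets 10 bits, so 0–1023 per channel instead of 0–255, usually packed into a 32-bit R10G10B10A2 pixel. An application has to explicitly request a framebuffer in a format like that from the driver, which is part of why support ends up per-program rather than automatic. A rough illustrative sketch of the packing, not any real API:)

```python
def pack_r10g10b10a2(r, g, b, a=3):
    """Pack 10-bit R/G/B channels and a 2-bit alpha into one 32-bit pixel.

    r, g, b must be in 0..1023 (10 bits each); a in 0..3 (2 bits).
    Layout (illustrative): alpha in the top 2 bits, then B, G, R.
    """
    assert 0 <= r < 1024 and 0 <= g < 1024 and 0 <= b < 1024
    assert 0 <= a < 4
    return (a << 30) | (b << 20) | (g << 10) | r

# 10 bits per channel gives 4x finer gradation steps than 8-bit:
mid_gray = pack_r10g10b10a2(512, 512, 512)
```

(So even with the monitor and Windows both "10-bit capable", a program still has to ask for this kind of format, and the driver has to agree to hand it out.)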
Seems to me that it has just been artificially blocked at the driver level, and Nvidia simply removed the block just before AMD did. Am I missing something? This is not an added feature but literally a removed block, correct? If that's the case, then the tone of this announcement is annoying: a company doesn't deserve praise for unblocking a feature that was always there.
250GB nvme + 500GB SSD + 4TB HDD
Samsung 4K 65 inch TV
Pixio PX276 27inch 144Hz 1ms 1440p