Originally Posted by junglechocolate
I thought we already had this in GPUs? How else has HDR in PC monitors been implemented? I also remember 8, 10 and 12 bpc options in Windows with my old 1080 Ti. Why is this news?
Because, as stated a million times in this thread by now... until now, neither NV nor AMD shipped a consumer GPU driver with 10-bit OpenGL support, which is what OpenGL applications (Adobe, ...) need to actually show you 10-bit color.
DirectX supports 10-bit only in fullscreen, maybe even exclusive fullscreen only.
10-bit OGL in Photoshop certainly isn't fullscreen, and it does work now.
The problem isn't getting the data sent to the monitor, but getting it transferred at above 8-bit depth from the application (memory) to the GPU (i.e. you need DX, OGL, Vulkan etc., unless you like to play it hardcore in 2019 and access the GPU directly somehow).
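For anyone wondering why the extra two bits even matter once the data does make it through the API: here's a quick back-of-the-envelope illustration (my own sketch, not from any driver docs) of 8 bpc vs 10 bpc quantization on a simple gradient.

```python
# Sketch: why 8-bit vs 10-bit per channel matters for a smooth gradient,
# independent of how the data reaches the GPU.

def quantize(value, bits):
    """Quantize a 0.0-1.0 intensity to an integer code at the given bit depth."""
    levels = (1 << bits) - 1          # 255 for 8-bit, 1023 for 10-bit
    return round(value * levels)

# A horizontal black-to-white gradient across a 1920-pixel-wide display:
width = 1920
codes_8  = {quantize(x / (width - 1), 8)  for x in range(width)}
codes_10 = {quantize(x / (width - 1), 10) for x in range(width)}

print(len(codes_8))   # 256  -> each level spans ~7.5 px: visible banding
print(len(codes_10))  # 1024 -> sub-2 px steps: effectively smooth
```

If any stage of the pipeline (application, API, driver) truncates to 8 bpc, you're stuck with those 256 steps no matter what the cable and monitor can carry, which is exactly why the driver-side OpenGL lock mattered.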
Since forever, they have kept 10-bit OGL for Quadro/FirePro drivers only, and everyone has complained about it for decades.
It was nothing more than a software lock, just like adaptive sync on monitors without a G-Sync module was on NV cards.