Originally Posted by ToTheSun!
In theory, it can never be as good because the ground truth is not of any game you might be playing. It should be fine, though, for a lot of other content. In any case, it's hard to imagine the small chip inside TV's being as good as nVidia's tensor cores at their intended usage.
Well, there are quite a few high-quality upscalers; it's just that games, GPU drivers, and monitors don't offer them. TVs, on the other hand, are more competitive and do ship these kinds of image-altering algorithms, which rarely have much use on monitors outside of "gaming" monitors used to play 1440p content on a 4K panel. TVs nowadays probably have fairly sophisticated, even AI-driven, features in their purpose-built processors.
Reasonable scaling: Jinc? SuperXBR? NGU? None of these are found in games, probably not even Lanczos. Games usually either do a shader-resolution change or, if it's a true target-resolution change, use whatever crappy bicubic scaler is offered by the middleware they happen to use.
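For reference, the gap between these scalers is mostly in the kernel. A minimal sketch of Lanczos resampling (one of the cheaper "good" options mentioned above), with the kernel and a 1-D resampler written out; the function names and the edge-clamping choice are my own for illustration:

```python
import math

def lanczos_kernel(x: float, a: int = 3) -> float:
    """Lanczos windowed-sinc kernel: sinc(x) * sinc(x/a) on [-a, a], else 0."""
    if x == 0.0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    # sinc(x) * sinc(x/a) expanded: a * sin(pi x) * sin(pi x / a) / (pi x)^2
    return a * math.sin(px) * math.sin(px / a) / (px * px)

def resample_1d(samples: list[float], new_len: int, a: int = 3) -> list[float]:
    """Resample a 1-D signal to new_len points with Lanczos interpolation."""
    scale = len(samples) / new_len
    out = []
    for i in range(new_len):
        # Map the output index back into source coordinate space.
        x = (i + 0.5) * scale - 0.5
        lo = math.floor(x) - a + 1
        acc = norm = 0.0
        for j in range(lo, lo + 2 * a):
            w = lanczos_kernel(x - j, a)
            # Clamp out-of-range taps to the border sample.
            s = samples[min(max(j, 0), len(samples) - 1)]
            acc += w * s
            norm += w
        # Normalize so the weights always sum to 1.
        out.append(acc / norm)
    return out
```

A real image scaler applies this separably (rows, then columns), and bicubic differs only in swapping in a cheaper 4-tap cubic kernel, which is why it's the default in so much middleware.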
DLSS offering little benefit at high FPS is a major drawback.
Oh yes, Nvidia throws money/resources/developers at game studios to get them to use its middleware/crapworks, put its logo in the game, optimize the game for Nvidia hardware and not the competitor's, ...
Neither the idea behind DLSS nor ray tracing is bad; the problem is the performance and quality of the way Nvidia is implementing them right now. Even Q2VKPT... you might think it was made by some random dude, but as far as I remember, if you look it up, it's by a researcher who works with Nvidia.