Originally Posted by mothergoose729
Interesting. And scary, from a market-competition standpoint. This is the first NVIDIA-exclusive feature that has a real chance of making all AMD hardware seem obsolete. How can you compete with better visuals at no loss of performance on any SKU?
Still, this is a marketing demo. I want to see DLSS in more games with side-by-side comparisons, and by independent reviewers.
Right now it still depends on developer adoption. AMD has a few years yet to create their response.
DLSS also automatically comes with the added baggage of RTX. If we saw DLSS in a top-of-the-line GTX card? That would have no competition.
For now RTX is probably enough of an anchor to keep the competition close.
It'll be quite a while before developers come anywhere near universally adopting DLSS, so AMD has some time, and there are still downsides to DLSS.
The AI still can't make things appear that were never touched by the raster engine, so as things currently stand, DLSS will always work better the higher your native resolution. If we were talking about 8K DLSS instead of 4K, the results would be much closer to the native image.
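A toy sketch of that point (plain Python, nothing to do with NVIDIA's actual network; the "scene" and functions here are made up for illustration): any upscaler, learned or not, only gets the samples the renderer produced, so detail finer than the render resolution is simply gone before the upscaler ever runs.

```python
# Toy 1-D illustration: an upscaler can only work with samples the
# renderer actually produced. A checkerboard that alternates every
# native pixel vanishes entirely when rendered at half resolution.

def render(width, scale):
    # "Scene": brightness alternates every native-resolution pixel.
    # Rendering at lower res samples only every `scale`-th native pixel.
    return [x * scale % 2 for x in range(width // scale)]

def upscale_nearest(row, scale):
    # Naive upscale: repeat each low-res sample `scale` times.
    # A smarter upscaler still has nothing but these samples to go on.
    return [v for v in row for _ in range(scale)]

native = render(8, 1)             # [0, 1, 0, 1, 0, 1, 0, 1] - full detail
low    = render(8, 2)             # samples land only on even pixels: [0, 0, 0, 0]
up     = upscale_nearest(low, 2)  # [0, 0, 0, 0, 0, 0, 0, 0] - detail is gone
```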
For now, if I were to use DLSS, I would just use it to anti-alias a native 4K image with the DLSS 2X setting (which has yet to be implemented in any game). 4K at 120 Hz is ideal for now, but native 4K is still full of jaggies and whatnot that would be nice to have filtered away.
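For a sense of what that mode is after, here is plain supersampling as a stand-in (this is not DLSS 2X itself, just the classic technique it aims to approximate cheaply): keep the output at native resolution but build each pixel from extra samples, so hard edges average into gradients instead of jaggies. The `edge` scene below is invented for the example.

```python
# Plain 1-D supersampling AA: sample the scene at factor-times the
# output resolution, then box-filter groups of samples down to pixels.

def supersample_aa(scene, width, factor):
    # Sample at factor-times the output resolution...
    hi = [scene(x / factor) for x in range(width * factor)]
    # ...then average each group of `factor` samples into one pixel.
    return [sum(hi[i * factor:(i + 1) * factor]) / factor
            for i in range(width)]

def edge(u):
    # Made-up scene: a hard step (think the edge of a bright polygon).
    return 1.0 if u >= 3.5 else 0.0

aliased = [edge(float(x)) for x in range(8)]  # [0, 0, 0, 0, 1, 1, 1, 1] - hard step
smooth  = supersample_aa(edge, 8, 4)          # [0, 0, 0, 0.5, 1, 1, 1, 1] - softened
```

The boundary pixel comes out at 0.5 instead of snapping to 0 or 1, which is exactly the jaggy-smoothing effect; the appeal of a DLSS-style approach is getting a result like this without actually paying for the extra samples.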
I just want good AA on top of that at little to no cost.
I'll also have to see how bad DLSS's input-lag penalty is. I know TAA is generally pretty "yuck", but if people keep working on it, you never know how good it can get.
It seems like the silicon-area cost of DLSS is small relative to the performance benefit, so it looks likely that this will be "the" anti-aliasing method of the future; we'll just have to wait and see how long it takes to really push adoption.