> "Compatible with Nvidia GPUs" That's pretty big imo.

Developers still need to adopt it. That'll likely be the largest hurdle.
> So re-launching the old (pre-deep-learning) super sampling as the new competitor to deep-learned super sampling?

The old methods of upscaling and sharpening never left, and if it's not a post-process effect (and it's not), it will be something different.
> Developers still need to adopt it. That'll likely be the largest hurdle.

Reminds me of G-Sync...and we all know how that turned out.
With NVIDIA's commanding lead in PC gaming graphics market share, they are in a much better position to dictate what the bulk of developers implement, and if NVIDIA has any say, they will choose the option that makes the competition look worse, no matter what the effect is. Even if FSR is somehow better looking and faster on NVIDIA parts than DLSS, NVIDIA will still push devs to use DLSS, because no one else can use DLSS.
> Reminds me of G-Sync...and we all know how that turned out.

G-Sync also had an additional hardware cost that had to be absorbed by display manufacturers, and it was doomed by a competing standard from a body more able to set such standards.
> The vast, vast majority of customers don't have RTX cards, so Nvidia's market share doesn't really apply yet, as it's GTX-based, not RTX-based.

Brand install base matters more than the specific hardware capabilities of that install base. All those people with their GTX 1050s being counted in Steam surveys are still giving NVIDIA clout that they will use to hamper competition.
> Given that current-generation consoles are on AMD silicon, I think even if AMD's version is worse, so long as the performance is good and there is a benefit over traditional AA, it will see widespread adoption.

Hopefully that'll be the case. Well, the part about widespread adoption, anyway...it would be better if AMD has an equivalent or superior solution, so RT is worth using on their products sooner rather than later. If the solution isn't appreciably better than extant non-DLSS upscaling methods, it's not going to do much for PC gamers no matter how broadly it's adopted.
> i.e., they know Steam survey 1060 and 1050 Ti users do not have RTX-capable hardware

Never came close to implying they didn't.
> Now that doesn't mean they won't focus on RTX, as Nvidia will pay them and give them dev support to do it.

This too.
> The 1060 and 1050 Ti are completely irrelevant to the FSR/DLSS discussion. The RX 5300 is not going to magically become some 1080p/300 Hz beast, just like the 1050 isn't, regardless of FSR or whether the 1050 supported DLSS or not.

Not for 5nm and its derivatives, but for a GAA/nanosheet 3nm/2nm EUV, GDDR7 (DDR5-based)/HBM3-or-4, multichip, magic-glue monstrosity? Who knows.
> Not for 5nm and its derivatives, but for a GAA/nanosheet 3nm/2nm EUV, GDDR7 (DDR5-based)/HBM3-or-4, multichip, magic-glue monstrosity? Who knows.

Things will trickle down, but that is not going to make today's poor GPUs become tomorrow's mid-range cards. Maybe, just maybe, some of it trickles down to the $150-$250 price point, faster than 229 Incident days xD
> Hopefully it's more than just temporal upsampling with some sharpening sprinkled on.

If it doesn't blur or smear in motion and has no artefacts, it's a win.
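For context on what "upsampling with some sharpening sprinkled on" means, here is a minimal sketch of that classic (pre-deep-learning) post-process pipeline: a spatial bilinear upscale followed by an unsharp mask. This is only an illustration of the general technique the thread is discussing, not any vendor's actual implementation; all function names, the 3x3 box blur, and the `amount` parameter are assumptions for the example, and it operates on a single grayscale frame (so it has no temporal component and no motion artefacts to worry about).

```python
import numpy as np

def bilinear_upscale(img: np.ndarray, factor: int) -> np.ndarray:
    """Upscale a 2D grayscale image by an integer factor with bilinear interpolation."""
    h, w = img.shape
    ys = np.linspace(0, h - 1, h * factor)  # fractional source rows
    xs = np.linspace(0, w - 1, w * factor)  # fractional source columns
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]  # vertical interpolation weights
    wx = (xs - x0)[None, :]  # horizontal interpolation weights
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

def unsharp_mask(img: np.ndarray, amount: float = 0.5) -> np.ndarray:
    """Sharpen by adding back the difference between the image and a 3x3 box blur."""
    padded = np.pad(img, 1, mode="edge")
    blur = sum(padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
               for dy in range(3) for dx in range(3)) / 9.0
    return np.clip(img + amount * (img - blur), 0.0, 1.0)

# Render at low resolution, then upscale and sharpen as a post-process.
low_res = np.random.default_rng(0).random((4, 4))
high_res = unsharp_mask(bilinear_upscale(low_res, 2))
print(high_res.shape)  # (8, 8)
```

The design point the thread raises is visible here: nothing in this pipeline invents detail, it only redistributes and accentuates what the low-resolution frame already contains, which is why a learned approach like DLSS can outperform it.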
> Same reason why DLSS isn't making the RTX 3060 become as fast as a 3060 Ti or 3070. DLSS (and FSR) are just going to give a little extra push to help them go against NVIDIA's equivalents.

Not exactly true: keeping 95% of the settings at low does make a card jump up a tier. And that's the whole point for competitive shooters/MOBAs, battle royales, and similar long-view-distance shooters, where you really need some anti-aliasing to cover horrible jaggies at long in-game ranges. The reduced latency is just the cherry on the cake.