Overclock.net banner
Status
Not open for further replies.
21 - 40 of 65 Posts

·
Simpleton
Joined
·
1,896 Posts
This is another image quality cheat, and Nvidia is better at their cheating technique.

Case closed.
 

·
Long Time Lurker
Joined
·
952 Posts
It never had to be better. It had to be comparable enough while being easy to implement and hardware agnostic.

Seems like they did a decent job to me even in the earliest implementations. Not sure what some of you were expecting.
 

·
Joined
·
2,520 Posts
It looks mainly useful at 4K, which means everything from a GTX 1060 up is going to benefit massively.
 

·
Linux Lobbyist
Joined
·
501 Posts
Whether something is cheating or not is irrelevant if it's good enough to fool our eyes in motion. Fooled Eyes+more FPS is a "victory" in every conceivable aspect of that word.
 

·
Tank destroyer and a god
Joined
·
2,744 Posts
Whether something is cheating or not is irrelevant if it's good enough to fool our eyes in motion. Fooled Eyes+more FPS is a "victory" in every conceivable aspect of that word.
Well, replacing genuine 4K with 1080p plus additional computing... it depends on how many extra FPS per watt, though.

I found that I can safely turn off antialiasing in all titles that run at 4K 60+ FPS, meaning the quality of real 4K lets me disable a feature that is definitely needed at 1080p. We are talking about 30% of total GPU performance in this case.

Let's take some older title (say, Skyrim Legendary Edition), run it at 1080p with AA, and measure GPU power consumption. Then the same with DLSS. Then again at 4K without antialiasing. I am really curious which the user will prefer, and which has the lowest power drain. The game is capped at 60 FPS, so "free resources" will show up as lower power consumption.
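That experiment could be scripted by polling the GPU's reported power draw while the game sits at its FPS cap. A rough sketch, assuming an NVIDIA card where `nvidia-smi --query-gpu=power.draw` is available (AMD cards would need a different source, e.g. sysfs):

```python
import statistics
import subprocess
import time

def read_power_watts():
    """One sample of the GPU's reported board power draw, in watts
    (NVIDIA only; AMD cards would need a different readout)."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=power.draw",
         "--format=csv,noheader,nounits"], text=True)
    return float(out.strip().splitlines()[0])

def average_power(sample_fn, samples=60, interval=1.0):
    """Average repeated power readings over the measurement window."""
    readings = []
    for _ in range(samples):
        readings.append(sample_fn())
        time.sleep(interval)
    return statistics.mean(readings)

# Run once per configuration (1080p + AA, 1080p + DLSS, 4K no AA)
# while the game sits at its 60 FPS cap, then compare the averages:
# average_power(read_power_watts)
```

Taking a one-minute average per configuration smooths out per-frame spikes; the reading itself is only as accurate as the card's own power telemetry.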
 

·
Overclocker
Joined
·
4,581 Posts
At least both GN and HUB did their homework and respected the marketing materials and reviewing rules that AMD sent and imposed on them.

The pixelated thumbnail is showcased in all the FSR videos that followed AMD's marketing rules.
So much originality!

So it did not surprise me (I do not watch their content) that GN gave AMD FSR a pass.
Both GN and HUB give a pass to whatever marketing material AMD sends.
Ah yes, reviewers have united to form the secret society of "marketing material" because they are conspiring to give a video analysis of their findings, and provide an independent review by showing you how good FSR is, and by telling you that almost anyone can use it. FSR isn't locked down to a particular hard-to-buy, overpriced GPU.

FSR can be used on existing GPUs. The fact that you don't need tensor cores to "upscale" in some convoluted, cumbersome, over-the-top way really shows why tensor cores were shoehorned into PC gaming GPUs, and how unnecessary they are for gaming. LOL.

Let's all ignore that the image quality in Ultra Quality and Quality modes is exceptional at higher resolutions for the performance uplift from this "upscaler", one that even forgotten-about Pascal users can use. Let's concentrate on how you feel about it, and on how you claim you don't believe anything positive about FSR due to some random conspiracy theory about how they are out to get you!!!

So do us all a favor: don't let them get to you. Keep the hate alive. Don't let honest reviews convert you. And hold fast to the belief that upscalers should only be tied to one particular set of GPUs. 1st Tensor 6:12-14.
😂🤣
 

·
Vandelay Industries
Joined
·
2,501 Posts
I'm going to say FSR is BETTER than DLSS for the vast majority of consumers at the moment. Why?

1) More consumers can use it. DLSS is limited to RTX cards.
2) RTX cards don't really need upscaling for anything but 4K, and only the 2060-3070 cards. The 3080 and above can do 4K native just fine.

What I don't like is that FSR isn't really good for 1080p, but neither is DLSS (the only cards that can use it handle 1080p just fine without it), so that's a wash.

DLSS is BETTER than FSR for 2060-3070 users, and that's about it. Does DLSS look better? Yes, but it is hindered massively by its limited hardware scope.

Note: I am omitting the number of games that use either, and I am omitting future generations of GPUs.
 
Joined
·
1,173 Posts
Looks really weak to me. It introduces a lot of aliasing and shimmer, not much better than a simple resolution scale slider that nobody ever used, because even a relatively small adjustment like 0.8x ruins the image, at least if you are paying attention and not sitting 5 meters away from a mid-sized TV in a bright room, scrolling social media on your phone instead of looking at the game. This is just nothing new and nowhere near DLSS, which does have some problems, especially in motion, but as far as anti-aliasing goes it rivals the best AA techniques to date while doubling performance instead of cutting it in half.

Realistically it was to be expected, but it is still disappointing, because there are games partnering with AMD and some of them will certainly be good. It would be great for them to have a comparable feature so they don't run twice as slow with worse image quality on top, but alas...
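For context on how much of the image each mode actually renders: FSR 1.0's published modes scale each axis by a fixed factor (Ultra Quality 1.3x, Quality 1.5x, Balanced 1.7x, Performance 2.0x, per AMD's announcement; worth double-checking against current docs). A quick sketch of what that means at 4K output:

```python
def fsr_render_resolution(out_w, out_h, scale):
    """FSR 1.0 is a spatial upscale: each axis is divided by the
    mode's fixed scale factor to get the internal render resolution."""
    return round(out_w / scale), round(out_h / scale)

# Published FSR 1.0 per-axis scale factors (assumed from AMD's launch material)
modes = {"Ultra Quality": 1.3, "Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}
for name, s in modes.items():
    w, h = fsr_render_resolution(3840, 2160, s)
    print(f"{name}: {w}x{h} ({w * h / (3840 * 2160):.0%} of output pixels)")
```

So even Quality mode at 4K renders only about 44% of the output pixels (2560x1440), which is why the "not much better than a resolution scale slider" comparison is the relevant one for a purely spatial technique.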
 

·
Banned
Joined
·
8,590 Posts
Discussion Starter · #30 · (Edited)
I'm going to say FSR is BETTER than DLSS for the vast majority of consumers at the moment. Why?

1) More consumers can use it. DLSS is limited to RTX cards.
2) RTX cards don't really need upscaling for anything but 4K, and only the 2060-3070 cards. The 3080 and above can do 4K native just fine.

What I don't like is that FSR isn't really good for 1080p, but neither is DLSS (the only cards that can use it handle 1080p just fine without it), so that's a wash.

DLSS is BETTER than FSR for 2060-3070 users, and that's about it. Does DLSS look better? Yes, but it is hindered massively by its limited hardware scope.

Note: I am omitting the number of games that use either, and I am omitting future generations of GPUs.
Or game developers can just implement game-engine temporal upscalers that look better than FSR. This isn't new; consoles have had it for years, and many PC titles already have it. AMD is just smoothing edges yet again. Looks like another product that looks like trash but gets a pass because of the AMD branding.


DLSS is a similarly stupid brand name. You either do native 4K or you don't, and the discussion about better or worse upscaling is mostly a waste of resources. I understand AMD needs its own response, but for now it's far from being as elegant as FreeSync was.
DLSS explains exactly what the software does: "Deep Learning Super Sampling."

AMD's the one using stupid branding: FidelityFX Super Resolution.
 

·
Banned
Joined
·
8,590 Posts
Discussion Starter · #32 ·
Not even going to bother watching that conspiracy theorist. Tensor cores aren't going anywhere for gamers while technologies like these are being worked on. FSR won't be the killer of DLSS, but I can definitely see TSR or other temporal upscalers killing it.



 

·
Vandelay Industries
Joined
·
2,501 Posts
Well, if that's your point... then the same can be said for DLSS. If you can't really see the difference, it DOESN'T matter what the method is.

Edit... to add to that: if you have to zoom in to see the differences, they don't really matter.
 

·
Banned
Joined
·
8,590 Posts
Discussion Starter · #34 ·
Well, if that's your point... then the same can be said for DLSS. If you can't really see the difference, it DOESN'T matter what the method is.

Edit... to add to that: if you have to zoom in to see the differences, they don't really matter.
In the blind test from GN's review, it's obvious how much of an IQ hit comes from FSR, without any zooming. Again, game developers should stick to using TAA-based upscalers. DLSS is a TAA-based upscaler, but it isn't hardware agnostic like the many built into game engines.

AMD FidelityFX Super Resolution Quality Comparison & Benchmarks (FSR) - YouTube

 

·
Vandelay Industries
Joined
·
2,501 Posts
Being non-hardware-agnostic is not a good thing. It's actually bad in our context. Maybe you used the word wrong?
 

·
Cheesebumps!!
Joined
·
2,800 Posts
Looks really weak to me.
Of all the posts here I find yours trivial... why would you use such technologies on SLI 2080 Tis??? 🤣

I mean, as someone who owns a 6800 XT, I won't find it that useful these days, since everything runs pretty fine on it at native resolution. The majority of us here have high-end systems, and I think it's kind of dumb to put your shoes on something that doesn't fit you. I would say FSR works out best for potato PCs and GPUs two or three generations old, which can still kick some ass on the latest games at low settings. This is just a FREE performance bump for them.
 

·
I <3 narcissists
Joined
·
7,154 Posts
Of all the posts here I find yours trivial... why would you use such technologies on SLI 2080 Tis??? 🤣

I mean, as someone who owns a 6800 XT, I won't find it that useful these days, since everything runs pretty fine on it at native resolution. The majority of us here have high-end systems, and I think it's kind of dumb to put your shoes on something that doesn't fit you. I would say FSR works out best for potato PCs and GPUs two or three generations old, which can still kick some ass on the latest games at low settings. This is just a FREE performance bump for them.
I doubt there are enough here with a high end AMD system to call it a majority. Those cards are really hard to come by.
 

·
Cheesebumps!!
Joined
·
2,800 Posts
I doubt there are enough here with a high end AMD system to call it a majority. Those cards are really hard to come by.
Exactly why I said it would benefit most those who held on to older GPUs. As GN mentioned, AMD used it as propaganda instead of releasing another GPU into the current stack of unobtainium GPUs.

Instead of ripping off the community, they made a big move to be the good guys. Moving forward, this would really help lessen the damage miners did (although that's already crumbling nowadays; it won't be long until the great GPU flood).
 