
· Banned · 5,701 Posts · Discussion Starter · #41 · (Edited)
Yes, I didn't see anyone mentioning the game/FSR support part on Twitter. Have any other reviewers tested FSR 2.0 across multiple games, at least? It would be nice to see a wider data set for comparing the two upscalers.

The above screenshots aren't viewable or valid links for me
When you click on the links you have to wait a minute or two for them to load, as you will initially see an empty page. It's an interactive image that lets you slide from left to right to compare the two.
 

· Banned · 5,701 Posts · Discussion Starter · #42
Updated the original post to include the edits. Also added Original vs. AA/Sharpening/DoF/TAA/Vignette Off pics for comparison, to show how much TAA/DoF can make the game look blurry/ghostly.
 

· Expert pin bender · 3,056 Posts
In your opinion, if the issues were corrected, even with this being an unsupported game, would DLSS be better in this instance?

That seems to be part of why most are glossing over this. It's like they are arguing that even if said issues were corrected, DLSS is still going to win, so why bother.

In my mind, even though I hate upscaling for high-end hardware, I think this is a disservice to gamers who use these features. It's shoddy work, no matter who's "better".

I hope they will do a new video on games that actually use and support both. This kind of stuff makes you wonder whether someone is being paid to do these kinds of reviews.
 

· Banned · 5,701 Posts · Discussion Starter · #44 · (Edited)
In your opinion, if the issues were corrected, even with this being an unsupported game, would DLSS be better in this instance?

That seems to be part of why most are glossing over this. It's like they are arguing that even if said issues were corrected, DLSS is still going to win, so why bother.

In my mind, even though I hate upscaling for high-end hardware, I think this is a disservice to gamers who use these features. It's shoddy work, no matter who's "better".

I hope they will do a new video on games that actually use and support both. This kind of stuff makes you wonder whether someone is being paid to do these kinds of reviews.
Take a look at what mods can do to the game below. This, IMO, is the real winner: the removal of TAA and DoF. Although I would prefer more granular control over TAA.

Those who think DLSS is better will say that even when comparing it with native. You are not going to get an objective opinion from cheerleading, shilled fanboys who only see the world through green-tinted glasses.

The problem is that they are not discussing the flaws shown with DLSS: the blurriness, the IQ imperfections when you zoom in, etc. The truth is that both are upscalers, and they are designed for those who don't have mid- to high-end video cards. In this case, GoW only needs 60 FPS; anything more will not make the game smoother or faster. Having played it myself and gotten well over 100 FPS, I found no need to use FSR 2.0.

But to answer your question in a different light: Jetpack Interactive should have left FSR 1.0 in place. Why they removed it may come down to a technical challenge, since they jerry-rigged FSR 2.0 in; there is really no other reason I can think of for why they removed it. But you are right, they rushed it and it's a really crappy inclusion.

Therefore, not only do we need the issues corrected, but we also need them to test on an AMD GPU, and to have the option to disable/reduce TAA and DoF, as most of what DF found is directly connected to TAA/DoF, IMO.

Upscalers are a simple tool for those who want to enjoy a game without needing to upgrade in the short term. They're great for that, but IMO not much else can be gained from them. Therefore, anyone saying that DLSS is better is, in my opinion, an obvious bot/shill trying to hype up a GPU feature so that you view it favorably.

Below are some pics

Original



TAA/Sharpening/DOF OFF
 

· Expert pin bender · 3,056 Posts
AMD Launches FSR 2.0 Plugin for Unreal Engine 4 & 5

I guess maybe we will see more of this, and more testing?

I can see how being a developer sucks when it comes to hardware and pushing "brand tech features" from the duopoly.

I don't fault them if they choose one over the other and don't bother with the rest. It seems to me the GPU makers could simply put these features in the hardware and expose a programmable interface they could update to get them working with games. Perhaps that's asking too much, considering the bugs that would probably follow.

Heh... just make fake resolutions, like CPUs had fake speed ratings, I suppose...
 

· Registered · 2,533 Posts
AMD Launches FSR 2.0 Plugin for Unreal Engine 4 & 5

I guess maybe we will see more of this, and more testing?

I can see how being a developer sucks when it comes to hardware and pushing "brand tech features" from the duopoly.

I don't fault them if they choose one over the other and don't bother with the rest. It seems to me the GPU makers could simply put these features in the hardware and expose a programmable interface they could update to get them working with games. Perhaps that's asking too much, considering the bugs that would probably follow.

Heh... just make fake resolutions, like CPUs had fake speed ratings, I suppose...
That shouldn't be an issue here. Nvidia put out an open-source API that prepares your game for temporal upscaling, whose output can then be fed into DLSS, FSR 2.0, or XeSS. Intel signed on to it as well. It's called "Nvidia Streamline." Do the integration work once, and you can then enable all three technologies. Most developers end up taking money from companies to add or not add features, like Ubisoft adding DLSS only in Watch Dogs: Legion and FSR only in Far Cry 6. That behavior needs to stop.
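
To make the "do the work once, enable all three" idea concrete, here's a rough C++ sketch of the pattern. To be clear, this is not the actual Streamline API (the real headers are on GitHub); every type and function name below is made up just to show how one set of temporal inputs can be routed to any backend.

```cpp
// Hypothetical sketch of the "integrate once, enable every upscaler" pattern.
// None of these names come from the real Streamline SDK; they only illustrate
// the idea: the renderer supplies one set of temporal inputs (color, depth,
// motion vectors, jitter), and a backend enum selects which upscaler runs.
#include <cstdint>
#include <cstdio>

enum class Upscaler { DLSS, FSR2, XeSS };

struct TemporalInputs {             // produced by the renderer every frame
    const void* color;              // low-resolution color target
    const void* depth;              // depth buffer
    const void* motionVectors;      // per-pixel motion vectors
    float jitterX, jitterY;         // sub-pixel camera jitter
    uint32_t renderWidth, renderHeight;
    uint32_t outputWidth, outputHeight;
};

// One integration point; each case would call into the vendor's SDK.
void evaluate(Upscaler backend, const TemporalInputs& in, void* output) {
    (void)in; (void)output;
    switch (backend) {
        case Upscaler::DLSS: std::puts("dispatching to DLSS");    break;
        case Upscaler::FSR2: std::puts("dispatching to FSR 2.0"); break;
        case Upscaler::XeSS: std::puts("dispatching to XeSS");    break;
    }
}

int main() {
    TemporalInputs frame{nullptr, nullptr, nullptr, 0.25f, -0.25f,
                         1280, 720, 2560, 1440};
    evaluate(Upscaler::FSR2, frame, nullptr);  // same inputs, any backend
}
```

The point stands either way: once the renderer produces the temporal inputs, exposing a second or third upscaler is mostly a dispatch decision rather than a whole new integration.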
 

· Banned · 5,701 Posts · Discussion Starter · #48 · (Edited)
That shouldn't be an issue here. Nvidia put out an open-source API that prepares your game for temporal upscaling, whose output can then be fed into DLSS, FSR 2.0, or XeSS. Intel signed on to it as well. It's called "Nvidia Streamline." Do the integration work once, and you can then enable all three technologies. Most developers end up taking money from companies to add or not add features, like Ubisoft adding DLSS only in Watch Dogs: Legion and FSR only in Far Cry 6. That behavior needs to stop.
That's hardly of any value when it's tied to a certain generation of GPUs. Being able to do that on "any" GPU is more interesting than how a certain generation of GPU does it. I can understand that at times some people get into such detail that they fail to see the bigger picture. Can anyone with a decent GPU use it now? That's where FSR/FSR 2.0 comes into play.

Funny how the modding communities didn't make the newer versions of DLSS available in Cyberpunk before it was officially updated, unlike with FSR 2.0, which was "newsworthy." That leads me to believe it's not as "friendly" as you make it out to be, and I will still stand by my belief that DLSS is a black-box solution. :whistle:
 

· Expert pin bender · 3,056 Posts
Call me skeptical, given Nvidia's history of black-box stuff. The absolute best-case scenario for anything Nvidia open-sources, even for other brands, will be: Nvidia does it the fastest, and everyone else can do it, but only with one leg.

Flashback: Intel vs. AMD CPUs.

It's a dirty biz, and the mobs have to keep the cash cows safe and fat.


I do agree that devs taking money to add or not add features needs to stop. However, I am just a consumer at the mercy of a dirty market making dirty deals.
 

· Banned · 5,701 Posts · Discussion Starter · #50 · (Edited)
Call me skeptical, given Nvidia's history of black-box stuff. The absolute best-case scenario for anything Nvidia open-sources, even for other brands, will be: Nvidia does it the fastest, and everyone else can do it, but only with one leg.

Flashback: Intel vs. AMD CPUs.

It's a dirty biz, and the mobs have to keep the cash cows safe and fat.


I do agree that devs taking money to add or not add features needs to stop. However, I am just a consumer at the mercy of a dirty market making dirty deals.
It's all about the true definition of "open source": open to anyone who has a GPU. Imagine being restricted from using a certain mouse feature, like polling rate, unless you bought from brand B. As ridiculous as that sounds, some begin to tout how it's not an issue if you just use brand B.

Trying to corner the market with a feature that anyone can use has failed and will continue to fail. PhysX has shown us this. HairWorks has shown us this. HBAO has shown us this. Tessellation has shown us this. Ray tracing has also shown us this. As long as you attempt to maintain a closed ecosystem of standards, it will always fail in the long term; it has never worked. You won't convince even the most naive people that "you can only do that with just my newest hardware," because even naive people have a tendency to stumble upon the truth.

Therefore, it's not really a hard choice for devs. The moral high ground is a far more attractive way to attract profit. Properly supporting a feature that anyone can use, with a nice performance boost and little IQ difference, is a win for the consumer and for the developer who wants us to buy their games. Doing the opposite and limiting your consumer base is not good business sense. It's, at the very least, self-serving fanboyism that creates confusion, and it creates a division within the organization that cannot stand. Just look at CD Projekt Red.

How long will you hold your breath before they finally release the DLC that finishes the game by adding the underworld to it? But hey, at least they rushed the game out for just Nvidia owners, where CB ran so well. That really isn't DLC; they simply screwed up and didn't have time to incorporate it into the game properly. :unsure:

So yeah, it's a dirty business, but most don't have to be part of it. Just like the local waste dump in every town: you know it's there, you see it all the time, but it's not a tourist attraction either.
 

· Registered · 2,533 Posts
That's hardly of any value when it's tied to a certain generation of GPUs. Being able to do that on "any" GPU is more interesting than how a certain generation of GPU does it. I can understand that at times some people get into such detail that they fail to see the bigger picture. Can anyone with a decent GPU use it now? That's where FSR/FSR 2.0 comes into play.

Funny how the modding communities didn't make the newer versions of DLSS available in Cyberpunk before it was officially updated, unlike with FSR 2.0, which was "newsworthy." That leads me to believe it's not as "friendly" as you make it out to be, and I will still stand by my belief that DLSS is a black-box solution. :whistle:
What on earth are you going on about? Nvidia Streamline is open source under the MIT license, meaning you can modify it in any way you want and create your own fork. But even without that, it's designed to let you implement any of the temporal upscaling solutions. They're not giving out DLSS; they've just created a fully open-source API, available on GitHub, that developers can use to do the work of implementing one solution while enabling all of them. That means FSR 2.0 or XeSS as well as DLSS. It has nothing to do with liking or not liking DLSS itself.
 

· Banned · 5,701 Posts · Discussion Starter · #52 · (Edited)
What on earth are you going on about? Nvidia Streamline is open source under the MIT license, meaning you can modify it in any way you want and create your own fork. But even without that, it's designed to let you implement any of the temporal upscaling solutions. They're not giving out DLSS; they've just created a fully open-source API, available on GitHub, that developers can use to do the work of implementing one solution while enabling all of them. That means FSR 2.0 or XeSS as well as DLSS. It has nothing to do with liking or not liking DLSS itself.
[Screenshot attachment]


Besides, stop trying to gaslight me. Your advertisement has nothing to do with the topic at hand and offers nothing of value to this discussion.

One of the main points of discussion is why Alex did not disclose that he knew FSR 2.0 was not compatible with DirectX 11 in God of War.
 

· Expert pin bender · 3,056 Posts
You know, I wish I could be naive about Nvidia's willingness to "help" with open-source gadgets to make something more widely used. In reality, I know better. Nvidia only does something to make Nvidia #1, even if it means pretending to throw a bone to the competition.

Nvidia Streamline is designed to help Nvidia get their own tech more widely implemented. DLSS is not going to survive long term if they can't get it adopted into way more games. DLSS requires tensor cores, which are only in the last two generations of Nvidia GPUs, so that leaves a whole lot of people without it. So they need a way to get DLSS widely implemented at the software level so they can keep selling more expensive tensor-core GPUs. Of course, that's assuming they really need the tensor cores long term; maybe they could just switch hardware or make them no longer needed. (Makes me think of SLI certification, which was easily hacked and could be used on any motherboard that wasn't "SLI" certified.)

Either way, I see an "Intel compiler checking the CPU vendor ID" moment coming with Nvidia Streamline. Of course, right now Nvidia is the only GPU vendor offering deep learning/AI/tensor cores, so their solution is "the best" according to them. They have the "upper hand," and I can see how they will magically end up faster and better than the other brands. Of course, Nvidia fans will go the traditional route of "DLSS is superior," and they will feel justified in their minds.

It has everything to do with liking or not liking DLSS, because Nvidia will be the key holder of this API. Streamline is a cloak-and-dagger strategy for Nvidia. Intel doesn't even matter in the GPU market, so they will desperately adopt it. Where are their junky GPUs? Nowhere to be seen. So yeah, whoopee on Intel being on board.


https://www.tomshardware.com/news/n...fy-developer-support-for-upscaling-algorithms

"The above open-sourced framework sounds like an appealing solution for the games development industry. If it gains mass adoption it might prove a significant gain for Nvidia, which will almost certainly get DLSS hooked into all the games that use this framework. Nvidia has quite an established lead with image upscaling tech adoption at this time, and Streamline isn't going to hurt it. In some ways Streamline could make sure DLSS will never be usurped by rival tech, especially more open ones which may have bloomed separately later in 2022."


Edit: And yes, I am being heavily biased on this. I will gladly change my mind and eat crow if proven wrong. Chances are looking good that I won't need to worry.
 

· Banned · 5,701 Posts · Discussion Starter · #56
The funny thing is that it was recently rumored that FSR 3.0 is going to utilize machine-learning hardware, which could potentially limit it to only certain GPUs.
I am pretty sure they have a plan if that's true. AMD is not like their competitor. Even if it were true, they wouldn't just abandon FSR 2.0.
 

· Registered · 464 Posts
I am pretty sure they have a plan if that's true. AMD is not like their competitor. Even if it were true, they wouldn't just abandon FSR 2.0.
I don't believe that TCL leak at all lol.
 

· Premium Member · 6,938 Posts
It's a rumor to match your rumor. So we all can have rumors.
LOL.
His "rumor" comes from the fact that yesterday AMD pushed out a new commit to a linux compiler repository showing that there will be a new hardware instruction in upcoming GPUs that does the same thing as a tensor core. The only rumor is that these new hardware blocks in the GPU will be used to accelerate some FSR 3.0 version implementation, it is not a rumor at all that the GPUs will have the tensor core type hardware since AMD did the commit themselves.

Your reply post linking a rumor thread from TCL has nothing at all to do with anything being discussed here in this thread, and is both not a match for JonRock's posted information nor is helpful in any way to this thread.
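
For anyone wondering what "does the same thing as a tensor core" actually means: these units execute a small matrix multiply-accumulate over a fixed-size tile as one hardware operation. Below is a plain C++ sketch of just the math such an instruction performs; the 16x16x16 tile size and float types are chosen purely for illustration and are not taken from AMD's commit.

```cpp
// Illustrative only: the semantics of a tensor-core / WMMA-style instruction,
// written out as plain C++. Real hardware executes the whole tile operation
// as a single instruction; the 16x16x16 tile here is just a common example,
// not a claim about AMD's specific commit.
#include <array>
#include <cstdio>

constexpr int M = 16, N = 16, K = 16;

using TileA = std::array<float, M * K>;  // hardware typically takes FP16/BF16 inputs
using TileB = std::array<float, K * N>;
using TileC = std::array<float, M * N>;  // accumulator, typically FP32

// D = A * B + C over one tile: the point of the dedicated hardware is doing
// this in one shot instead of hundreds of scalar multiply-adds.
TileC wmma_like(const TileA& a, const TileB& b, const TileC& c) {
    TileC d{};
    for (int i = 0; i < M; ++i)
        for (int j = 0; j < N; ++j) {
            float acc = c[i * N + j];
            for (int k = 0; k < K; ++k)
                acc += a[i * K + k] * b[k * N + j];
            d[i * N + j] = acc;
        }
    return d;
}

int main() {
    TileA a; a.fill(1.0f);
    TileB b; b.fill(2.0f);
    TileC c; c.fill(0.5f);
    TileC d = wmma_like(a, b, c);
    std::printf("d[0] = %.1f\n", d[0]);  // 16 * (1 * 2) + 0.5 = 32.5
}
```

On GPUs without such a block, the same result takes hundreds of ordinary multiply-adds per tile, which is the performance gap that ML-based upscalers lean on.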
 

· Registered · 2,533 Posts
Sigh. So this is the Reddit section of OCN, filled with a lot of fanboyism and no actual logical discussion or openness to learning anything new. I wish you all good luck and am unsubscribing from this thread, because I'm literally losing brain cells reading some of the statements made here.
 