
61 - 80 of 86 Posts

·
Registered
Joined
·
1,777 Posts
Speaking of Godfall, are there any RT benchmarks?
 

·
Graphics Junkie
Joined
·
2,797 Posts
More and more I see ray tracing as overrated: a massive performance hit just to get reflections that look only slightly better than screen-space reflections. It just doesn't seem worth it. Even if I had a 3090, I'd probably still turn RT off for better performance, given how minor the visual impact is.
 

·
Back in the Game!
Joined
·
1,009 Posts
I was thinking of going to the 6900 XT or 3090 this time around, but I'm going to stick with the 2080 Ti until probably AMD's 7000 series next year and just upgrade the CPU to a 5950X.
 

·
Overclocker
Joined
·
11,684 Posts
More and more I see ray tracing as overrated: a massive performance hit just to get reflections that look only slightly better than screen-space reflections. It just doesn't seem worth it. Even if I had a 3090, I'd probably still turn RT off for better performance, given how minor the visual impact is.
Hmm, OK. "very slightly better looking than screen space reflections"

[attached: five pairs of comparison screenshots, RT on vs. RT off]
If you can't figure out which one is which, then yes, good graphics are wasted on your eyes.

In Control in particular, the reflections are a major difference between traditional fakes and traced ones.

And don't blame me for the butchered resolution and quality of OCN's image uploads :( I uploaded near-lossless 1440p files that fit the 20 MB limit it complained about, yet it serves back some compressed 1080p version.
Ah, figured it out: if you want to see the uploaded image, take the image link and delete the resolution and downscaler parameters from it. Then it will show the 1440p, 99%-quality JPG files that were actually uploaded.

OCN's new user interface... bad.
 

·
Registered
Joined
·
186 Posts
RTX looks beautiful, but the more I read, the more I see it crushes fps by as much as 50%, and I can't spare that much. Hoping I'm wrong and it is amazing. :)
 

·
Iconoclast
Joined
·
30,615 Posts
could it just be that Nvidia, maybe just maybe, is better than AMD at ray tracing at the moment?
They unquestionably are, and no one is disputing that.

Their parts also had significantly better geometry performance during the time umeng2002 mentions. That performance differential is exactly what enabled the underhanded tactics being referred to. In the past, with tessellation, NVIDIA was willing to push for default settings that far exceeded the point of diminishing returns in IQ, even at the cost of performance on their own hardware, just to make ATI/AMD look proportionally worse. It was an intentional de-optimization that hurt every consumer playing those titles; it just hurt Radeon users more, until AMD introduced a driver-level cap on excessive tessellation.

The question now is how much ray tracing is needed to produce a meaningful IQ improvement vs. how much is needed to let NVIDIA's hardware pull significantly ahead in overall performance. If similar IQ can be achieved with less RT than Cyberpunk 2077's presets request, it could well be evidence of a deliberate counter-optimization to make competitor hardware look bad. This would be bad for everyone except NVIDIA, because even those on Ampere would lose some performance for no gain, while those with Turing or RDNA2 parts may lose the ability to use meaningful RT at all.

It's not impossible that NVIDIA would try similar exclusionary tactics again, especially since they have other exclusive features that synergize with RT. All they need to do is look better in initial benchmarks and it will pay off.

At this point, any such accusations are just speculation, and we'll need to see more hardware-agnostic DXR titles, with tunable ray tracing settings, to find where the actual point of diminishing returns is and how each hardware RT implementation handles them.

I'm not sure I understand what you're saying here. In hybrid RT workloads (Port Royal, Control, Watch Dogs: Legion, Wolfenstein: Youngblood, etc.), the 3080 wins in almost all cases. In fully ray-traced or path-traced scenarios (3DMark's DirectX Raytracing feature test), the 3080 is nearly 2x faster than the 6800 XT.
This differential is how targeted sabotaging of performance can work.

NVIDIA knows they have much stronger RT hardware, so the more RT they can get a title to mandate, the better they will look, even if they have to sacrifice performance on their own parts for no apparent IQ gain... Ampere will always lose less than RDNA2.
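That widening-gap argument can be sketched with a toy Amdahl-style frame-time model (a back-of-the-envelope illustration: the 2x figure echoes the path-tracing differential cited above, everything else is a simplifying assumption):

```python
def relative_fps(rt_fraction, rt_speedup=2.0):
    """Toy model: both cards are equal at rasterization, but card A runs
    the ray-traced portion of a frame rt_speedup times faster. Returns
    A's fps relative to B's as the RT share of B's frame time grows.
    """
    # B's frame time is normalized to 1. A spends the same raster time
    # but only 1/rt_speedup of the RT time.
    a_frame_time = (1 - rt_fraction) + rt_fraction / rt_speedup
    return 1 / a_frame_time

# Mandating more RT work widens A's lead, even though A also gets
# slower in absolute terms:
assert relative_fps(0.0) == 1.0                 # pure raster: a tie
assert round(relative_fps(0.5), 2) == 1.33      # half the frame is RT
assert relative_fps(0.8) > relative_fps(0.5)    # more RT, bigger lead
```

Under this model both vendors lose absolute fps as the RT share grows; only the relative gap keeps increasing, which is exactly the incentive being described.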

If the IQ gains are apparent and not well past diminishing returns, then it's not sabotage... but if we find out later that we can't tell the difference between the more demanding settings and the less demanding ones, we'll know it was.
 

·
Registered
Joined
·
3,471 Posts
Look at how poorly HairWorks™ runs even on nVidia's GPUs. Now look at TressFX, which AMD implemented. TressFX has almost no performance impact.

nVidia develops these middle-ware solutions to benefit their cards. If their GPUs were good at baking cakes, nVidia would pay developers to make their games bake cakes.

nVidia has a habit of pushing sub-optimal solutions. One reason is to make AMD's cards look bad. The other HUGE part is to get you to buy a faster nVidia card. nVidia stopping this sort of thing NOW would be more suspicious than them keeping it up.

In the end, it's up to CDPR. I have a feeling that, no matter what nVidia says, implementing ray tracing in Cyberpunk is far more involved than slapping HairWorks™ into The Witcher 3. Because of this, CDPR might really optimize their RT approach.
 

·
PC Evangelist
Joined
·
47,374 Posts
Look at how poorly HairWorks™ runs even on nVidia's GPUs. Now look at TressFX, which AMD implemented. TressFX has almost no performance impact.

nVidia develops these middle-ware solutions to benefit their cards. If their GPUs were good at baking cakes, nVidia would pay developers to make their games bake cakes.

nVidia has a habit of pushing sub-optimal solutions. One reason is to make AMD's cards look bad. The other HUGE part is to get you to buy a faster nVidia card. nVidia stopping this sort of thing NOW would be more suspicious than them keeping it up.

In the end, it's up to CDPR. I have a feeling that, no matter what nVidia says, implementing ray tracing in Cyberpunk is far more involved than slapping HairWorks™ into The Witcher 3. Because of this, CDPR might really optimize their RT approach.
Nvidia has had an extra 6-8 months for RT with CP2077.
 

·
I <3 narcissists
Joined
·
6,779 Posts
RTX is just a brand name; it still uses DXR. The only proprietary ray-tracing renderer Nvidia has is OptiX, and that's mainly used for professional applications.

There's no reason DXR wouldn't work on cards that support DXR. The only reason I could see for the delay is that they're waiting for AMD's enhanced DXR libraries to make use of the 6000 series' dedicated RT hardware.
Well, if that's the case, then I hope CDPR gets taken to the woodshed like the Godfall dev.

The CDPR fanboyism is probably too strong for that to happen.
 

·
WaterCooler
Joined
·
4,019 Posts
Well, if that's the case, then I hope CDPR gets taken to the woodshed like the Godfall dev.

The CDPR fanboyism is probably too strong for that to happen.
For all we know, they could have some exclusivity deal with Nvidia for a short period. Who knows.

All I know is that it sucks for consumers, and activity like this from the Godfall dev, and possibly CDPR, is gross.
 

·
Performance is the bible
Joined
·
7,194 Posts
Look at how poorly HairWorks™ runs even on nVidia's GPUs. Now look at TressFX, which AMD implemented. TressFX has almost no performance impact.
That is a silly comparison, and a false one at that.

Hairworks runs on everything in a scene, from animals to NPCs to the character you play. That is why there is a bigger impact: there is more to do.
With The Witcher 3, the initial performance hit with Hairworks was huge, but after two months and a big patch it carried a fairly small relative hit (~15%).

TressFX works only on the main character you play, and only on that character. And even then it took them ages to make the hair stop flapping about for no apparent reason.
On top of that, if you actually look at performance reviews, TressFX had a 30%+ performance hit for several years. It only reached a more moderate hit (~10%) with the latest variation (which basically no one used, even after they gave it all away for free and tried to integrate it into Unreal Engine), and that was because they toned it down a lot.
TressFX was a huge failure that AMD used to claim was "open source" when in truth it was closed source for three full versions, until they realized no one was using it and opened it up as they were losing the fight with GameWorks.
 

·
sudo apt install sl
Joined
·
8,328 Posts
That is a silly comparison, and a false one at that.

Hairworks runs on everything in a scene, from animals to NPCs to the character you play. That is why there is a bigger impact: there is more to do.
With The Witcher 3, the initial performance hit with Hairworks was huge, but after two months and a big patch it carried a fairly small relative hit (~15%).

TressFX works only on the main character you play, and only on that character. And even then it took them ages to make the hair stop flapping about for no apparent reason.
On top of that, if you actually look at performance reviews, TressFX had a 30%+ performance hit for several years. It only reached a more moderate hit (~10%) with the latest variation (which basically no one used, even after they gave it all away for free and tried to integrate it into Unreal Engine), and that was because they toned it down a lot.
TressFX was a huge failure that AMD used to claim was "open source" when in truth it was closed source for three full versions, until they realized no one was using it and opened it up as they were losing the fight with GameWorks.
Just want to point out that Crystal Dynamics based their Pure Hair off of the open source TressFx.

A few more months and an AI will replace it.

 

·
Performance is the bible
Joined
·
7,194 Posts
Just want to point out that Crystal Dynamics based their Pure Hair off of the open source TressFx.
Hmm what are you smoking?

TressFX was developed by AMD and Crystal Dynamics back in 2013 (well, mostly by Crystal Dynamics) for Tomb Raider. It had a much bigger performance hit on nvidia than on AMD, of course, and it even caused crashes on nvidia cards until Crystal Dynamics put out a patch to fix it (which took a few months). People used to talk about how nvidia "sabotaged" AMD in The Witcher 3, but forget the little fact that AMD wanted Crystal Dynamics not to work with nvidia on fixing TressFX for nvidia cards.
AMD claimed TressFX was "open source" from day one. It never released the source code until late 2016, with version 3.1.
Crystal Dynamics has been involved in TressFX development since the start. I would dare say AMD mostly just put their name on it and Crystal Dynamics did the actual work (which would explain why only two games ever shipped with TressFX, both using Crystal Dynamics' engine).
Pure Hair is a rebranded, altered version of TressFX 3.0 by Crystal Dynamics for Rise of the Tomb Raider (remember, 2015). It was not based on open source, as TressFX had still not been released as open source (that came in 2016), and Crystal Dynamics actually developed TressFX, so...
 

·
Graphics Junkie
Joined
·
2,797 Posts
Hmm, OK. "very slightly better looking than screen space reflections"


If you can't figure out which one is which, then yes, good graphics are wasted on your eyes.

In Control in particular, the reflections are a major difference between traditional fakes and traced ones.

And don't blame me for the butchered resolution and quality of OCN's image uploads :( I uploaded near-lossless 1440p files that fit the 20 MB limit it complained about, yet it serves back some compressed 1080p version.
Ah, figured it out: if you want to see the uploaded image, take the image link and delete the resolution and downscaler parameters from it. Then it will show the 1440p, 99%-quality JPG files that were actually uploaded.

OCN's new user interface... bad.
It actually took me a little while to realize the pics you posted were not duplicates. Yes, the difference is there, and anyone can tell which is which when looking for the differences, but in general I don't see it as a very big difference. Don't get me wrong: if it were a smaller performance hit I would definitely want to use it myself, but I don't see the hit as worth it, generally speaking, for most of the games with RT out right now. Real life is never nearly as full of reflections as these newer games are, and even when life does give you reflections, they are almost never very clear or noticeable. Reflections in real life are usually more like these:

[attached: four photos of typical real-world reflections]


In RTX games, reflections look overly mirror-like, probably because the rougher a reflection is, the higher the performance hit. So RTX games all tend to look like the floors were recently waxed, or like it just rained and there are puddles everywhere but the sun is also out to help create nice reflections. Control, perhaps more than any other game, has floors far shinier than any real industrial area, but in real life people don't encounter a lot of super-shiny floors, especially not in places like Control's setting.

I think Watch Dogs: Legion is a good example of how little impact RT reflections can really make. It seems like a lot of games (BF5 especially) intentionally nerfed the visuals with RTX turned off in ways that go beyond just RT; BF5 straight up has missing lights and shadows in some places without RT, so it's not a good on-vs-off comparison. I think we can all agree that when RT is not being used on a reflective surface, screen-space reflections should be used as a cheaper alternative, and that's exactly what happens in WD: Legion. That game didn't go to any extra effort to make RTX stand out more than it should, and in the end it really doesn't stand out much at all. Without RT, the combination of cube maps and screen-space reflections works well enough to make everything that should be reflective look believably reflective, to about the same degree RTX does.

[attached: Watch Dogs: Legion comparison screenshot, RT off (left) vs. RT on (right)]


I don't think it's hard to make the case that RTX off in this example actually looks better. The softer reflection on the water arguably looks more believable than the mirror-like one on the right. Additionally, the left image actually has a few more reflections than the right one, probably because surface roughness isn't a performance issue for screen-space reflections the way it is for RT, so the rougher surfaces get SSR without RT. One could also argue that the image on the right has its advantages: perhaps you prefer the mirror-like reflections, or perhaps the one or two objects that SSR misses but RT doesn't really stand out to you. Either way, it seems like at best RT is an even trade-off when implemented in a game that doesn't intentionally gimp the visuals to look worse than they should without RT. Even ignoring performance, it still has its pros and cons in the very limited way it is used today.
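The per-surface fallback being described (ray trace the smoothest surfaces, SSR for rougher ones) can be sketched as a simple selection function. A toy illustration only: the threshold values are invented, not WD: Legion's actual numbers.

```python
def pick_reflection_technique(roughness, rt_enabled, rt_roughness_cutoff=0.3):
    """Choose a reflection technique for a surface, cheapest acceptable
    option first: ray trace only the smoothest surfaces, fall back to
    screen-space reflections for moderately rough ones, and use prebaked
    cube maps for the rest. All thresholds are illustrative.
    """
    if rt_enabled and roughness <= rt_roughness_cutoff:
        return "ray_traced"
    if roughness <= 0.6:
        return "screen_space"
    return "cube_map"

# With RT off, the same smooth surface still gets a cheaper reflection
# instead of none at all:
assert pick_reflection_technique(0.1, rt_enabled=True) == "ray_traced"
assert pick_reflection_technique(0.1, rt_enabled=False) == "screen_space"
assert pick_reflection_technique(0.9, rt_enabled=True) == "cube_map"
```

It also shows why the rougher surfaces keep their SSR even with RT on: they simply fall below the ray-tracing cutoff.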

On a different note, there are other, way more interesting implementations of it. I think Minecraft RTX looks the most impressive so far, though that isn't even a fair comparison, because without RTX Minecraft uses very basic lighting that no game aiming for decent visuals would use. But at least the way RT is used in that game really adds something to the experience.

So TL;DR: I think reflections in real life are generally pretty dull. RTX games are being designed around how best to use mirror-like reflections with a very strict roughness cut-off, and the results are not at all a clear win, in large part because those kinds of reflections don't look very natural.
 

·
Overclocker
Joined
·
11,684 Posts
It actually took me a little while to realize the pics you posted were not duplicates. Yes, the difference is there, and anyone can tell which is which when looking for the differences, but in general I don't see it as a very big difference. Don't get me wrong: if it were a smaller performance hit I would definitely want to use it myself, but I don't see the hit as worth it, generally speaking, for most of the games with RT out right now. Real life is never nearly as full of reflections as these newer games are, and even when life does give you reflections, they are almost never very clear or noticeable. Reflections in real life are usually more like these:
Not a problem to simulate that in software, or in real time; it's the artist's decision how they want the reflection to look.


In RTX games, reflections look overly mirror-like, probably because the rougher a reflection is, the higher the performance hit. So RTX games all tend to look like the floors were recently waxed, or like it just rained and there are puddles everywhere but the sun is also out to help create nice reflections. Control, perhaps more than any other game, has floors far shinier than any real industrial area, but in real life people don't encounter a lot of super-shiny floors, especially not in places like Control's setting.
Artist's decision. It doesn't cost more to make it rough, and it can even be somewhat optimized when the directions are the same or similar, shared across frames, and so on.

Control has a lot of polished stone floors and otherworldly areas made of literally polished stone, plus glass windows. So of course it reflects a lot, and it actually looks decent, unlike, as you say, the BF/CoD/whatever AAA mega-game that has to have the latest cool thing, so they slap on one or two effects such as reflections and add mirror puddles, because they don't have wind or other physics to distort the puddle into looking more real, let alone use dirty water in their textures for it. Artist's decision.

As far as semi-reflectiveness goes, they don't want to use it much because it costs the same as a regular reflection but its visual impact is much lower, so they optimize it out (e.g. BF5). This turns the effect into ON or OFF with not much in between.

I think Watch Dogs: Legion is a good example of how little impact RT reflections can really make. It seems like a lot of games (BF5 especially) intentionally nerfed the visuals with RTX turned off in ways that go beyond just RT; BF5 straight up has missing lights and shadows in some places without RT, so it's not a good on-vs-off comparison. I think we can all agree that when RT is not being used on a reflective surface, screen-space reflections should be used as a cheaper alternative, and that's exactly what happens in WD: Legion. That game didn't go to any extra effort to make RTX stand out more than it should, and in the end it really doesn't stand out much at all. Without RT, the combination of cube maps and screen-space reflections works well enough to make everything that should be reflective look believably reflective, to about the same degree RTX does.
Most games use a mix of reflection techniques, especially when not using RT at all. Again, artist's decision: screen-space, cube maps, probes, voxels, etc. are all approximations of the effects you get when using RT.

To me, the incorrect reflections in cube maps stand out immensely, and even screen space falls apart very often. It works OK-ish for games with few reflective materials (say, Metro when you're not near water), but it certainly doesn't work for Control.

I don't think it's hard to make the case that RTX off in this example actually looks better. The softer reflection on the water arguably looks more believable than the mirror-like one on the right. Additionally, the left image actually has a few more reflections than the right one, probably because surface roughness isn't a performance issue for screen-space reflections the way it is for RT, so the rougher surfaces get SSR without RT. One could also argue that the image on the right has its advantages: perhaps you prefer the mirror-like reflections, or perhaps the one or two objects that SSR misses but RT doesn't really stand out to you. Either way, it seems like at best RT is an even trade-off when implemented in a game that doesn't intentionally gimp the visuals to look worse than they should without RT. Even ignoring performance, it still has its pros and cons in the very limited way it is used today.
I don't really care for cherry-picked marketing shots rendered offline in who knows what, maybe not even the game itself. Again, it's up to the artists to do it right or screw it up, no matter the technique used. At least RT takes a lot of the work off the artists, and even when they mess up, the reflection is still geometrically correct.

Maybe some gimped it and some didn't; in the end there is a visible difference between the RT and non-RT effects, plus with RT it's all dynamic, not just prebaked static rubbish.

On a different note, there are other, way more interesting implementations of it. I think Minecraft RTX looks the most impressive so far, though that isn't even a fair comparison, because without RTX Minecraft uses very basic lighting that no game aiming for decent visuals would use. But at least the way RT is used in that game really adds something to the experience.
In voxel games, the RT problems can be simplified and optimized a lot; that's why they work so well for it. They have the performance to spare, and they look prehistoric using their regular non-RT renderer.
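For a sense of why voxel worlds ray trace so cheaply, here is a minimal sketch of classic uniform-grid traversal (the Amanatides & Woo DDA): each step is a comparison and an add, with no triangle intersection tests and no BVH walk. Illustrative only; real engines use compressed or hierarchical grids.

```python
def traverse_voxels(origin, direction, solid, max_steps=64):
    """March a ray through a uniform voxel grid.

    `solid` is a set of (x, y, z) cells; returns the first solid voxel
    hit, or None after max_steps. Stepping from cell to cell needs only
    per-axis boundary distances, which is what makes voxel scenes cheap
    to ray trace compared to triangle soups.
    """
    voxel = [int(c) for c in origin]
    step, t_max, t_delta = [], [], []
    for o, d in zip(origin, direction):
        if d == 0:
            step.append(0)
            t_max.append(float("inf"))    # never cross a boundary on this axis
            t_delta.append(float("inf"))
        else:
            step.append(1 if d > 0 else -1)
            boundary = int(o) + (1 if d > 0 else 0)
            t_max.append((boundary - o) / d)  # ray distance to next boundary
            t_delta.append(abs(1.0 / d))      # ray distance between boundaries
    for _ in range(max_steps):
        if tuple(voxel) in solid:
            return tuple(voxel)
        axis = t_max.index(min(t_max))        # cross the nearest boundary
        voxel[axis] += step[axis]
        t_max[axis] += t_delta[axis]
    return None

# One solid block at (3, 0, 0); a +x ray from (0.5, 0.5, 0.5) finds it,
# while a +y ray from the same point misses everything.
assert traverse_voxels((0.5, 0.5, 0.5), (1.0, 0.0, 0.0), {(3, 0, 0)}) == (3, 0, 0)
assert traverse_voxels((0.5, 0.5, 0.5), (0.0, 1.0, 0.0), {(3, 0, 0)}) is None
```

A Minecraft-style renderer runs something like this per ray, which is why even path tracing becomes tractable there.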

So TL;DR: I think reflections in real life are generally pretty dull. RTX games are being designed around how best to use mirror-like reflections with a very strict roughness cut-off, and the results are not at all a clear win, in large part because those kinds of reflections don't look very natural.
Artist's decision. It's not the fault of the technology but of the people using it. Not all games, in fact almost no game, aims to look realistic to begin with.

In the more decent games where modding still exists, with RT you could alter the textures, the reflectiveness, the roughness, or even recode the shaders, and so on. It depends on how the game was made and what user control it gives you.

The performance problem right now comes down to how little hardware is dedicated to RT. They still want to cater heavily to traditional rendering; if they made a proper RT-focused card, say 90% RT and 10% compute, it would be a whole different experience. But the only software capable of using such a card right now is offline renderers, as games still rely heavily on compute instead, with the exception of Q2RTX's path tracing, which uses a lot of RT.

All games look unreal to me, whether they use RT or not. At least with RT the effects are geometrically correct rather than fake rubbish, and prebaked static rubbish at worst. Don't even get me started on bloom, DOF, motion blur, fake HDR, and so on.

RT is the way to go and where real time rendering will move to sooner or later.
 

·
Graphics Junkie
Joined
·
2,797 Posts
Artist's decision. It doesn't cost more to make it rough, and it can even be somewhat optimized when the directions are the same or similar, shared across frames, and so on.
This is 100% wrong. The roughness of a surface absolutely impacts RT performance, and the rougher a surface is, the bigger the performance impact. This is why Watch Dogs: Legion uses a stricter roughness cut-off for RT on surfaces on consoles than it does on PC.

You've got to get the premise right to argue the rest of what you wanted to say about RT. The roughness of a surface absolutely matters a lot for performance.

I'm not arguing that RT isn't eventually going to be the gold standard for rendering lighting and reflections in games. I'm just arguing that, as it stands with current hardware, the limited RT we get now is only mildly better-looking than other forms of reflections, IMO, and it's nowhere near worth the performance cost, especially when the developer does a good job on the non-RT effects. Sure, if you stop to look at the reflections specifically, you can easily find the limitations of more basic reflections, but the same is still true of RT reflections, so it's a minor improvement for a major performance hit, IMO.
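A toy cost model makes the roughness premise concrete: a mirror reflection can be resolved with a single ray per pixel, while a rough (glossy) surface scatters light over a wide lobe and needs many samples to converge without noise. The linear ramp and the specific numbers here are illustrative assumptions, not measured engine behavior.

```python
import math

def reflection_rays_per_pixel(roughness, base_rays=1, max_rays=16):
    """Illustrative sample budget: one ray for a perfect mirror,
    ramping up toward max_rays as the reflection lobe widens with
    roughness. Real renderers denoise instead of brute-forcing, but
    the cost still grows with roughness either way.
    """
    return base_rays + math.ceil(roughness * (max_rays - base_rays))

assert reflection_rays_per_pixel(0.0) == 1     # perfect mirror: cheapest case
assert reflection_rays_per_pixel(1.0) == 16    # very rough: ~16x the rays
assert reflection_rays_per_pixel(0.5) > reflection_rays_per_pixel(0.1)
```

This is also why a console might clamp the roughness cut-off harder than a PC: it caps the worst-case ray count per pixel.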
 

·
Registered
Joined
·
3,471 Posts
Global illumination is the best application for ray-tracing... unless you're going for a cartoony looking game.
 

·
waifu for lifu
Joined
·
11,544 Posts
Discussion Starter #80
I don't care too much about RT, but I fear AMD GPUs will lack optimization in this game.
We'll know soon enough; Dec 9th is launch day. I wonder what version of DLSS the game will use? I messed around with version 1 and I didn't like it.
 