Speaking of Godfall, are there any RT benchmarks?
> More and more I see ray tracing as being overrated. A massive performance hit just to have reflections that are very slightly better looking than screen-space reflections. It just doesn't seem worth it. Even if I had a 3090, I'd probably still choose to turn off RT to get better performance, given how minor the visual impact is.

Hmm, OK. "Very slightly better looking than screen space reflections."
> Could it just be that Nvidia, maybe just maybe, is better than AMD at ray tracing at the moment?

They unquestionably are, and no one is disputing that.
> I'm not sure I understand what you're saying here. In hybrid RT workloads (Port Royal, Control, Watch Dogs: Legion, Wolfenstein: Youngblood, etc.), the 3080 wins in almost all cases. In full RT or path-traced scenarios (the 3DMark DirectX Raytracing feature test), the 3080 is nearly 2x faster than the 6800 XT.

This differential is how targeted sabotaging of performance can work.
> Look at how poorly HairWorks™ works even on nVidia's GPUs. Now look at the TressFX that AMD implemented. TressFX has almost no performance impact.
>
> nVidia develops these middleware solutions to benefit their own cards. If their GPUs were good at baking cakes, nVidia would pay developers to make their games bake cakes.
>
> nVidia has a habit of making sub-optimal solutions. One reason is to make AMD's cards look bad. The other huge part is to get you to buy a faster nVidia card. nVidia stopping this sort of thing now would be more suspicious than nVidia carrying on with it.
>
> In the end, it's up to CDPR. I have a feeling that, no matter what nVidia says, implementing ray tracing in Cyberpunk is far more involved than slapping HairWorks™ into The Witcher 3. Because of this, CDPR might really optimize their RT approach.

Nvidia has had an extra 6-8 months for RT with CP2077.
> RTX is just a brand name; it still uses DXR. The only proprietary ray-tracing renderer Nvidia has is OptiX, and that's mainly used for professional applications.
>
> There's no reason DXR wouldn't work on cards that support DXR. The only reason I could see for the delay is that they're waiting for AMD's enhanced DXR libraries to make use of the 6000 series' dedicated RT hardware.
>
> For all we know, they could have some exclusivity deal with Nvidia for a short period. Who knows.

Well, if that's the case, then I hope CDPR gets taken to the woodshed like the Godfall dev.
The CDPR fanboyism is probably too strong for that to happen.
> Look at how poorly HairWorks™ works even on nVidia's GPUs. Now look at the TressFX that AMD implemented. TressFX has almost no performance impact.

That is a silly comparison, and a false one at that.

HairWorks runs on everything in a scene, from animals to NPCs to the character you play. That is why there is a bigger impact: there is simply more to do.

With The Witcher 3, the initial performance hit from HairWorks was huge, but after two months and a big patch it carried a fairly small relative performance hit (around 15%).

TressFX works only on the main character you play, and on no one else. Even then, it took them ages to make the hair stop flapping about for no apparent reason.

On top of that, if you actually look at performance reviews, TressFX had a 30%+ performance hit for several years. It only reached a more moderate hit (around 10%) with its latest version (which basically no one used, even after they gave it all away for free and tried to integrate it into Unreal Engine), and that was because they toned it down a lot.

TressFX was a huge failure that AMD used to claim was "open source," when in truth it was closed source for three full versions, until they realized no one was using it and opened it up as they lost the fight with GameWorks.

> That is a silly comparison, and a false one at that.

Just want to point out that Crystal Dynamics based their Pure Hair on the open-source TressFX.
> Just want to point out that Crystal Dynamics based their Pure Hair on the open-source TressFX.

Hmm, what are you smoking?
> Hmm, OK. "Very slightly better looking than screen space reflections."

It actually took me a little while to realize that the pics you posted were not duplicates. Yes, the difference is there, and anyone can tell which is which when looking for the differences, but in general I don't see it as a very big difference. Don't get me wrong: if it were a lesser performance hit I would for sure want to use it myself, but I don't see the performance hit as being worth it, generally speaking, for most of the games out with RT right now. Real life is never nearly as full of reflections as these newer games are, and even when life does give you reflections, they are almost never very clear or noticeable. Reflections in real life are usually more like these:
If you can't figure out which one is which, then yes, good graphics are more than wasted on your eyes.
In Control in particular, the reflections are a major difference between the traditional faked ones and the traced ones.
And don't blame me for the butchered resolution and quality of OCN's image uploads. It seems I uploaded near-lossless 1440p images that fit within the 20MB limit it was complaining about, yet it shows back some compressed 1080p version, whatever it's supposed to be.
Ah, figured it out: if you want to see the uploaded image, take the image link and delete the resolution and downscaler parts from it. Then it will show the 1440p, 99%-quality JPG files that were uploaded, or at least at 1440p.
OCN's new user interface... bad.
> It actually took me a little while to realize that the pics you posted were not duplicates. […] Reflections in real life are usually more like these:

Not a problem to simulate that in software, or in real time; it's the artist's decision how they want the reflection to look.
> In RTX games reflections look overly mirror-like, probably because the rougher a reflection is, the higher the performance hit, so RTX games all tend to look like the floors were recently waxed, or like it just rained and there are puddles everywhere but the sun is also out to help create nice reflections. Look at Control: that game, perhaps more than any other, has floors way shinier than any real industrial area would have, but in real life people don't encounter a lot of super-shiny floors, and especially not in places like the setting of Control.

Artist's decision. It doesn't cost more to make it rough, and it can even be somewhat optimized when the directions are the same or similar, shared across frames, and so on.
> I think Watch Dogs: Legion is a good example of how little impact RT reflections can really make. It seems like a lot of games (BFV especially) intentionally nerfed the visuals with RTX turned off in ways that go beyond just RT. BFV straight up has missing lights and shadows in some places without RT, so it's not a good on-vs-off comparison. I think we can all agree that, as far as RT reflections go, when RT is not being used on a reflective surface, screen-space reflections should be used as a cheaper alternative, and that's exactly what happens in WD: Legion. That game didn't go to any extra effort to make RTX stand out more than it should, and in the end it really doesn't stand out much at all. Without RT, the use of cube maps and screen-space reflections works well enough to make everything that should be reflective look believably reflective, to about the same degree that RTX does.

Most games use a mix of reflection techniques, especially when not using RT for everything. Again, artist's decision. Screen space, cube maps, probes, voxels, etc. are all approximations of the various effects you get when using RT.
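That per-surface mix of techniques can be sketched as a simple decision. A minimal illustrative sketch, assuming invented names, an invented cutoff value, and an invented on-screen test (this is not taken from any actual engine):

```python
# Illustrative sketch only: function name, cutoff value, and the on-screen
# test are all invented for the example, not taken from any engine.

def pick_reflection_technique(roughness, rt_enabled,
                              rt_roughness_cutoff=0.3,
                              reflected_point_on_screen=True):
    """Choose a reflection technique for one surface."""
    if rt_enabled and roughness <= rt_roughness_cutoff:
        return "ray_traced"    # accurate but expensive
    if reflected_point_on_screen:
        return "screen_space"  # cheap, but only sees what's on screen
    return "cube_map"          # cheapest fallback, coarse approximation
```

A console profile could simply pass a stricter `rt_roughness_cutoff` than a PC profile, pushing more surfaces down to the cheaper techniques.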
> I don't think it's hard to make the case that RTX off in this example actually looks better. The softer reflection on the water arguably looks more believable than the mirror-like one on the right. Additionally, the left image actually has a few more reflections than the right image, probably because the roughness of a surface isn't a performance issue with screen-space reflections like it is with RT, so the rougher surfaces get SSR without RT. One could also argue that the image on the right has its advantages: perhaps you prefer the mirror-like reflections, or perhaps the one or two objects that SSR misses but RT doesn't really stand out to you. Either way, it seems like at best RT is an even trade-off when implemented in a game that doesn't intentionally gimp the visuals to look worse than they should without RT. Even if you ignore the performance, it still has its pros and cons in the very limited way it is used today.

I don't really care for cherry-picked marketing renders done offline in who knows what, maybe not even the game itself. Again, it's up to the artists to do it right or screw it up, no matter the technique used. At least RT takes a lot of the work off the artists, and even when they mess up, the reflection is at least still geometrically correct.
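The "objects that SSR misses" point is structural: a screen-space march can only ever find pixels that are already on screen. A toy one-dimensional sketch of that limitation, with an invented "depth buffer" and invented names:

```python
# Toy 1-D model of a screen-space reflection march; the "depth buffer" is
# just a list, with 0 marking a surface. Everything here is invented for
# illustration.

def ssr_march(depth, start, step, max_steps=64):
    """Walk along the buffer from `start` in direction `step` (+1 or -1).
    Returns the index of the first surface hit, or None when the ray
    leaves the screen -- the case where SSR has no data at all and a real
    renderer must fall back to a cube map or drop the reflection."""
    x = start
    for _ in range(max_steps):
        x += step
        if x < 0 or x >= len(depth):
            return None  # off screen: the reflected object is invisible to SSR
        if depth[x] == 0:
            return x
    return None

screen = [1, 1, 0, 1, 1, 1]     # one object at index 2
hit = ssr_march(screen, 4, -1)  # marching toward the object finds it
miss = ssr_march(screen, 4, 1)  # marching off the screen edge finds nothing
```

A traced ray has no such restriction, which is why RT picks up reflections of off-screen (and behind-the-camera) geometry that SSR cannot.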
> On a different note, there are other, way more interesting implementations of it. I think Minecraft RTX looks the most impressive so far, but again that isn't even a fair comparison, because without RTX Minecraft uses very basic lighting that no game meant to have decent visuals would use. At least the way RT is used in that game really adds something to the experience.

In voxel games the RT problem can be simplified and optimized a lot; that's why they work so well for it. They have the performance headroom, and they look prehistoric with their regular non-RT renderer.
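One concrete reason voxel scenes simplify the problem: tracing a ray through a uniform grid reduces to a cheap stepping loop (a grid DDA in the style of Amanatides and Woo), with no acceleration structure to build or traverse. A toy 2-D sketch, with an invented grid; zero direction components are not handled, to keep it short:

```python
# Toy 2-D grid traversal, sketching why voxel ray tracing is cheap.
# The grid, origin, and direction are invented; assumes both direction
# components are non-zero.

def voxel_raycast(grid, origin, direction, max_dist=100.0):
    """Step a ray through a grid of cells (grid[y][x] == 1 is solid).
    Each step is just a comparison and an add. Returns the first solid
    cell as (x, y), or None."""
    x, y = int(origin[0]), int(origin[1])
    step_x = 1 if direction[0] > 0 else -1
    step_y = 1 if direction[1] > 0 else -1
    # ray parameter t at which we cross the next vertical/horizontal line
    t_x = ((x + (step_x > 0)) - origin[0]) / direction[0]
    t_y = ((y + (step_y > 0)) - origin[1]) / direction[1]
    dt_x = abs(1.0 / direction[0])  # t between vertical crossings
    dt_y = abs(1.0 / direction[1])  # t between horizontal crossings
    while 0 <= x < len(grid[0]) and 0 <= y < len(grid):
        if grid[y][x] == 1:
            return (x, y)
        if t_x < t_y:
            if t_x > max_dist:
                return None
            x += step_x
            t_x += dt_x
        else:
            if t_y > max_dist:
                return None
            y += step_y
            t_y += dt_y
    return None

blocks = [[0, 0, 0],
          [0, 0, 0],
          [0, 0, 1]]
# A ray from the top-left corner heading down-right walks cell to cell
# until it reaches the solid block at (2, 2).
```

In a general triangle scene, each of those steps would instead be a BVH traversal with ray-box and ray-triangle tests, which is a large part of the cost gap.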
> So TLDR: I think reflections in real life are generally pretty dull. RTX games are being designed around how best to use mirror-like reflections with a very strict roughness cut-off, and the results are not at all a clear win, in large part because those kinds of reflections don't look very natural.

Artist's decision. It's not the fault of the technology but of the people using it. Not all games (in fact, almost no games) even aim to look realistic to begin with.
> Artist's decision. It doesn't cost more to make it rough, and it can even be somewhat optimized when the directions are the same or similar, shared across frames, and so on.

This is 100% wrong. The roughness of a surface absolutely impacts RT performance, and the rougher the surface, the bigger the performance hit. This is why Watch Dogs: Legion has a stricter roughness cut-off for RT on surfaces on consoles than it does on PC.
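The cost argument here can be made concrete: a mirror needs one reflection ray per pixel, while a rough surface reflects over a cone of directions, so a single jittered ray gives a noisy answer and many rays (or heavy denoising) are needed to clean it up. A toy Monte Carlo sketch, with an invented one-dimensional scene (a "bright window" covering reflection angles below 0.5):

```python
# Toy Monte Carlo estimate of reflection noise vs. surface roughness.
# The 1-D "scene" and all numbers are invented for illustration; this is
# not any engine's shading code.

import random

def reflection_noise(roughness, samples, trials=200, seed=1):
    """Standard deviation of one pixel's estimated reflection brightness
    across repeated renders, each averaging `samples` jittered rays."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(trials):
        total = 0.0
        for _ in range(samples):
            # a perfect mirror would always reflect at angle 0.4;
            # roughness widens the cone of directions being averaged
            angle = 0.4 + rng.uniform(-roughness, roughness)
            total += 1.0 if angle < 0.5 else 0.0  # did this ray hit the window?
        estimates.append(total / samples)
    mean = sum(estimates) / trials
    return (sum((e - mean) ** 2 for e in estimates) / trials) ** 0.5

smooth = reflection_noise(roughness=0.01, samples=1)    # mirror-like: already exact
rough_1 = reflection_noise(roughness=0.5, samples=1)    # rough, 1 ray: noisy
rough_64 = reflection_noise(roughness=0.5, samples=64)  # rough, 64 rays: noise tamed
```

One ray per pixel is enough for the mirror-like surface, while the rough one needs many rays to converge; that extra ray budget (or the denoising burden that replaces it) is exactly why engines impose roughness cut-offs on traced reflections.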