Originally Posted by skupples
It's funny when you think about it. Gaming is 13 years late to add ray tracing. Pixar started using it in Cars, which released in 2006, which means they were using the tech in the early 2000s... AMD bragged about it offhand during the DX9 days, then it went totally dormant until just now. RTX may be a fail for NV, but it kicked off the tech to the market, which really shows it simply wasn't happening any other way. Near-fetus-stage tech, 13 years late... but hey! Now we're finally on track.
You're giving Nvidia waaay too much credit. Other companies messed with ray tracing a long time back. Microsoft adding it to DX12 (DXR) is the real catalyst right now.
As I understand it, the reason we have RTX is that Nvidia's Volta (or is it Tensor?) cores didn't really work out for normal gaming tasks, but Nvidia needed to use them for something instead of throwing away all that R&D money, so they tacked them onto the 12nm successor to Pascal and called them RT cores.
Ray tracing has always been an inevitability for 3D rendering; it's only ever been a question of when GPUs would be powerful enough to do it. RTX is not real ray tracing. Real RT is still at least a few years off.
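To make the "powerful enough GPUs" point concrete: the core operation a ray tracer repeats billions of times per frame is a ray-primitive intersection test. A minimal sketch of the classic ray-sphere case (my own illustrative code, not from any poster or vendor; the function name is made up):

```python
import math

def ray_sphere_intersect(origin, direction, center, radius):
    """Return the distance t along a normalized ray to the nearest
    sphere hit, or None if the ray misses. Solves the quadratic
    |origin + t*direction - center|^2 = radius^2 for t."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = sum(o * d for o, d in zip(oc, direction))   # half the quadratic's b term
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - c
    if disc < 0:
        return None          # ray misses the sphere entirely
    t = -b - math.sqrt(disc) # nearer of the two roots
    return t if t > 0 else None

# Ray from the origin straight down +z toward a unit sphere at z=5:
# the surface is hit at t = 4.
t = ray_sphere_intersect((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
```

Per-pixel, a real renderer runs tests like this against millions of triangles, plus recursive bounce rays for reflections and shadows, which is why full path-traced frames stayed out of real-time reach for so long.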
i7 7700K @ 4.2GHz
16GB DDR4 3200MHz
GeForce 1080 Ti
Last edited by UltraMega; 06-19-2019 at 10:39 PM.