Originally Posted by UltraMega
You're giving Nvidia waaay too much credit. Other companies experimented with ray tracing a long time ago. Microsoft adding it to DX12 is really the catalyst right now.
As I understand it, the reason we have RTX is that Nvidia's Volta (tensor?) cores didn't work out well for normal gaming tasks, but Nvidia needed to use them for something rather than throw away all that R&D money, so they tacked them onto the 12 nm successor to Pascal and branded the result RTX.
Ray tracing has always been an inevitability for 3D rendering; it has only ever been a question of when GPUs would be powerful enough. RTX is not full ray tracing: it's a hybrid approach that traces rays for select effects (reflections, shadows, global illumination) on top of conventional rasterization. Fully ray-traced real-time rendering is still at least a few years off.
Ray tracing will eventually become the norm... at SOME point, but we are a ways off. I can't confirm or deny Nvidia's intent, but I'm inclined to believe the above: leveraging R&D spend in a relatively untapped market (gaming).
That being said, ray tracing has been around for a long while; it just hasn't been well adapted to real-time applications. Speaking from the architectural-rendering side, I have been using ray tracing for quite a while (nearly 8 years).
The main benefit on my end was avoiding the hours spent mimicking light sources by hand. Architects bill hourly, and hours cost money: the faster the render, the higher the profit.
Ray tracing effectively lights the scene "accurately" with little effort. If you want the most out of a render, though, you're still placing light sources manually and tweaking the scene to get the bounces, reflections, etc. exactly right.
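For anyone curious what "lighting the scene accurately" actually means under the hood, here's a minimal sketch of the core idea: cast a ray, find where it hits geometry, and shade that point based on its angle to a light. All the scene values (sphere, light position) are made-up illustration numbers, not taken from any real renderer; production ray tracers just repeat this per pixel with many bounces.

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Return the nearest positive hit distance t, or None on a miss.

    Assumes `direction` is unit length, so the quadratic's a-term is 1.
    """
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def shade(point, normal, light_pos):
    """Lambertian term: brightness falls off with the angle to the light."""
    to_light = [l - p for l, p in zip(light_pos, point)]
    length = math.sqrt(sum(v * v for v in to_light))
    to_light = [v / length for v in to_light]
    return max(0.0, sum(n * v for n, v in zip(normal, to_light)))

# Cast one ray straight down -z at a unit sphere centered at the origin.
origin, direction = (0.0, 0.0, 3.0), (0.0, 0.0, -1.0)
center, radius = (0.0, 0.0, 0.0), 1.0
t = ray_sphere(origin, direction, center, radius)          # hit distance
hit = [o + t * d for o, d in zip(origin, direction)]       # hit point
normal = [(h - c) / radius for h, c in zip(hit, center)]   # surface normal
brightness = shade(hit, normal, light_pos=(0.0, 5.0, 5.0))
```

The appeal for offline work is that this geometry does the lighting math for you: move the light, and every surface's brightness updates correctly, instead of you hand-tuning fake fill lights.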
I think as ray-tracing technology improves, along with hardware efficiency, it will only be a matter of when, not if, ray tracing becomes the norm.