[OC3D]Crytek Showcases Real-Time Ray Traced Reflections in CryEngine on RX Vega 56

post #81 of 125 - 03-19-2019, 08:42 AM
Quote: Originally Posted by ToTheSun! View Post
RT cores are specialized at doing raytracing and nothing else. "Software accelerated" might not be the most accurate nomenclature, but the point of the distinction is that the GPU is not specifically and exclusively built for raytracing, in the same way that RT cores are.

It's also a way to differentiate them in regard to performance expectation. That's what Mand12 meant. Casually mocking nVidia's hardware solution on the basis that it's considered superfluous is missing the forest for the trees.
And neither is Pascal built for it, even though another thread is talking about how their new drivers are adding it. What a shock it will be when Pascal RT sucks compared to using hardware designed for it.

The mocking is mostly what I was mocking. Fine-tuned hardware does its job better than anything else, in this field.
Mand12 is online now  
post #82 of 125 - 03-19-2019, 09:28 AM
Quote: Originally Posted by ToTheSun! View Post
RT cores are specialized at doing raytracing and nothing else. "Software accelerated" might not be the most accurate nomenclature, but the point of the distinction is that the GPU is not specifically and exclusively built for raytracing, in the same way that RT cores are.

It's also a way to differentiate them in regard to performance expectation. That's what Mand12 meant. Casually mocking nVidia's hardware solution on the basis that it's considered superfluous is missing the forest for the trees.
RT cores are essentially ASICs, fixed-function hardware. But that doesn't mean the GPU was built exclusively for ray tracing. If that were the case, what are the Tensor cores doing there? What are the CUDA cores still doing there? If they really wanted a GPU specifically for RT, they would mainly put the ASIC there, with just enough CUDA/Tensor cores for the most basic geometry and denoising calculations. This is a transitional GPU, a step towards ray tracing.
nVidia's hardware 'solution' tackles a weakness of their own cards, which was compute power. Ray tracing is a computationally heavy rendering technique. Their white paper states that the RT cores offload the CUDA/SM cores, leaving them free to do other work while the RT cores (the ASIC) handle the ray tracing calculations. That is all fine, and it's a good thing for them to tackle.
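To put a picture on what that fixed-function block is actually doing: the bulk of the work is BVH traversal, which boils down to huge numbers of ray/box and ray/triangle intersection tests. Here's a minimal C++ sketch of the classic ray/AABB slab test (toy types and names invented for the example, not anything from nVidia's or anyone else's API); on Turing this class of math runs in the RT core, while a shader-based approach burns regular ALU cycles on it.

Code:
#include <algorithm>
#include <array>
#include <cstdio>

// Illustration only: the ray/AABB "slab" test at the heart of BVH traversal.
struct Ray {
    std::array<float, 3> origin;
    std::array<float, 3> invDir; // precomputed 1/direction per axis
};

struct AABB {
    std::array<float, 3> min;
    std::array<float, 3> max;
};

bool rayHitsBox(const Ray& r, const AABB& b, float tMax) {
    float tNear = 0.0f;
    float tFar  = tMax;
    for (int axis = 0; axis < 3; ++axis) {
        float t0 = (b.min[axis] - r.origin[axis]) * r.invDir[axis];
        float t1 = (b.max[axis] - r.origin[axis]) * r.invDir[axis];
        if (t0 > t1) std::swap(t0, t1); // handle rays pointing in -axis direction
        tNear = std::max(tNear, t0);
        tFar  = std::min(tFar, t1);
        if (tNear > tFar) return false; // slab intervals don't overlap: miss
    }
    return true;                        // ray enters the box before tMax
}

int main() {
    Ray ray{{0.0f, 0.0f, 0.0f}, {1.0f, 1e8f, 1e8f}}; // ray along +X
    AABB box{{1.0f, -1.0f, -1.0f}, {2.0f, 1.0f, 1.0f}};
    std::printf("hit: %s\n", rayHitsBox(ray, box, 100.0f) ? "yes" : "no");
    return 0;
}

The real thing obviously adds ray/triangle tests, compressed node formats and so on, but that is the class of work being pulled off the SMs.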

People need to start thinking for themselves instead of looking at everything through the lens of nVidia. nVidia's solution is for their own cards, but people seem to think that because only they have that solution, everyone else is incapable. That is false. AMD is COMPLETELY different here. Have any of you ever wondered why AMD has many more flops, but still less performance in games? Rather than sweeping it under the rug by saying it's drivers or an inefficient architecture, I'll tell you why.

AMD does not have this issue of their stream processors being saturated and requiring offloading. AMD's compute power is above nVidia's in general. Why do you think miners flocked to AMD's cards during the mining boom? In fact, in terms of compute, the Radeon VII is the equivalent of the 2080Ti. Yes. Really. It doesn't translate into games, because games simply are not compute heavy. You could argue that it is an inefficient architecture, and it is, for rasterization. Not for compute.
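(Rough numbers if you want to check that claim yourself: peak FP32 throughput is 2 FLOPs per clock per shader, times shader count, times clock speed. Radeon VII: 2 x 3840 x ~1.75 GHz ≈ 13.4 TFLOPS. RTX 2080 Ti: 2 x 4352 x ~1.545 GHz ≈ 13.4 TFLOPS. Practically identical on paper.)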
Their stream processors don't need to be offloaded to do ray tracing, because in games many of them are idling anyway. That's why Vega 56 and Vega 64 perform practically the same in games when they run at the exact same clock speeds: the additional 512 stream processors (about 14% more compute power, 4096 vs 3584) are doing essentially NOTHING in games. And that's without accounting for how many idling stream processors there are within Vega 56's 3584 to begin with. Who knows how many there are in the likes of the Radeon VII.

That's where the ACEs come in. They were designed to let those idling stream processors be used in parallel with all the others already doing work. All that idle power can be put towards ray tracing without reducing current performance, precisely because it uses stream processors that would otherwise sit idle. Now, we all know that alone would not be enough to implement ray tracing, so I'll go one step further. All the power spent on traditional shading techniques comes free when you turn those techniques off and do ray tracing instead. In other words, by reducing the amount of traditional rendering, you free up resources, and thus more stream processors, to increase ray tracing performance. And remember that AMD's cards are considered inefficient at those types of rendering techniques anyway...
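For anyone wondering what "using the ACEs" looks like from the software side: on GCN they are what services the dedicated compute queues you create in D3D12 or Vulkan. Below is a minimal D3D12 sketch just to illustrate the mechanism; the helper name is made up, it only creates the queue, and I'm not claiming this is how AMD (or Crytek) actually wires up a ray tracing pass.

Code:
#include <d3d12.h>
#include <wrl/client.h>
// link with d3d12.lib

using Microsoft::WRL::ComPtr;

// Create a compute-only queue next to the usual graphics queue. On GCN,
// work submitted to such a queue is scheduled by the ACEs, so compute
// dispatches (a ray tracing pass, for example) can overlap graphics work
// and fill otherwise-idle CUs instead of waiting in line behind it.
ComPtr<ID3D12CommandQueue> CreateAsyncComputeQueue(ID3D12Device* device)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type  = D3D12_COMMAND_LIST_TYPE_COMPUTE; // compute-only
    desc.Flags = D3D12_COMMAND_QUEUE_FLAG_NONE;

    ComPtr<ID3D12CommandQueue> queue;
    if (FAILED(device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue))))
        return nullptr; // caller falls back to the graphics queue
    return queue;
}

int main()
{
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    ComPtr<ID3D12CommandQueue> computeQueue = CreateAsyncComputeQueue(device.Get());
    return computeQueue.Get() ? 0 : 1;
}

Fences handle the synchronization between the two queues, and whether the idle CUs actually get picked up is then down to the hardware schedulers, which is exactly the point about the ACEs above.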

Also, the issue that nVidia mentions about thousands of instructions needed with Pascal for the ray tracing calculations... That is relevant because nVidia didn't have a hardware scheduler. AMD has multiple hardware schedulers in their cards, making that also a moot point. They can handle the stream of instructions and assign them efficiently to any idling stream processor through the ACEs. So I repeat... Stop looking at everything through the lens of nVidia.

That does not mean that an ASIC specifically for ray tracing would not help AMD's cards. ASICs for ray tracing would help everyone and everything. One can even put them on CPU cores if one so desires and eliminate the need for a ray tracing GPU. But, making use of those idling stream processors in AMD cards is practically a necessity before they go there. Why would they put additional hardware in there, while over 20% of the current compute power is not being used? Everything is already there to harness that power, and ray tracing is one of the best suited techniques to leverage GCN.
NightAntilli is offline  
post #83 of 125 - 03-19-2019, 09:43 AM
Quote: Originally Posted by NightAntilli View Post
WOT
I suppose it makes sense to leverage AMD's abundantly idle hardware. Why was AMD idling for so long? It can't be that they don't know how to program their own hardware and had to wait for Crytek?

mouacyk is offline  
post #84 of 125 - 03-19-2019, 09:46 AM
Quote: Originally Posted by NightAntilli View Post
Spoiler!
Sure, but how does any of that relate to the fact that some people seem to be implying RT cores are not doing anything of importance and/or impact? You do say so yourself, but Mand12 and I were replying specifically to someone who was implying otherwise.

Now, the extra compute that modern AMD cards have, which no one is trying to obscure, can be used for compute-heavy workloads and, in that way, run applications more efficiently than Pascal cards. Because of this, on the cusp of raytracing integration, nVidia decided to add ASICs. That's why Turing cards are better than previous gens at raytracing and denoising.

AMD is on the record saying they don't believe their current hardware would produce satisfactory performance for real-time raytracing, even with all the extra compute capability their cards have over Pascal.

So, how does all of this not give RT cores legitimacy to exist in the current paradigm of development? You agree with my view, but, again, the original comments were not directed at you.

ToTheSun! is offline  
post #85 of 125 - 03-19-2019, 11:04 AM
Quote: Originally Posted by ToTheSun! View Post
Sure, but how does any of that relate to the fact that some people seem to be implying RT cores are not doing anything of importance and/or impact? You do say so yourself, but Mand12 and I were replying specifically to someone who was implying otherwise.

Now, the extra compute that modern AMD cards have, which no one is trying to obscure, can be used for compute-heavy workloads and, in that way, run applications more efficiently than Pascal cards. Because of this, on the cusp of raytracing integration, nVidia decided to add ASICs. That's why Turing cards are better than previous gens at raytracing and denoising.

AMD is on the record saying they don't believe their current hardware would produce satisfactory performance for real-time raytracing, even with all the extra compute capability their cards have over Pascal.

So, how does all of this not give RT cores legitimacy to exist in the current paradigm of development? You agree with my view, but, again, the original comments were not directed at you.
Does anything other than the RTX 2080 Ti give satisfactory performance for real-time ray tracing? And it's still hybrid ray tracing... So... Yeah. I can fully understand why AMD makes such a statement. AMD does not have many resources, and they are not going to put resources into this, especially when nVidia themselves are disappointed by the sales of the RTX cards. And it makes sense that they don't want to segment their GPUs, with some being capable and others not. Consumers have enough aversion to them as it is.

Quote: Originally Posted by mouacyk View Post
I suppose it makes sense to leverage AMD's abundantly idle hardware. Why was AMD idling for so long? It can't be that they don't know how to program their own hardware and had to wait for Crytek?
My answer above to ToTheSun applies to your question as well. Radeon Rays has been a thing since at least 2015... But it was simply not adopted for games. And hardware is still not really ready. Look at what RTX is doing. In BFV, RT was done specifically for reflections. In Metro Exodus, it was done specifically for global illumination. What would happen if you want to put those two together? Or more of them, like an ambient occlusion effect? No cards can do that as of now.

Last edited by NightAntilli; 03-19-2019 at 11:19 AM.
NightAntilli is offline  
post #86 of 125 - 03-19-2019, 11:16 AM
Makes sense AMD doesn't want to jump in the deep end. It's not ripe for the picking yet. Gotcha. (holy $1300 for acceptable partial hybrid ray tracing ... yeah)

mouacyk is offline  
post #87 of 125 - 03-19-2019, 11:31 AM
Quote: Originally Posted by NightAntilli View Post
Does anything other than the RTX 2080 Ti give satisfactory performance for real-time ray tracing? And it's still hybrid ray tracing... So... Yeah. I can fully understand why AMD makes such a statement. AMD does not have many resources, and they are not going to put resources into this, especially when nVidia themselves are disappointed by the sales of the RTX cards. And it makes sense that they don't want to segment their GPUs, with some being capable and others not. Consumers have enough aversion to them as it is.


My answer above to ToTheSun applies to your question as well. Radeon Rays has been a thing since at least 2015... But it was simply not adopted for games. And hardware is still not really ready. Look at what RTX is doing. In BFV, RT was done specifically for reflections. In Metro Exodus, it was done specifically for global illumination. What would happen if you want to put those two together? Or more of them?
nVidia releases hardware capable of gaming with ray tracing and everyone shouts "it's not full ray tracing". Of course it's not. nVidia has been talking about their expectations for hybrid ray tracing since the early 2000s and finally released their OptiX API in 2008. It's called progress, and it cost a decent amount to create this technology. No one is hiding that it's a hybrid ray tracing method, since full ray tracing is that demanding. No one is forcing you to buy it, but games sure do look better with it.

https://devblogs.microsoft.com/direc...tx-raytracing/

Quote:
Eventually, raytracing may completely replace rasterization as the standard algorithm for rendering 3D scenes. That said, until everyone has a light-field display on their desk, rasterization will continue to be an excellent match for the common case of rendering content to a flat grid of square pixels, supplemented by raytracing for true 3D effects.
The demo that Crytek created uses SVOGI, a technique created by an nVidia researcher named Cyril Crassin. I'm sure there is a good reason why they dropped the voxel-based approach in favor of BVH, and I'm sure AMD had a say in the decision to go with BVH when they worked with Microsoft to add DXR.

http://on-demand.gputechconf.com/gtc...lumination.pdf
https://blog.icare3d.org/2012/06/unr...l-time-gi.html

Some input from random people: http://ompf2.com/viewtopic.php?t=166

WannaBeOCer is online now  
post #88 of 125 - 03-19-2019, 01:05 PM
Quote: Originally Posted by WannaBeOCer View Post
nVidia releases hardware capable of gaming with ray tracing and everyone shouts "it's not full ray tracing". Of course it's not. nVidia has been talking about their expectations for hybrid ray tracing since the early 2000s and finally released their OptiX API in 2008. It's called progress, and it cost a decent amount to create this technology. No one is hiding that it's a hybrid ray tracing method, since full ray tracing is that demanding. No one is forcing you to buy it, but games sure do look better with it.

https://devblogs.microsoft.com/direc...tx-raytracing/



The demo that Crytek created uses SVOGI, a technique created by an nVidia researcher named Cyril Crassin. I'm sure there is a good reason why they dropped the voxel-based approach in favor of BVH, and I'm sure AMD had a say in the decision to go with BVH when they worked with Microsoft to add DXR.

http://on-demand.gputechconf.com/gtc...lumination.pdf
https://blog.icare3d.org/2012/06/unr...l-time-gi.html

Some input from random people: http://ompf2.com/viewtopic.php?t=166
You'd be surprised how many times I read replies in forums where people think the RTX games are fully ray traced.

Two of your links are dead btw.
NightAntilli is offline  
post #89 of 125 - 03-19-2019, 03:17 PM
Quote: Originally Posted by PontiacGTX View Post
Quote: Originally Posted by UltraMega View Post
Well this just makes Nvidia look extra silly. Branding ray tracing as an Nvidia-only feature was a mistake for Nvidia that will bite them later on... since it's... ya know... not.
Like PhysX? See how other engines, or even Havok, achieve similar results without using nVidia's proprietary technology.
While it's true that they achieve it to a certain extent, they are nowhere near as efficient at it. No game built on an AMD-backed approach has proven that it can compete with The Witcher 3, FFXV, or Metro Exodus in terms of effects, and in the first two's case, hair, grass, and physics effects of that sort. Sure, almost any GPU can draw rays. But can it do it in real time, in high volume, with a large number of rays? That is the question.
Zenairis is offline  
post #90 of 125 - 03-19-2019, 09:38 PM
Quote: Originally Posted by Mand12 View Post
And neither is Pascal built for it, even though another thread is talking about how their new drivers are adding it. What a shock it will be when Pascal RT sucks compared to using hardware designed for it.

The mocking is mostly what I was mocking. Fine-tuned hardware does its job better than anything else, in this field.
I suspect (with great cynicism) that the actual reason Nvidia has opened DxR up to Pascal cards in the upcoming driver updates is precisely because RTX sales are in the tank and they want to show everyone how "Horrible" Pascal is at rendering these ray-traced scenes. But, "if you want to unlock the REAL POWAHHHH of Nvidia's awesome RT implementation then all you need to do is buy one of these shiny new RTX cards and take your gaming to the next level!!!"

Look, RTX is a real hardware feature and it has real benefits over any non-hardware-dedicated ray-tracing solution (even Jim from AdoredTV did a whole video in which he actually praised Nvidia for being very clever with their RTX hybrid RT strategy), but at the end of the day it doesn't really matter all that much to anyone looking for a new GPU today. Hardly any games actually support RTX (still), the only card that provides truly acceptable RTX performance costs $1200+ (the 2080 is only capable at 1080p, and even then just barely), and the few games that do support RTX don't provide enough tangible eye-candy benefits over traditional rendering to be anywhere near worth the cost to frame rates (and there are issues with Nvidia's hybrid RT solution, as demonstrated HERE).

In summation, I personally find RTX to be a fascinating technological achievement by Nvidia that probably needed more time to mature before being thrown out there onto an unsuspecting GPU market. Toss in the fact that they needlessly segmented their entire GPU lineup purely around this one somewhat gimmicky feature (RTX vs GTX cards), marketed RTX by insinuating that it was the "only" way to implement ray tracing in games, and created RTX SKUs like the 2060 that have no hope whatsoever of delivering satisfying FPS when actually utilizing the feature, and I think it's obvious why sales have been lackluster and Nvidia has seen blowback from the community.




Majin SSJ Eric is offline  