
81 - 100 of 136 Posts

To The Game · 7,611 Posts
Thank the lord for the forum overhaul; the old one was AWFUL and put lots of people off posting here. This looks like a massive improvement, well done Overclock.net.

OT - Leaked reviews of the 3080 show it to actually be about 30% faster than the 2080 Ti, far less than the marketing tricked many people into believing. So I believe an RDNA2 GPU will be faster than the 3080 but fall short of the 3090.

I predict:

8GB 6700XT: 10% slower than a 3070, for $450 (likely a 5700XT replacement)
16GB 6800XT: 10% slower than a 3080, for $600
20GB 6900XT: 10-15% faster than a 3080, for $850
From everything I've read, RDNA2 is supposed to be around 15% faster than a 2080 Ti. If a 3080 is 30% faster than a 2080 Ti, how is RDNA2 going to be 10-15% faster than a 3080? If RDNA2's top GPU comes within 15% of the 3080 with 16-20GB of VRAM, that should be enough to at least shake things up.
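For what it's worth, the arithmetic behind that objection checks out. A quick sketch using the rumored figures above (assumptions, not benchmarks):

```python
# Rumored/leaked figures from the posts above, normalized so a 2080 Ti = 1.0.
perf_2080ti = 1.00
perf_3080 = perf_2080ti * 1.30   # leak: ~30% faster than a 2080 Ti
perf_rdna2 = perf_2080ti * 1.15  # rumor: ~15% faster than a 2080 Ti

# How RDNA2 would stack up against the 3080 under those numbers:
ratio = perf_rdna2 / perf_3080
print(f"RDNA2 vs 3080: {ratio:.3f} (~{(1 - ratio) * 100:.1f}% slower)")
# -> RDNA2 vs 3080: 0.885 (~11.5% slower)
```

So under those rumors the top RDNA2 card would land roughly 11-12% behind a 3080, not ahead of it.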
 

mfw · 8,621 Posts
From everything I've read, RDNA2 is supposed to be around 15% faster than a 2080 Ti. If a 3080 is 30% faster than a 2080 Ti, how is RDNA2 going to be 10-15% faster than a 3080? If RDNA2's top GPU comes within 15% of the 3080 with 16-20GB of VRAM, that should be enough to at least shake things up.
Nah. It has to come in way closer than 15% to make Nvidia loyalists consider it, even with 50% more VRAM.
 

Registered · 5,093 Posts
Nah. It has to come in way closer than 15% to make Nvidia loyalists consider it, even with 50% more VRAM.
I don't think the VRAM will be an issue, so if it's within 15% on performance for at least 15% less money, I would consider it. 1440p 144Hz is my target.
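That tradeoff is easy to sanity-check in perf-per-dollar terms. A quick sketch with an assumed $700 baseline price (illustrative only, not a real MSRP comparison):

```python
# Hypothetical: a card within 15% of baseline performance, for 15% less money.
baseline_price = 700.0            # assumed price of the faster card
baseline_perf = 1.00              # normalized performance

rival_perf = baseline_perf * 0.85    # 15% slower
rival_price = baseline_price * 0.85  # 15% cheaper

# Performance per dollar, relative to the baseline card:
relative_value = (rival_perf / rival_price) / (baseline_perf / baseline_price)
print(f"{relative_value:.2f}")  # -> 1.00: identical perf-per-dollar
```

In other words, 15% slower for 15% cheaper is value-neutral; it only becomes a win if the discount exceeds the performance gap.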
 

To The Game · 7,611 Posts
Nah. It has to come in way closer than 15% to make Nvidia loyalists consider it, even with 50% more VRAM.
Speaking as someone who is pretty set on Nvidia, it's going to take more than a single gen of good GPUs to push me towards buying AMD for my main rig. I want to see a few good gens of competitive high-end GPUs with solid software/drivers. Make competitive GPUs this gen, kill it next gen, and I think they might start converting people.
 

Registered · 1,376 Posts
Big Navi sounds interesting, folks. Lots of choices for us. It's really GTX 2xx versus HD 48xx all over again. Can't go wrong with either choice.
 

Registered · 5,093 Posts
Big Navi sounds interesting, folks. Lots of choices for us. It's really GTX 2xx versus HD 48xx all over again. Can't go wrong with either choice.
We need to see what Big Navi has in terms of ray tracing hardware. Even with close traditional performance, if we start seeing a lot of ray-traced games, or things that take advantage of the tensor cores, that could be a strong reason to get Nvidia over AMD. I'm hoping AMD prices accordingly if they're still really far behind in that arena but close on traditional rasterization. I could probably live without ray tracing for cheaper, if the performance was there.
 

Vandelay Industries · 1,924 Posts
We need to see what Big Navi has in terms of ray tracing hardware. Even with close traditional performance, if we start seeing a lot of ray-traced games, or things that take advantage of the tensor cores, that could be a strong reason to get Nvidia over AMD. I'm hoping AMD prices accordingly if they're still really far behind in that arena but close on traditional rasterization. I could probably live without ray tracing for cheaper, if the performance was there.
AMD will use its console leverage to keep ray tracing in those games from depending on tensor cores. Nvidia will use its market share and money to make PC-only games with ray tracing rely on tensor cores for performance.

Not having a standard sucks for us.
 

Registered · 1,376 Posts
We need to see what Big Navi has in terms of ray tracing hardware. Even with close traditional performance, if we start seeing a lot of ray-traced games, or things that take advantage of the tensor cores, that could be a strong reason to get Nvidia over AMD. I'm hoping AMD prices accordingly if they're still really far behind in that arena but close on traditional rasterization. I could probably live without ray tracing for cheaper, if the performance was there.
I had an RTX 2080 Ti and almost never used ray tracing.
 

WaterCooler · 3,445 Posts
We need to see what Big Navi has in terms of ray tracing hardware. Even with close traditional performance, if we start seeing a lot of ray-traced games, or things that take advantage of the tensor cores, that could be a strong reason to get Nvidia over AMD. I'm hoping AMD prices accordingly if they're still really far behind in that arena but close on traditional rasterization. I could probably live without ray tracing for cheaper, if the performance was there.
I guess my question will be what effort is required on the part of developers to implement ray tracing via Nvidia's RTX versus RDNA2's implementation (speaking of which, do we know how they're implementing it yet)? We know the consoles are using RDNA2, so I'm inclined to think it'll catch on more than Nvidia's version. I guess time will tell.
 

sudo apt install sl · 7,305 Posts
AMD will use its console leverage to keep ray tracing in those games from depending on tensor cores. Nvidia will use its market share and money to make PC-only games with ray tracing rely on tensor cores for performance.

Not having a standard sucks for us.
AMD also uses an AI-accelerated denoiser for ray tracing, which they currently use for Radeon ProRender/Image Filters. I don't get why you're hating on tensor cores for doing their job. It seems as though you want both architectures to be identical. Why wouldn't a company utilize its dedicated hardware?

Are you also hating on Intel/Qualcomm/Apple for introducing their own dedicated inference hardware/SDKs?

 

Registered · 5,093 Posts
AMD will use its console leverage to keep ray tracing in those games from depending on tensor cores. Nvidia will use its market share and money to make PC-only games with ray tracing rely on tensor cores for performance.

Not having a standard sucks for us.
From my understanding, tensor cores are just good at the calculations needed for ray tracing, and RTX is just a layer to tap into them on top of Microsoft's API. I don't believe you'll even need to use RTX to get ray tracing on Nvidia GPUs; it just makes it easier. What remains to be seen is whether AMD can match Nvidia's ray tracing performance. I hope they do.


I had an RTX 2080 Ti and almost never used ray tracing.
Yeah, it definitely seemed like a bust this past gen, but some of the announcements have me hopeful it will catch on more, especially with the consoles using it. It's not my deciding factor, but it is a factor in my purchase.

I guess my question will be what effort is required on the part of developers to implement ray tracing via Nvidia's RTX versus RDNA2's implementation (speaking of which, do we know how they're implementing it yet)? We know the consoles are using RDNA2, so I'm inclined to think it'll catch on more than Nvidia's version. I guess time will tell.
Both implementations will use Microsoft's API or Vulkan's. RTX, from my understanding, is another layer on top of that from Nvidia to easily let developers tap into the hardware, but it wouldn't be necessary. I have no idea on the AMD side, though I'd imagine it's similar; they may well make theirs open source to help adoption, since they're behind Nvidia in this realm.
 

sudo apt install sl · 7,305 Posts
Both implementations will use Microsoft's API or Vulkan's. RTX, from my understanding, is another layer on top of that from Nvidia to easily let developers tap into the hardware, but it wouldn't be necessary. I have no idea on the AMD side, though I'd imagine it's similar; they may well make theirs open source to help adoption, since they're behind Nvidia in this realm.

Correct, RTX is just the platform, which includes enhanced libraries for DXR; they also have their proprietary OptiX (a competitor to AMD's ProRender).

The Khronos Group ended up using Nvidia's hardware-agnostic extension, VKRay (VK_NV_ray_tracing), as the base for their official extension, VK_KHR_ray_tracing.

RadeonRays has been out for quite some time, and RadeonRays 4.0 was released in May with DX12/Vulkan support; it wasn't open source at launch but was open-sourced afterwards.

 

WaterCooler · 3,445 Posts
Both implementations will use Microsoft's API or Vulkan's. RTX, from my understanding, is another layer on top of that from Nvidia to easily let developers tap into the hardware, but it wouldn't be necessary. I have no idea on the AMD side, though I'd imagine it's similar; they may well make theirs open source to help adoption, since they're behind Nvidia in this realm.
Correct, RTX is just the platform, which includes enhanced libraries for DXR; they also have their proprietary OptiX (a competitor to AMD's ProRender).

The Khronos Group ended up using Nvidia's hardware-agnostic extension, VKRay (VK_NV_ray_tracing), as the base for their official extension, VK_KHR_ray_tracing.

RadeonRays has been out for quite some time, and RadeonRays 4.0 was released in May with DX12/Vulkan support; it wasn't open source at launch but was open-sourced afterwards.

Awesome, thanks for clearing that up for me. So it should be fine either way then; it's just a matter of which hardware works best for ray tracing (my guess would be Ampere, due to the dedicated cores for it) and whether it catches on in more titles. I imagine it will, being in the new consoles.
 

sudo apt install sl · 7,305 Posts
Awesome, thanks for clearing that up for me. So it should be fine either way then; it's just a matter of which hardware works best for ray tracing (my guess would be Ampere, due to the dedicated cores for it) and whether it catches on in more titles. I imagine it will, being in the new consoles.
The console specifications confirmed that RDNA2 also has dedicated hardware for ray tracing to compete with Nvidia's RT cores. The Radeon VII's FP16 isn't terrible at all, but I'm not sure about its inference performance. Maybe I'll run an inference benchmark to see how Turing/Vega 20 compare. I'm using a newer version of CUDA and newer drivers compared to the image I attached.

ResNet-50 Training with FP16:

ROCm 3.7/CUDA 11.0

My stock-clocked Radeon VII (WC) using FP16, batch size 128: 434.31 images/sec
My stock-clocked Titan RTX (WC) using FP16, batch size 128: 808.52 images/sec
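A quick calculation on the gap between those two numbers (just restating the figures above, nothing more):

```python
# ResNet-50 FP16 training throughput (images/sec) quoted above.
radeon_vii = 434.31
titan_rtx = 808.52

speedup = titan_rtx / radeon_vii
print(f"Titan RTX delivers {speedup:.2f}x the Radeon VII's throughput "
      f"(~{(speedup - 1) * 100:.0f}% faster)")
# -> 1.86x, i.e. ~86% faster
```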
 


Registered · 5,093 Posts
The console specifications confirmed that RDNA2 also has dedicated hardware for ray tracing to compete with Nvidia's RT cores. The Radeon VII's FP16 isn't terrible at all, but I'm not sure about its inference performance. Maybe I'll run an inference benchmark to see how Turing/Vega 20 compare. I'm using a newer version of CUDA and newer drivers compared to the image I attached.

ResNet-50 Training with FP16:

ROCm 3.7/CUDA 11.0

My stock-clocked Radeon VII (WC) using FP16, batch size 128: 434.31 images/sec
My stock-clocked Titan RTX (WC) using FP16, batch size 128: 808.52 images/sec
Yeah, I expect the next gen from AMD to have dedicated hardware; my concern is that I don't see them matching Nvidia out of the gate. Nvidia made a huge jump with their second round of dedicated hardware, and I find it hard to believe AMD gets it that good the first time around.
 

Registered · 233 Posts
I think this means that they're PROBABLY not going to actually launch until 2021. People have been waiting for half a decade for Cyberpunk.


I would EXPECT Zen5K to be 2022, honestly. It looks like Zen4K will be launching in October or November of this year, and we have ZERO evidence of DDR5 being tested on consumer-level hardware as of yet. Also, DDR5 is likely going to be extremely expensive for at least the first year of its consumer-level availability (i.e., probably not worth it at release).
The GamersNexus rumor says early 2022, and that it will support USB4 and DDR5, IIRC. It's also a new platform. I'll definitely be upgrading my graphics card this year, but the CPU I'm not so sure about yet, as AM4 is nearing EOL.
 

Registered · 1,376 Posts
The GamersNexus rumor says early 2022, and that it will support USB4 and DDR5, IIRC. It's also a new platform. I'll definitely be upgrading my graphics card this year, but the CPU I'm not so sure about yet, as AM4 is nearing EOL.
I think economically that's a wise choice.
 

Registered · 839 Posts
If AMD was going to be competitive with Nvidia's 3000 series, they would be doing their reveal before the 3000 series release date to steal sales. I guess the rumors are right after all; RDNA2 will be going into the value segment again.
 

waifu for lifu · 11,083 Posts · Discussion Starter #99
If AMD was going to be competitive with Nvidia's 3000 series, they would be doing their reveal before the 3000 series release date to steal sales. I guess the rumors are right after all; RDNA2 will be going into the value segment again.
I agree that AMD's marketing is horrid, but concluding on their performance solely from rumors, or the lack thereof, is horrid as well.
 

Registered · 1,376 Posts
I agree that AMD's marketing is horrid, but concluding on their performance solely from rumors, or the lack thereof, is horrid as well.
If you follow the YouTube channels, Founders Edition pricing helped us, since AMD has to price lower. AIB pricing for the 3000 series is a lot higher than Founders.
 