
RDNA 3: Radeon 7000 Series Pre/Post Day Discussion

46K views · 848 replies · 74 participants · last post by Rei86
#1 · (Edited)

Below is yet to be confirmed



Not much has been said about the next-gen Radeon cards yet.
However, it appears that the high end will be a chiplet design.
And it will support DP 2.0, offering 4K @ 240 Hz.
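As a rough sanity check on the 4K @ 240 Hz claim, here's a back-of-the-envelope bandwidth calculation. The ~77.4 Gbit/s payload figure for DP 2.0 UHBR20 (4 lanes × 20 Gbit/s, minus 128b/132b encoding overhead) and the ~20% blanking allowance are my assumptions, not from the thread:

```python
# Back-of-the-envelope check that 4K @ 240 Hz fits in a DP 2.0 UHBR20 link.

def required_gbps(width, height, refresh_hz, bits_per_pixel, blanking_overhead=1.2):
    """Approximate uncompressed video bandwidth in Gbit/s, incl. ~20% blanking."""
    return width * height * refresh_hz * bits_per_pixel * blanking_overhead / 1e9

UHBR20_PAYLOAD_GBPS = 77.4  # 4 lanes x 20 Gbit/s, minus 128b/132b overhead

bw = required_gbps(3840, 2160, 240, 30)  # 10-bit-per-channel RGB
print(f"4K240 10bpc needs ~{bw:.1f} Gbit/s; link carries ~{UHBR20_PAYLOAD_GBPS} Gbit/s")
```

So even 10-bit 4K240 squeezes in without DSC under these assumptions; 8-bit has comfortable headroom.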
 
#2 ·
I hope it's good for gaming.

I also hope AMD continue to get their act together with ROCm/HIP. It's progressing, but they've had several false starts with GPU compute, putting them a decade behind CUDA, and oneAPI has come on leaps and bounds in a very short time.
 
#9 ·
AMD absolutely need to focus on raytracing. It is the future of game rendering, and they need to hop on the train ASAP before Nvidia leave them in the cold. We need to see a 2-3x increase in raytracing performance from them this coming generation for them to be competitive - it's not about raster only anymore, especially in the $1000+ arena.
 
#16 ·
The only way raytracing will become standard is if all three mainstream consoles mandate it as the sole rendering path.

While I can see Microsoft and Sony trying that for XBox Series (X+2) or PlayStation 7, I can't see Nintendo doing it. Nintendo almost appears to make it a point of pride to do more with less in hardware terms.
 
#22 ·
Not really.
Gonna wait and see how the performance looks in 3rd-party testing.
The only thing that's disappointing is the week delay, but whatever.
 
#27 ·
It blows my mind to think about this. So you will have people who will be like "I got 300fps but muh latency yo." There have been massive posts over the years about this issue, before DLSS was a thing. Imagine the confusion people will have when they see 200fps and it feels like they are playing at 60fps.

It boggles my mind that this is where we are with GPUs. Nvidia is literally selling a feature that was available on TVs from like 2012-ish, and TVs didn't have very powerful tech to do it then. So all of a sudden AI is needed and it's all cool to fake fps...

It's looking like AMD is going to fall for the raytracing hype now. I guess this is where we are with GPUs, and rasterization isn't a good enough selling point. Maybe the masses are sticking with low-resolution displays and AMD/Nvidia need new features to sell the products.

So for a fully raytraced game we can expect less than 60fps, unless we enable fake fps, then we get 100fps and it feels like 30fps lol... This is some insane stuff... Whatever, I suppose we will see how this cookie crumbles...
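To put rough numbers on that "200fps that feels like 60fps" complaint, here's an illustrative toy model (the figures are assumptions for the sketch, not measurements of any real frame generator): interpolation-style frame generation doubles the displayed fps, but input is still sampled per rendered frame, and showing an in-between frame requires waiting for the next real frame, adding roughly a frame of delay.

```python
# Toy model: frame generation raises the fps counter without the matching
# latency drop. Numbers are illustrative assumptions, not measurements.

def frame_time_ms(fps):
    return 1000.0 / fps

rendered_fps = 50                  # frames the game actually simulates/renders
displayed_fps = 2 * rendered_fps   # one generated frame inserted between each real pair

# Input is sampled per *rendered* frame; interpolation also has to hold back
# the current real frame until the next one arrives (~1 extra frame of delay).
baseline_latency = frame_time_ms(rendered_fps)       # 20 ms
framegen_latency = 2 * frame_time_ms(rendered_fps)   # 40 ms

print(f"counter shows {displayed_fps} fps, input delay ~{framegen_latency:.0f} ms")
```

Under these toy assumptions the counter reads 100 fps while the input delay corresponds to something closer to 25 fps, which is roughly the disconnect the post is describing.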
 
#30 · (Edited)
It's all 2022 indoctrination, believing an upscaler to be something more than what it really is. The sad part is that they will spend over $1600 for the privilege and reference it as status, like owning a Bugatti, while failing to register simple, common-sense, rudimentary knowledge of PC gaming. I just had someone in another thread proclaim that people don't buy a 4090 for DLSS 3. Which currently comes exclusively with the 4090. :LOL::ROFLMAO:

When shown the obvious flaws of DLSS 3 and its true worth (not $1600), it creates something called cognitive dissonance, in which they will form a mental barrier called confirmation bias:
  • Projection
  • Gaslighting
  • Denial
  • Ad hominem attacks
  • Diversion (e.g. changing the subject)
  • Straw man
  • Etc
To avoid discussing the true legitimacy of DLSS 3 being exclusive to a $1600+ video card.

Having said that, we know that AMD is going chiplets for their 7900 XT and 7950 XT. How fast this is going to be at rasterization remains to be seen. However, it wouldn't be far-fetched to believe that it has the potential to beat a 4090. My concern, as addressed earlier, is what kind of latency penalty, if any, will there be going chiplet? Hopefully, that's been mitigated.
:coffee:
 
#31 ·
Honestly I would be all for raytracing hardware and whatever else Nvidia/AMD put in the cores if it could somehow be dual-use for normal rasterization. Instead of wasting die space on hardware that sits idle or useless when those features aren't in use, it could contribute to normal rasterization somehow (not counting upscalers).

To me this just feels like wasted performance when you're not doing RTX or whatever. It reminds me of how I felt with Fiji/Vega and bad driver threading and async compute. I remember knowing the fps should be better, but they had it hobbling along on one leg: power consumption was weak even when it claimed 100% utilization, and fps was lacking. Then, running something better threaded that could actually use the hardware, power consumption reflected as much.

This frame generation business seems extremely strange coming from Nvidia. Like they know something we don't regarding AMD's coming product. So maybe your angle is more likely, if MCM brings massive fps but some latency. If that comes true, then who's doing it correctly: frame fakery, or raw fps with MCM and some latency?
 
#38 ·
Leaks say that AMD's new RDNA 3 RX 7900 XT packs 20GB of memory and its flagship will be even faster. Obviously that's a rumor, but the NVIDIA rumors were mostly true too. One of the leak sources: AMD's RDNA 3 RX 7900 XT reportedly packs 20GB of memory and isn't Radeon's next-gen flagship

To add to the above discussion: I've become a pretty average enthusiast in recent years (new games being dog poop, and graphics alone won't make me like a game), so for me the question right now is how much fps/$ I will get. I don't really care about RT, as I just don't notice stuff like that in games, and DLSS 3 is also something I don't care about, as I want to play games at a comfortable FPS without relying on a gimmick feature. Sure, in two generations that DLSS 3-style technology might become standard, but not today.

So for me personally the winner will be raw performance per $ for my new build.

Gimmicks are fine as long as they don't charge me a premium kidney for them.
 
#42 · (Edited)
A little concerned here that it's only 2x 8-pin.
Didn't they say just over 50% perf-per-watt improvement?
That doesn't add up to a 100% raster increase.
Each 8-pin can deliver up to 300W safely, given an AWG16 wire gauge is used for the connector.

I have seen my 3090 draw around 250-260W per connector with the KPE 1000W VBIOS flashed on my 2x 8-pin 3090.
 
#44 ·
even if it CAN take that much power, the official rating is 150watt per 8pin and its doubtful AMD will go over that. so your looking at a 375watt card max with possibly AIBs going to 400
usually they like to cap pcie draw at 50watts as well, AMD took huge backlash in the past with cards drawing over 75 from pcie and issues with cheap motherboards

I'm just going by what AMD said, not random rurmors. If over 50% per watt; lets give them the benefit of doubt and say 60%.
no way this can double the last gen card in raster
 
#48 ·
50% vs the 6900 XT.
So, a 1.5x perf-per-watt uplift for the 7900 XT vs the 6900 XT.
How is that not possible, when it actually comes down to how efficient the arch is vs how much wattage you're pumping into it?
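For what it's worth, the arithmetic behind this argument is simple: total performance scales as perf-per-watt times board power, so a perf/watt figure alone doesn't cap the raster gain. A minimal sketch (the 300 W and 375 W board powers are illustrative assumptions):

```python
# Total performance = perf-per-watt x board power, so a perf/watt multiplier
# compounds with any power-limit increase. Wattages here are illustrative.

def perf_gain(perf_per_watt_gain, old_watts, new_watts):
    return perf_per_watt_gain * (new_watts / old_watts)

# e.g. a 300 W predecessor vs a hypothetical 375 W successor at +50% perf/watt
print(perf_gain(1.5, 300, 375))  # 1.5 x 1.25 = 1.875x total performance
```

So +50% perf/watt plus a 25% higher power limit already lands near a 2x raster uplift; whether AMD actually raises the power limit that far is the open question.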
 
#50 · (Edited)
Now I'm thinking this card will come in 15-20% faster than the 4080 but still materially slower than the 4090.

So let's wait for pricing. If they can match the $1200 MSRP of the 4080, they have a real winner: slower raytracing but faster raster at the same price.
If it's priced at a premium above the 4080, it will still sell out for a few months, but it will probably struggle next year and give Nvidia no incentive to drop prices.
Which AMD might actually want, as they also have an inventory problem with last-gen cards.

This is all speculation obviously, but I'm excited for the reveal. Hope they are not vague the way Nvidia was during theirs.
 
#52 ·
You'll need to see the benches side by side. I don't give a rat's patoot about how much less power the 7000 series will draw. All that counts for me is price, FPS, smoothness, and no gotchas. I'm leaning back toward AMD on this one.
 
#56 ·
So excited for 3rd November, can't wait.

I am a little worried that we won't see cards on the market till December, as I wanted to get a new build before the end of November, but oh well; I waited years to upgrade, I can wait a few more weeks.

Give me that 4080 equivalent for a much lower price, AMD, and my pre-order is coming in hot like the melting cables of a 4090.
 
#59 ·
Moore's Law Is Dead says that RDNA 3 power limits can go up by almost 50%, which would be around 550 watts. Reference designs seem to be limited to two 8-pins, so 150+150+75 = 375 watts. AIBs could use three 8-pins, so 450+75 = 525 watts. I'm betting that 4.0GHz main clock rumor was real without exotic cooling. Or there is a top model AMD is holding back till the 4090 Ti gets released.
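The connector math above can be written as a quick budget check, assuming everyone sticks to the official spec-level ratings (150 W per 8-pin connector, 75 W from the PCIe slot), even though the connectors can physically carry more:

```python
# Spec-level board power budget from connector count. Assumes official
# ratings (150 W per 8-pin, 75 W PCIe slot), not physical connector limits.

PCIE_SLOT_W = 75
EIGHT_PIN_W = 150

def board_power_budget(n_8pin):
    """Maximum in-spec board power for a card with n_8pin 8-pin connectors."""
    return n_8pin * EIGHT_PIN_W + PCIE_SLOT_W

print(board_power_budget(2))  # reference design, 2x 8-pin: 375 W
print(board_power_budget(3))  # AIB design, 3x 8-pin: 525 W
```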