
Registered · 1,260 Posts
...FYI, Igor's Lab has an interesting comparison of thermal pads with different W/mK ratings on GDDR6X...

source

I didn’t see this before. Thank you for sharing!

I almost wish I had waited before installing my block. I was spending way too long researching thermal pads, thicknesses, etc., and eventually I was just like, F' it, I'm putting it on anyway with what's included.
 

Registered · 1,260 Posts
The 1000W BIOS works fine for my Zotac 2x8-pin.
No ReBAR, but it lets you hold a high OC on core and mem in all games for the extra 3-5 fps if you need that.
Well, it isn't always 3-5 fps (but I get what you're saying). Not all "stock" 3090s are made equal.

For example, my last bone-stock 2080 Ti Founders Edition on its stock air cooler managed around 36 fps burning through a heavy Port Royal run. After water cooling it, flashing the BIOS, soldering on the 8-ohm resistors, and overclocking the memory and core as far as they would go, that 36 fps average became a 48 fps average (that's huge). Or instead of 60 fps, it's now 80 fps. Plenty of games show even more than that; I've seen as high as 38% in in-game benchmarks.

I needed that extra ~35% performance in games. It became the standard for me, lol.

My main point is that, while any 3090 is fast no matter how bad a sample it may be, there are plenty of 3090s out there that can cough up a whole lot more than 3-5 fps.
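For anyone who wants to sanity-check that math, here's a quick throwaway sketch (Python, just for illustration; the fps values are the ones I quoted above, nothing card-specific):

# rough percent-uplift check using the before/after fps from this post
def uplift_percent(before_fps, after_fps):
    return (after_fps / before_fps - 1) * 100

print(uplift_percent(36, 48))  # Port Royal: ~33% faster after the mods
print(uplift_percent(60, 80))  # ~33% again, i.e. roughly that "extra 35%" ballpark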
 

The IT guy... · 477 Posts
If anyone wants 15% off GELID pads (I'd go for the Extreme; they are really soft and compress much better than the Ultimate and the Thermalright pads), use this code in the official GELID store.

FRIEND-8JQVPBX

Any idea what thickness I'll need for a 3090 FE with a Bykski block and active backplate?
 

Registered · 72 Posts
EDIT: I just scrolled further through the Steam post and it referred to CRU, which is what I mentioned I was using before. That's what's crashing my computer when I try to use it on the 3090. More info about that in the rest of this post...

That's not the issue I have. In both Windows Display Settings and the NVIDIA Control Panel, the resolution list shows 3840x2160 as native, but above it there is also 4096x2160. Even though 3840x2160 is listed as native, it keeps switching to the other one, especially any time I use NVIDIA Surround with the three displays. It's been a known problem for years, and Custom Resolution Utility has been around for a long time. It's actually a really fantastic utility that lets you access and tweak every property that has to do with the display. In the past it was most popularly known for overclocking monitors and laptop displays, but it covers everything, including audio formats for ARC or eARC, VRR, etc. I've been using it for years with my 1080 Tis to get rid of that extra resolution.

Not really sure why I was having problems with it last night. I tried it about five times and it just froze the computer. I've been having a really, REALLY rough week with a lot of personal problems, so I definitely didn't have the patience to keep troubleshooting it, or even get into overclocking and/or flashing the card last night.

I was able to get Middle-earth: Shadow of War to run at 11,520x2160 with the HD res pack and everything turned up except blur and depth of field, and it maintained the 48-60 fps range that the TVs should be covering as G-Sync Compatible displays. So I played that and relaxed before crashing. Definitely happy with the card. That game was using over 10GB of VRAM, but it's also a few years old. I know newer games will use even more VRAM at that resolution, so that's why I went for the 3090 instead of the 3080 Ti.
 

Registered · 12 Posts
EDIT: I just scrolled further through the Steam post and it referred to CRU, which is what I mentioned I was using before. That's what's crashing my computer when I try to use it on the 3090. More info about that in the rest of this post...

That's not the issue I have. In both Windows Display Settings and the NVIDIA Control Panel, the resolution list shows 3840x2160 as native, but above it there is also 4096x2160. Even though 3840x2160 is listed as native, it keeps switching to the other one, especially any time I use NVIDIA Surround with the three displays. It's been a known problem for years, and Custom Resolution Utility has been around for a long time. It's actually a really fantastic utility that lets you access and tweak every property that has to do with the display. In the past it was most popularly known for overclocking monitors and laptop displays, but it covers everything, including audio formats for ARC or eARC, VRR, etc. I've been using it for years with my 1080 Tis to get rid of that extra resolution. Not really sure why I was having problems with it last night. I tried it about five times and it just froze the computer. I've been having a really, REALLY rough week with a lot of personal problems, so I definitely didn't have the patience to keep troubleshooting it, or even get into overclocking and/or flashing the card last night. I was able to get Middle-earth: Shadow of War to run at 11,520x2160 with the HD res pack and everything turned up except blur and depth of field, and it maintained the 48-60 fps range that the TVs should be covering as G-Sync Compatible displays. So I played that and relaxed before crashing. Definitely happy with the card. That game was using over 10GB of VRAM, but it's also a few years old. I know newer games will use even more VRAM at that resolution, so that's why I went for the 3090 instead of the 3080 Ti.
Yeah, sorry, I didn't read your post properly.
 

Invalid Media · 4,787 Posts
Would love to see a comparison between the stock Alphacool pads supplied on their blocks and the Bykski ones, or even EKWB.

Also, since he is comparing Alphacool pads, how hard would it be for him to compare against Thermalright, Gelid, or Fujipoly pads?

Alphacool pads are just overpriced; I won't touch Alphacool pads, fittings, etc. The one thing of theirs that is okay is the radiators.


Hope this helps

Thanks, Jura
Well ...the source article I linked above talked about that:

[attached image 2518516]
 

Registered · 640 Posts
Well, it isn't always 3-5 fps (but I get what you're saying). Not all "stock" 3090s are made equal.

For example, my last bone-stock 2080 Ti Founders Edition on its stock air cooler managed around 36 fps burning through a heavy Port Royal run. After water cooling it, flashing the BIOS, soldering on the 8-ohm resistors, and overclocking the memory and core as far as they would go, that 36 fps average became a 48 fps average (that's huge). Or instead of 60 fps, it's now 80 fps. Plenty of games show even more than that; I've seen as high as 38% in in-game benchmarks.

I needed that extra ~35% performance in games. It became the standard for me, lol.

My main point is that, while any 3090 is fast no matter how bad a sample it may be, there are plenty of 3090s out there that can cough up a whole lot more than 3-5 fps.
Sure, you can get more than a 5 fps boost. Quake II RTX would be the game that shows 10+ fps gains.
The problem is that it takes 700W+ to get there.

If you take the new LEGO Builder's Journey or the Marbles RTX demo, those don't scale past 2115 core or with a memory OC: 0 fps gains.

You also have the new Metro, where 2115 core requires a lot of core voltage, so you won't scale because you'll hit 1.1V early.

I tested a few games, and the difference between 2115 core / +1000 mem and 2190 core / +1500 mem is at most 3 fps. If I push further to 2220 core / +1560 mem, maybe I see +0.5 fps 😂

My general observation is about 1 fps for every +66 MHz on the core and another 1 fps for every +500 on the memory. 🙄
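Put as a throwaway formula (purely my own rule of thumb from those tests, not a universal scaling law; the helper below is just for illustration):

# rough fps-gain estimate from core/mem offsets, per the rule of thumb above
def est_gain_fps(extra_core_mhz, extra_mem_offset):
    return extra_core_mhz / 66 + extra_mem_offset / 500

# e.g. 2115 -> 2190 core (+75 MHz) and +1000 -> +1500 mem (+500):
print(est_gain_fps(75, 500))  # ~2.1 fps, in line with the "at most 3 fps" I measured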

You also have to consider the weird power behaviour we get on the RTX 3090. I have a 66 fps lock and the voltage limited to 1V, yet game menus can spike power to 500W when the game itself doesn't even use 300W. Something to consider if you want to keep the card for a long time at a high OC.
 

The IT guy... · 477 Posts
Hi there

I would suggest getting 1.5mm thermal pads from Gelid or Thermalright.

Bykski uses 1.2mm thermal pads on their blocks, from what I know, and I measured them as well.

Hope this helps

Thanks, Jura
Thanks for the info!

Has anyone tried those EC360® Silver 12W/mK thermal pads?
 

Registered · 28 Posts
Do we have any kind of consensus on how to identify "bad" 3090 FTW3 Ultras, and what's causing it?

I've seen some reports that the pull on the PCIe slot might be an indicator, as it shouldn't exceed the spec (75W). My card pulls a max of 80.2 W from the slot during Time Spy Extreme.

This is my second FTW3. First one died all of a sudden, and was no longer detected. This was at the beginning of the year.
 

Registered · 640 Posts
Do we have any kind of consensus on how to identify "bad" 3090 FTW3 Ultras, and what's causing it?

I've seen some reports that the pull on the PCIe slot might be an indicator, as it shouldn't exceed the spec (75W). My card pulls a max of 80.2 W from the slot during Time Spy Extreme.

This is my second FTW3. First one died all of a sudden, and was no longer detected. This was at the beginning of the year.
I've seen posts where the new FTW3 pulled only 45W from the PCIe slot while being able to use 500W+ of board power.

Not sure if even those new FTW3 boards are also being replaced due to the newer issue of certain games frying them (that Amazon game, New World).
 

Facepalm · 10,225 Posts
Do we have any kind of consensus on how to identify "bad" 3090 FTW3 Ultras, and what's causing it?

I've seen some reports that the pull on the PCIe slot might be an indicator, as it shouldn't exceed the spec (75W). My card pulls a max of 80.2 W from the slot during Time Spy Extreme.

This is my second FTW3. First one died all of a sudden, and was no longer detected. This was at the beginning of the year.
All bad 3090 FTW3s have a rev 0.1 stamped next to the PCIe slot.
Or rather, all potentially bad ones.
All of the good ones are rev 1.0.
The good ones have a digital VRM controller; the bad ones have an analog controller.

This apparently applies to the 3080s as well.
 

Registered · 13 Posts
Hi, has anyone flashed this BIOS (Zotac RTX 3090 VBIOS) on a Zotac Trinity?
That card has three 8-pin connectors and my Trinity has two. I want to know if flashing this BIOS will increase the power limit. I tried the XOC BIOS, but it had problems with one of the programs I use (3ds Max), so I flashed back the original Zotac Trinity BIOS.
 

Registered · 640 Posts
All bad 3090 FTW3s have a rev 0.1 stamped next to the PCIe slot.
Or rather, all potentially bad ones.
All of the good ones are rev 1.0.
The good ones have a digital VRM controller; the bad ones have an analog controller.

This apparently applies to the 3080s as well.
There are a lot of reference 3090s (not EVGA) with analog controllers.
I'm assuming this is not a sign that all analog controller designs are bad?
 

Facepalm · 10,225 Posts
There are a lot of reference 3090s (not EVGA) with analog controllers.
I'm assuming this is not a sign that all analog controller designs are bad?
I'm talking about EVGA 3080s and 3090s, not other AIBs.
 

Registered · 624 Posts
Hi, has anyone flashed this BIOS (Zotac RTX 3090 VBIOS) on a Zotac Trinity?
That card has three 8-pin connectors and my Trinity has two. I want to know if flashing this BIOS will increase the power limit. I tried the XOC BIOS, but it had problems with one of the programs I use (3ds Max), so I flashed back the original Zotac Trinity BIOS.
Nope, it won't give you a higher PL; probably lower. The only 2x8-pin upgrades are the 390W Gigabyte/Galax/etc. BIOSes, or the KP XOC 1kW.
 

Registered · 1,260 Posts
Hey everyone, I ordered some Fujipoly 1.5mm and 0.5mm 17 W/mK thermal pads.

Is it OK to stack these pads to make 2mm thermal pads?

I am gonna swap the thermal pads on my Kingpin Hydro Copper.

I'm after the best pads possible, so I'm just curious to hear input on this. Thank you, everyone!
 