
·
Registered
Joined
·
397 Posts
Yeah I reckon this confirms they’re noting factory tested overclocks.

Out of interest, what's your "effective clock" with that curve, according to HWiNFO? I've found that with a curve set like that, Afterburner misrepresents the real (i.e. effective) clock speed I'm actually achieving by as much as 100-150MHz.
For a V/F curve OC like the one in the photo, for benching you'd need to add NVVDD/MSVDD voltage to raise the internal clocks. It's important to test the requested clocks, because those are the ones that can crash; the internal clocks only add performance. For daily use, a normal OC offset plus the dip switches seems to be the best option.
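If anyone wants to see that requested-vs-effective gap for themselves, here's a rough Python sketch that just logs the driver-reported (requested) graphics clock once a second while a benchmark runs, so you can line it up against HWiNFO's "effective clock" column afterwards. It only assumes nvidia-smi is on your PATH; the file name and interval are placeholders, so change them to whatever you like.

```python
# Logs the driver-reported (requested) graphics clock once per second.
# Compare the result against HWiNFO's "effective clock" to see how far apart they are.
import csv
import subprocess
import time

def requested_clock_mhz() -> int:
    """Query nvidia-smi for the current graphics clock in MHz (first GPU)."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=clocks.gr", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return int(out.stdout.strip().splitlines()[0])

def log_clocks(path: str = "requested_clock_log.csv",
               seconds: int = 300, interval: float = 1.0) -> None:
    """Write elapsed time and requested clock to a CSV while your benchmark runs."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["elapsed_s", "requested_clock_mhz"])
        start = time.time()
        while time.time() - start < seconds:
            writer.writerow([round(time.time() - start, 1), requested_clock_mhz()])
            f.flush()
            time.sleep(interval)

if __name__ == "__main__":
    log_clocks()
```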
 

·
Registered
Joined
·
1,260 Posts
Interesting. My KP has the same marks: 2 dots on the top left corner, a "1" on the top right corner and "1995" on the lower right corner.

Could you test your chip with ATI Tool? Make sure to max out the voltage slider and let idle temps settle at 22-23C before opening the V/F curve. The 1.10V point should be 2055-2070MHz at stock. Then add 270-285 to that voltage point and click run; it should hold 2325-2340MHz at 30-31C.
I will try later today if I get some more free time.

1995MHz does make sense. If I run the Port Royal stress test, my frequency starts out at 1995MHz; after a few minutes the card hits 2010MHz, then goes to 2025MHz and locks in there.


No overclocking or additional voltage.


But yes I will try that. Let me get my temps in check. I just got it on my loop.





 


·
Registered
Joined
·
1,260 Posts
The PR score :)
Coming soon. Unfortunately I am working right now, staring at a 3090 KP HC while logged in to a super slow remote system for the next 6-7 hours, so no Port Royal yet.
 

·
Registered
Joined
·
1,260 Posts
Just wanted to say that my memory temps are ludicrous! The hottest I saw was about 55C on the back side, and my backplate is cooking like never before! It hits right about the same temp as the memory does; too hot to touch, that's for sure. I imagine with a fan blowing on it I may be able to get the back GDDR6X modules into the very low 50s or under 50. Now I see why people install heatsinks and fans on these backplates. It looked kinda silly at first seeing pictures in the forums of people doing that, but on my Hybrid the backplate just didn't get nearly this hot.

I am only using the thermal pads included with the KP Hydro Copper kit. They work amazingly well. I cleaned the crap out of the mem modules with alcohol and laid all the pads down just so.

My memory was hitting 75C on the back before the hydro copper block. So going down to 55 from that is huge!
 
  • Rep+
Reactions: GRABibus

·
Registered
Joined
·
637 Posts
Just wanted to say that my memory temps are ludicrous! The hottest I saw was about 55C on the back side, and my backplate is cooking like never before! It hits right about the same temp as the memory does; too hot to touch, that's for sure. I imagine with a fan blowing on it I may be able to get the back GDDR6X modules into the very low 50s or under 50. Now I see why people install heatsinks and fans on these backplates. It looked kinda silly at first seeing pictures in the forums of people doing that, but on my Hybrid the backplate just didn't get nearly this hot.

I am only using the thermal pads included with the KP Hydro Copper kit. They work amazingly well. I cleaned the crap out of the mem modules with alcohol and laid all the pads down just so.

My memory was hitting 75C on the back before the hydro copper block. So going down to 55 from that is huge!
Test a mining benchmark or Quake RTX; it's usually 10C more vs regular usage if the contact is good.

I'm at 64C for the back mem with a heatsink + fan on my EK backplate.
 

·
Registered
Joined
·
1,260 Posts
Test a mining benchmark or Quake RTX; it's usually 10C more vs regular usage if the contact is good.

I'm at 64C for the back mem with a heatsink + fan on my EK backplate.
I had about 30 minutes to spare before work today, so I ran the Port Royal stress test. That's when I saw the back modules were at 55C. Never tried Quake RTX before.


I logged GPU-Z to a file before and after going from the Hybrid to the Hydro Copper. I haven't looked through the results thoroughly yet, but I'm interested for sure. The before and after memory temps were the most impressive change of anything.
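For anyone doing the same before/after comparison, below is a quick and dirty way to pull the hottest memory reading out of two GPU-Z sensor logs. It's only a sketch: it assumes GPU-Z's "Log to file" output (comma-separated, one sensor per column) and a sensor column named something like "Memory Junction Temperature", and the two file names are placeholders, so check your own log header and paths before trusting it.

```python
# Rough sketch: pull the hottest memory reading out of two GPU-Z sensor logs.
# Assumes GPU-Z "Log to file" output: comma-separated, one sensor per column.
import csv

def max_sensor(path: str, keyword: str = "Memory Junction Temperature") -> float:
    """Return the highest value in the first column whose header contains `keyword`."""
    with open(path, newline="", encoding="utf-8", errors="ignore") as f:
        reader = csv.reader(f)
        header = [name.strip() for name in next(reader)]
        try:
            col = next(i for i, name in enumerate(header) if keyword.lower() in name.lower())
        except StopIteration:
            raise KeyError(f"No column matching {keyword!r}; check the log header") from None
        values = []
        for row in reader:
            if col < len(row) and row[col].strip():
                try:
                    values.append(float(row[col]))
                except ValueError:
                    pass  # skip non-numeric rows
    return max(values) if values else float("nan")

# File names are placeholders for the two logging sessions.
before = max_sensor("gpuz_hybrid_log.txt")
after = max_sensor("gpuz_hydrocopper_log.txt")
print(f"Hottest memory temp: {before:.0f} C (Hybrid) vs {after:.0f} C (Hydro Copper)")
```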
 

·
Registered
Joined
·
305 Posts
It was bad or defective solder. More precisely, it was the VRAM chip next to the PCIe slot that was losing its connection (which is why, when the card cooled down, e.g. to be shipped out, it suddenly worked again).
That same chip is the hottest VRAM chip on Ampere cards.
...no problem accepting that, though 'bad solder' is still a QC issue at the end of the day. However, I'm steering more towards the 'expectations' of a consumer acting on imperfect information, and this latest Amazon New World game beta issue doesn't help.
Well, I never came back to check what the issue really was; thank you for pointing that out.

At the time, Charlie from hardforum was following the matter, but he never disclosed what the issue was; he later joined Intel, then left.
We were arguing, at the time, about whether it was the memory controller or the memory chips themselves.
It seems I was wrong; I bet on the GPU memory controller.

Though it doesn't surprise me if this issue arose from bad QA/QC on the soldering.
Nvidia's soldering job is pretty awful, leaving empty exposed copper pads and cheaping out on solder quantity and quality, which is not great on a $2k card. 🙆‍♀️
(Purely and simply from an engineering and manufacturing point of view.)
 

·
Facepalm
Joined
·
10,214 Posts
Well, I never came back to check what the issue really was; thank you for pointing that out.

At the time, Charlie from hardforum was following the matter, but he never disclosed what the issue was; he later joined Intel, then left.
We were arguing, at the time, about whether it was the memory controller or the memory chips themselves.
It seems I was wrong; I bet on the GPU memory controller.

Though it doesn't surprise me if this issue arose from bad QA/QC on the soldering.
Nvidia's soldering job is pretty awful, leaving empty exposed copper pads and cheaping out on solder quantity and quality, which is not great on a $2k card. 🙆‍♀️
(Purely and simply from an engineering and manufacturing point of view.)
It was a German solderer on Elmor's Discord who found out. It happened on his own 2080, and he ran a test (Linux or some inline I2C tool) that showed memory errors at a certain location; he then checked the soldering and saw the problem on one VRAM chip. I think his name is oldirdey. He has a 2080 Ti shunt-mod soldering video on YouTube.
 
  • Rep+
Reactions: 1devomer


·
Registered
Joined
·
1,340 Posts
Anyone tried the new Amazon MMO New World with your 3090 FE card? I'm scared to play it and wonder if they have fixed the issue yet.
 

·
Registered
Joined
·
72 Posts
So, I was finally able to sell one of my kidneys and pick up a 3090 today. Initially I ran over to pick up the MSI RTX 3090 Suprim X, but I noticed they had a Gigabyte Xtreme there as well.

I have a rather rare/unique gaming setup: three 55" Samsung RU8000 4K TVs in surround mode. These are the 2019 TVs that came with FreeSync before Samsung ditched it for the last two generations, but I had never been able to test it, and I couldn't find anyone who had tested the new 30 series (or even done proper testing with the 20 series) against the Samsung RU8000 with VRR. Before this, I was running a pair of GTX 1080 Tis in SLI.

Anyway, Gigabyte is the only brand with some cards that have 3 HDMI ports, so as much as I hate Gigabyte (good products; horrible, HORRIBLE customer support/service), I decided to try it anyway in hopes that the HDMI would work (I bought three Club 3D DP1.4 to HDMI 2.0b adapters from Amazon a few days ago, but they don't support VRR). Got it home, hooked it up, tinkered around with it, and crashed my computer repeatedly trying to use CRU (Custom Resolution Utility) to get rid of that stupid extra 4K resolution that the TVs advertise but never display, before saying F*** it and just working around it. What I did notice is that when I turn on Game Mode and FreeSync Ultimate on the TVs, NVIDIA Control Panel recognizes them as G-Sync compatible displays, including when I have them in surround mode.

So, I'm gonna get around to trying to OC it later on. But I've been confused about the custom ROMs on the first page. On the list, under Power Limit, mine says 420/450. What does that mean? It's a 420 card, but with the ROM it goes to 450?

Also, I asked before but didn't get an answer: do the new ROMs have Resizable BAR support, or is that something separate I have to flash?
 
  • Rep+
Reactions: Falkentyne

·
Registered
Joined
·
1,260 Posts
So, I was finally able to sell one of my kidneys and pick up a 3090 today. Initially I ran over to pick up the MSI RTX 3090 Suprim X, but I noticed they had a Gigabyte Xtreme there as well.

I have a rather rare/unique gaming setup: three 55" Samsung RU8000 4K TVs in surround mode. These are the 2019 TVs that came with FreeSync before Samsung ditched it for the last two generations, but I had never been able to test it, and I couldn't find anyone who had tested the new 30 series (or even done proper testing with the 20 series) against the Samsung RU8000 with VRR. Before this, I was running a pair of GTX 1080 Tis in SLI.

Anyway, Gigabyte is the only brand with some cards that have 3 HDMI ports, so as much as I hate Gigabyte (good products; horrible, HORRIBLE customer support/service), I decided to try it anyway in hopes that the HDMI would work (I bought three Club 3D DP1.4 to HDMI 2.0b adapters from Amazon a few days ago, but they don't support VRR). Got it home, hooked it up, tinkered around with it, and crashed my computer repeatedly trying to use CRU (Custom Resolution Utility) to get rid of that stupid extra 4K resolution that the TVs advertise but never display, before saying F*** it and just working around it. What I did notice is that when I turn on Game Mode and FreeSync Ultimate on the TVs, NVIDIA Control Panel recognizes them as G-Sync compatible displays, including when I have them in surround mode.

So, I'm gonna get around to trying to OC it later on. But I've been confused about the custom ROMs on the first page. On the list, under Power Limit, mine says 420/450. What does that mean? It's a 420 card, but with the ROM it goes to 450?

Also, I asked before but didn't get an answer: do the new ROMs have Resizable BAR support, or is that something separate I have to flash?

The default power limit is 420 watts, and with overclocking programs like PX1 or MSI Afterburner you can raise that limit to 450 watts.

So 420 watts is 100% (the default), and the power slider in most overclocking utilities lets you go above that; 450 watts works out to about 107%.
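If you want to sanity-check that number yourself, the slider math is just this (values from above; your card's limits may differ):

```python
# Back-of-the-envelope check of the power slider math quoted above.
default_w = 420   # stock power limit = 100% on the slider
max_w = 450       # ceiling exposed by the BIOS / OC tools

slider_percent = max_w / default_w * 100
print(f"{max_w} W is {slider_percent:.1f}% of the {default_w} W default")
# -> 450 W is 107.1% of the 420 W default
```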
 

·
Registered
Joined
·
72 Posts
If anyone wants 15% off GELID pads (I'd go for the Extreme; they're really soft and compress much better than the Ultimate and Thermalright pads), use this code in the official Gelid store.

FRIEND-8JQVPBX

Awesome. I was checking these out on Amazon and Newegg earlier today. Do they make a big difference?
 

·
Registered
Joined
·
72 Posts
The default power limit is 420 watts, and with overclocking programs like PX1 or MSI Afterburner you can raise that limit to 450 watts.

So 420 watts is 100% (the default), and the power slider in most overclocking utilities lets you go above that; 450 watts works out to about 107%.
ah, gotcha. Awesome, thanks for the info.
Do people get much of a boost going to the 1000W BIOS? Over 800 pages to go through, lol.
 

·
SAY AMBIENT AGAIN! SAY IT
Joined
·
304 Posts
Do people get much of a boost going to the 1000W BIOS?
I don't have any experience with the 1kW BIOS options, but I am running the KPE 520W ReBAR BIOS with a lot of success.

15595 PR

23191 Graphics Score Time Spy

 

·
Registered
Joined
·
1,260 Posts
I don't have any experience with the 1kW BIOS options, but I am running the KPE 520W ReBAR BIOS with a lot of success.

15595 PR

23191 Graphics Score Time Spy

Yeah, I run the same BIOS on my KP, though I just moved over to the normal BIOS for gaming. My 3090 KP is running pretty cool on the Hydro Copper with the normal BIOS (no need to OC a card like this for gaming, lol); 2025 is plenty.

Very fast! I love it!
 

·
Registered
Joined
·
2,542 Posts
....fyi, Igor's lab has an interesting comparison of thermal pads with different W/mK ratings on GDDR6X ...

source

Would love to see a comparison between the stock Alphacool pads supplied on their blocks and Bykski or even EKWB ones.

Also, since he's already comparing Alphacool pads, how hard would it be for him to test Thermalright, Gelid or Fujipoly pads as well?

Alphacool pads are just overpriced; I won't touch Alphacool pads, fittings, etc. The one thing of theirs that's okay is the radiators.

Hope this helps

Thanks, Jura
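On the W/mK ratings in that Igor's Lab comparison quoted above: if you want a rough feel for what the number means, the conduction formula delta_T = P * t / (k * A) gives the temperature drop across a pad. The numbers below are purely illustrative guesses for a single GDDR6X package and pad thickness, not anything taken from Igor's data:

```python
# Illustrative only: temperature drop across a thermal pad, delta_T = P * t / (k * A).
# Module power, footprint and pad thickness are rough guesses, not measurements.
power_w = 2.5            # heat through one GDDR6X package, roughly
area_m2 = 0.014 * 0.012  # ~14 mm x 12 mm footprint
thickness_m = 0.0015     # 1.5 mm pad

for k in (6.0, 12.5, 17.0):  # W/mK ratings in the ballpark of the compared pads
    delta_t = power_w * thickness_m / (k * area_m2)
    print(f"{k:>4} W/mK pad -> ~{delta_t:.1f} C drop across the pad")
```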
 