Overclock.net
521 - 540 of 1,702 Posts
ahahah I need RGB fans. I will change the orientation of the rad once I get 140mm RGB fans
You know better what is good for you; this is good info and it's explained well visually. I'm just sharing good stuff, and anyone can do whatever they feel like doing. (For me it helped twice: once with the noise, and a second time there was no noise but I assumed there were air bubbles at the CPU block, so I tilted and wiggled the whole PC case multiple times, and the load temperature dropped a few degrees.)
 
You are no. 1 https://www.3dmark.com/spy/17376019

You are the one who took my 1st spot.
I can beat you if I want to btw
Hi!

It was me :)

Sapphire Nitro+ 6800 XT OC card

[attachment 2474950]


The way it could happen was that, somehow, with a 2690-2790MHz range the card did not boost to 2790MHz; it stayed around 2735MHz, and even though it crashed in the 3DMark demo, Graphics Tests 1-2 would only finish maybe once in ten tries.
If a voltage/frequency curve editor plus a BIOS flash to get to 1175mV (6900 XT BIOS) becomes possible, it could be stabilized well above 2750MHz.
Currently I tested Metro Exodus (the picture below is a 20min run); I got a crash after 40mins, with the GPU clock continuously around 2740-2760MHz (2700-2800MHz Wattman setting). The "problem" is that the power consumption for the whole PC (without monitor) is around 620W measured at the wall... (with an i7-8700K at 1.245V @ 4.7GHz)
The max I saw in 3DMark Time Spy was 689W. I can't go further; my PSU is only 750W. The sweet spot would be around 1.05V and 2600MHz average :)
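As a rough sanity check on that sweet spot, dynamic power scales roughly with V²·f. Here is a minimal Python sketch; purely illustrative, the 349W starting point and voltage/clock figures are the ones mentioned in this thread, and the quadratic model ignores static/leakage power and board losses:

```python
# Rough estimate of GPU power at a new voltage/frequency operating point,
# assuming dynamic power dominates (P ~ V^2 * f). Purely illustrative.

def scale_power(p0_watts, v0_mv, f0_mhz, v_mv, f_mhz):
    """Scale a measured power draw to a new voltage/frequency point."""
    return p0_watts * (v_mv / v0_mv) ** 2 * (f_mhz / f0_mhz)

# Assumed baseline: ~349W GPU power at 1150mV / 2750MHz (thread figures).
# Estimate the "sweet spot" above: 1050mV / 2600MHz.
estimate = scale_power(349, 1150, 2750, 1050, 2600)
print(f"Estimated GPU power: {estimate:.0f}W")  # roughly 275W
```

Under this simplistic model, dropping from 1150mV/2750MHz to 1050mV/2600MHz puts the GPU around 275W, which matches the intuition that the sweet spot saves a lot of power for a modest clock loss.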
[attachment 2474951]


[attachment 2474953]
[attachment 2474955]
 
Congrats (y)

What did you set in MPT for GPU power? Isn't 349W for the GPU alone too much for a 2x8-pin card?
My settings were crap at the time, but here are the screenshots.
[attachment 2474956]
[attachment 2474957]


Honestly, if you check HWiNFO64, the card never draws more than 300A, so this setting is useless.
The TDC Limit SoC is also useless I think; I never saw 25A or more, if I remember correctly.
The only thing you should change is the Power Limit GPU, and hope that with the Wattman settings you don't get boosted to your max range and you stay at the average.

If you change Maximum SoC voltage to, let's say, 1050mV, you crash around a 2680MHz average. The higher it is, the more stable, but I don't think it can go much higher.

Same goes for Maximum Voltage GFX: 6900 XT owners can set 1175mV, but if you do it on the 6800 XT, you get a real maximum of 1018mV (GPU-Z and HWiNFO64 report it correctly). Same story for non-XT 6800 owners: they can only go to 1050mV if I am correct (but starting from 1025mV!)
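A tiny sketch of the clamping behavior described above: whatever the slider (or a flashed BIOS) lets you enter, the card runs at a per-SKU effective ceiling. The millivolt values here are the ones reported in this post, not official AMD specs:

```python
# Per-SKU effective GFX voltage ceilings, as reported in this thread
# (not official figures). The requested voltage is clamped to these.
EFFECTIVE_VMAX_MV = {
    "6800": 1050,     # slider reportedly starts from 1025mV
    "6800 XT": 1018,  # real max seen in GPU-Z / HWiNFO64
    "6900 XT": 1175,
}

def effective_voltage(card, requested_mv):
    """Return the voltage the card would actually run at."""
    return min(requested_mv, EFFECTIVE_VMAX_MV[card])

print(effective_voltage("6800 XT", 1175))  # clamped to 1018
```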

I just flashed a Sapphire Nitro+ 6900 XT BIOS to my 6800 XT; it's not working with amdvbflash v3.15. The card is still recognized as a 6800 XT and there is no display output. (I saw all this booted from my iGPU, a UHD 630.)
I flashed back to the original BIOS, but still no display output; I had to switch to the second BIOS on the card. Just saving you the headache :)
 
btw use HWiNFO64 for more accurate sensor readings
I bought 3DMark on Steam, so now I can skip the demo part, get my nickname next to the score, and it's only Graphics Tests 1-2 + the CPU Test. It's Time Spy.

I use HWiNFO64 for cross-checking. What I need is easier to see in GPU-Z (the GPU clock diagram).
I saw the same readings in both programs.

The point increase was probably possible because I previously had an external Wi-Fi card, so the graphics card only had a PCIe 3.0 x8 link; now only the GPU is installed, so it gets the full PCIe 3.0 x16 link :)
There isn't any other reason why I got more points; especially since, checking the GPU clock rate, this time it was -10MHz for sure!
 
This is why I will never touch an AMD card again. You never know if it works as intended.

AMD is a waste of time, and no one should have to deal with their buggy software or cards and the limitations.

On the contrary, Nvidia is perfect. If you boost to 2200MHz, you get 2200MHz performance. There is no bug. There is no more than a 0.01% performance difference. It's a 10% difference for AMD, which is a real shame.
 
What are you on about? The downclocking is completely normal; it's just P-states doing their thing.

The issue I personally had is a bug, but it is easy to work around now, and it will be fixed in the future. Trust me, Nvidia's drivers have -not- been bug-free since their launch.
 
At 2750MHz you can get 90fps or 105fps with an AMD card on the same system. There shouldn't be that wide a margin for the same GPU frequency with the same system and power limits.

If you are lucky, you will get 106fps. If you are not, you will get 89fps.

For Nvidia it works more than perfectly: with 2200MHz and a 3080, everybody gets "about the same" performance.

For AMD you have to deal with their meaningless limitations, power settings, sensor readings, etc. It shouldn't be rocket science.
 
this is not rocket science; this is dictatorship, this is totalitarianism, this is stupidity for people who will accept anything as long as it makes them happy. The first posts I wrote and then deleted were about this very thing, but who cares, or knows, or even understands what I am talking about? rofl.
 
I don't see an issue with performance consistency here, nor have any major reviews of the 6000 series reported this. If your personal experience differs, then return the card and go Nvidia. Smack-talking AMD over minutiae that people overall don't experience is... yelling into the æther. Reddit is the place for that.
 

Here you can see that while the same profile is already active it's 91fps, and after re-applying the very same profile it's 99fps?? And the frequency drops from 2750MHz to 2700MHz with a 10% performance increase?

There are too many issues with AMD. This isn't Reddit material; these are limits, driver, and power-stage issues, which should be discussed here. You never know whether you are getting the expected performance for your power draw or frequency.

By the way, why do we have to set a 100MHz difference between min and max frequency? Their conservative behavior starts right there.
 
No one talks about it, so it's not real; it's fake news, because people don't talk about it.
 

So you're arguing a point, generalizing about an entire generation of GPUs... from a single, non-scientific, unrepeated performance anomaly. Got it.
 
No, it's always like that for everybody. They just haven't realized it, because if you don't overclock past a certain point it will work. But beyond a certain point it goes crazy, like in my video. And you can be sure it's not limited to a single GPU (y)

The only software for OC is not working properly. Their sliders, frequencies, and power limits just don't make sense. Why do we have to choose a 100MHz difference between min and max? Did you ever think about it? For Nvidia there is only one slider, which works perfectly. If you dig deeper than that, you can get voltage-vs-frequency graphs easily.

For AMD, people are still resorting to flashing, BIOS editing, power-stage changes, reverse engineering, etc. This shouldn't happen to anybody when Nvidia is an option; it's a waste of the world's resources.
 
If you've ever had experience seriously overclocking an NVidia GPU and all this doesn't sound familiar, then I don't know what to say.

What you are saying about simplicity was the case back when the 290x and 780ti were the hot stuff out there. You set a clock, set a voltage and that was it. Ever since then, it's been a crapshoot with all the different algorithms for both companies. The big difference is that NVidia doesn't let you control basically anything apart from the clock. Try upping the voltage on NVidia without bios mods, good luck with that.

As for the inconsistent performance, I beg to disagree. No matter the GPU, there's a million things that can cause a system to have inconsistent performance from time to time. But when I control all variables I can, my 5700 XT performs exactly the same every time, all the time. RDNA2 is a new architecture and we may be seeing some driver bugs here and there, but nothing major like when RDNA launched and it was a mess or when NVidia released a driver a couple of years ago that was actually bricking perfectly working cards.

And lastly, concerning that comment about the min and max clock sliders being 100MHz apart, that's to minimize clock fluctuations under load so that the clocks stay more consistent. People do that on NVidia too by editing the frequency curve, nothing new here.
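The effect of that narrow min/max window can be sketched like this; a toy model, not AMD's actual boost algorithm, with hypothetical boost targets:

```python
# Toy illustration of Wattman-style min/max clock sliders: the boost
# algorithm picks a target clock each interval, and the sliders clamp it
# into a band. A narrow band (e.g. 2700-2800MHz) keeps effective clocks
# more consistent than a wide one. Not AMD's real algorithm.

def clamp_clock(target_mhz, min_mhz, max_mhz):
    """Clamp a boost target into the [min, max] window."""
    return max(min_mhz, min(target_mhz, max_mhz))

boost_samples = [2550, 2820, 2760, 2640, 2790]  # hypothetical targets

wide   = [clamp_clock(c, 2400, 2900) for c in boost_samples]
narrow = [clamp_clock(c, 2700, 2800) for c in boost_samples]

print("wide band:  ", wide)    # clocks swing with the boost algorithm
print("narrow band:", narrow)  # clocks all stay within a 100MHz window
```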

TL;DR: You are complaining about how complex complex overclocking is on AMD and comparing it to how simple simple overclocking is on NVidia. Compare apples to apples: complex vs complex and simple vs simple. Ask the guys who bios mod their cards to get more voltage on NVidia if there's anything user-friendly about that. Try just moving the sliders on AMD without messing with other parameters. And more importantly, have fun with your toys, game on them, don't just complain on the internet.
 