Overclock.net › Forums › Graphics Cards › AMD/ATI › [Official] MSI R9 290X Lightning Thread

[Official] MSI R9 290X Lightning Thread - Page 217

post #2161 of 2769
Quote:
Originally Posted by sammarbella View Post

Quote:
Originally Posted by rt123 View Post

Wait, is 1571 your core voltage?

Quote:
Originally Posted by tsm106 View Post

Looks like memory and core are reversed. It makes sense if you flip them.

I took the measurements following the v-check points in the official MSI pic:



I took the core (GPU) measurement on the v-check point farthest from the 8-pin connectors (the last one).

So the error is in the pic!

This pic can't even be from an MSI 290X Lightning PCB, because it shows only 2 power connectors and our GPUs have 2x8 and 1x6!

Yeah, the last two should be flipped in the image.
It's also written on the back of the PCB which is which, but I'm not sure if you can see that with the backplate on.
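If it helps anyone else probing the v-check points, the fix described above boils down to swapping the last two labels from the pic. A toy sketch (the label names and their printed order are made up for illustration, not taken from the card):

```python
# Labels in the order the official pic prints them (names are illustrative).
# Per the posts above, the last two are swapped relative to the actual PCB.
pic_labels = ["aux", "vddci", "core", "memory"]

def corrected_labels(labels):
    """Swap the last two labels to match the silkscreen on the back of the PCB."""
    fixed = list(labels)
    fixed[-1], fixed[-2] = fixed[-2], fixed[-1]
    return fixed
```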
post #2162 of 2769
Quote:
Originally Posted by rt123 View Post

Yeah, the last two should be flipped in the image.
It's also written on the back of the PCB which is which, but I'm not sure if you can see that with the backplate on.

With the backplate on I see...the backplate!

So after correcting this, how do my voltage readings look?

Tomorrow I will post readings at +200/+300 on the core.
post #2163 of 2769
4K VSR resolution added to 290X GPUs in 15.3 drivers! (MOD)

I posted a how-to here:

http://www.overclock.net/t/1547184/amd-catalyst-15-3-beta-driver-for-windows-os/310#post_23722424




4K VSR works on the desktop with or without CrossFire.

4K VSR works fine in games only with CrossFire disabled; if it's enabled, the right side of the screen flickers constantly.

It's possible to game with 4K VSR and CrossFire enabled globally only if a specific profile for the game has CrossFire disabled.
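The rule above reduces to: 4K VSR in a game only works when CrossFire is effectively off for that game, either globally or via the game's profile override. A toy sketch of that logic (behavior as reported for the 15.3 beta, parameter names are mine):

```python
def vsr_4k_game_works(crossfire_global, profile_crossfire=None):
    """4K VSR in a game works only when CrossFire is effectively off for it:
    either disabled globally, or overridden off by the game's specific profile.
    (Behavior as reported here for the 15.3 beta driver.)"""
    effective = profile_crossfire if profile_crossfire is not None else crossfire_global
    return not effective
```

So with CrossFire on globally, only games whose profile disables CrossFire can run 4K VSR cleanly.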
Edited by sammarbella - 3/27/15 at 9:03am
post #2164 of 2769
4K VSR finally fully working, including in games with CrossFire:

http://forums.guru3d.com/showpost.php?p=5039386&postcount=366

post #2165 of 2769
Quote:
Originally Posted by tsm106 View Post


That said on topic, move not only your memory clock but your core clocks around. Running flat out is not always the fastest. That's the key to remember.

Want to hear something weird? I got my highest score in 3DMark the other day. Went back to my other results just to compare the proportional gain, and the clock values on my highest-scoring run said I was running at stock speeds, not the real speeds I was running at. Do you know if there is a bug in 3DMark that doesn't accurately record your clocks?

Check it out:

http://www.3dmark.com/fs/4386463

That says I was running my GPU at stock speeds. I wasn't. The core clock was at 1190 and memory was at 1600. So it scored accordingly but read my system info incorrectly. Oddly enough, I ran other tests just before that and they all correctly logged my clock values on the GPU.
post #2166 of 2769
I had that problem before but it miraculously disappeared after I reinstalled my GPU drivers.

Stock:
http://www.3dmark.com/fs/4048952

OC (same apparent frequencies, higher score):
http://www.3dmark.com/fs/4175220

Now fixed:
http://www.3dmark.com/fs/4354038
Sarcina Mk. II (14 items) · Sarcina (12 items)

CPU: AMD Phenom II X4 955 BE | Motherboard: MSI 870A-G54 | Graphics: EVGA GTX 460 | RAM: 2 x 2GB Patriot/Kingston
Cooling: XSPC Rasa RS240 | OS: Windows 7 (64 bit) | Monitor: Standard Dell Monitor | Keyboard: KBC Poker II
Power: XFX XXX Edition 650W | Case: CM Storm Scout | Mouse: Logitech MX 518 | Audio: Audio Technica ATH-M50
post #2167 of 2769
Quote:
Originally Posted by DMatthewStewart View Post

Quote:
Originally Posted by tsm106 View Post


That said on topic, move not only your memory clock but your core clocks around. Running flat out is not always the fastest. That's the key to remember.

Want to hear something weird? I got my highest score in 3DMark the other day. Went back to my other results just to compare the proportional gain, and the clock values on my highest-scoring run said I was running at stock speeds, not the real speeds I was running at. Do you know if there is a bug in 3DMark that doesn't accurately record your clocks?

Check it out:

http://www.3dmark.com/fs/4386463

That says I was running my GPU at stock speeds. I wasn't. The core clock was at 1190 and memory was at 1600. So it scored accordingly but read my system info incorrectly. Oddly enough, I ran other tests just before that and they all correctly logged my clock values on the GPU.

Quote:
Originally Posted by LolCakeLazors View Post

I had that problem before but it miraculously disappeared after I reinstalled my GPU drivers.

Stock:
http://www.3dmark.com/fs/4048952

OC (same apparent frequencies, higher score):
http://www.3dmark.com/fs/4175220

Now fixed:
http://www.3dmark.com/fs/4354038


It's due to your settings config. If you want to show your clocks and you use AB, set the windoze power settings to High Performance and then disable the profiles within AB. This will force/hold the clocks you set. Then apply whatever clock setting in AB and it will not go idle. These clocks will then show when FSI checks your settings. The reason it doesn't always show is that the GPU drops to idle clocks while FSI is checking your settings. You'll also see this idle clock in your displayed CPU clock speed too.
The 3930 (26 items) · Junior's 3930 (22 items) · DATA/HTPC (20 items)
post #2168 of 2769
Quote:
Originally Posted by tsm106 View Post


It's due to your settings config. If you want to show your clocks and you use AB, set the windoze power settings to High Performance and then disable the profiles within AB. This will force/hold the clocks you set. Then apply whatever clock setting in AB and it will not go idle. These clocks will then show when FSI checks your settings. The reason it doesn't always show is that the GPU drops to idle clocks while FSI is checking your settings. You'll also see this idle clock in your displayed CPU clock speed too.

I've had the Windows power settings set to High Performance since I installed it. When you say to disable the profiles in AB, are you talking about the 5 quick-set profiles that we manually save?

Also, would you recommend forcing constant voltage? It's one of the only settings that I constantly get two different answers for. I would assume that I wouldn't need to force the voltage because it's going to use what it needs when it gets under load.

BTW, what is this Lite edition of this card? It seems to be exactly the same except the core clock is only 1030. Are there other differences that I'm missing?

--> http://www.amazon.com/MSI-R9-290X-LIGHTNING-NEW/dp/B00UGF7Y3K/ref=sr_1_2?ie=UTF8&qid=1427581905&sr=8-2&keywords=msi+r9+290x+lightning

PS: I should've bought 4 or 5 of my Lightnings when they were on sale. I bought two and sold one for a profit. Now I wish I had kept it.
post #2169 of 2769
I have the same problem: low scores on benchmarks when they should be higher.

Always on the performance power plan and forcing constant voltage in AB.

?¿?
post #2170 of 2769
Quote:
Originally Posted by DMatthewStewart View Post

Quote:
Originally Posted by tsm106 View Post


It's due to your settings config. If you want to show your clocks and you use AB, set the windoze power settings to High Performance and then disable the profiles within AB. This will force/hold the clocks you set. Then apply whatever clock setting in AB and it will not go idle. These clocks will then show when FSI checks your settings. The reason it doesn't always show is that the GPU drops to idle clocks while FSI is checking your settings. You'll also see this idle clock in your displayed CPU clock speed too.

I've had the Windows power settings set to High Performance since I installed it. When you say to disable the profiles in AB, are you talking about the 5 quick-set profiles that we manually save?

Also, would you recommend forcing constant voltage? It's one of the only settings that I constantly get two different answers for. I would assume that I wouldn't need to force the voltage because it's going to use what it needs when it gets under load.

BTW, what is this Lite edition of this card? It seems to be exactly the same except the core clock is only 1030. Are there other differences that I'm missing?

--> http://www.amazon.com/MSI-R9-290X-LIGHTNING-NEW/dp/B00UGF7Y3K/ref=sr_1_2?ie=UTF8&qid=1427581905&sr=8-2&keywords=msi+r9+290x+lightning

PS: I should've bought 4 or 5 of my Lightnings when they were on sale. I bought two and sold one for a profit. Now I wish I had kept it.

Quote:
Originally Posted by sammarbella View Post

I have the same problem: low scores on benchmarks when they should be higher.

Always on the performance power plan and forcing constant voltage in AB.

?¿?


Obviously force constant voltage is useless for this, and lol in most situations imo.

You disable the profiles in the settings where you choose to assign your 2D/3D profiles. If those are set to anything, AB will do as instructed and try to run X profile given the present "load state", so if the card is idle while FSI checks your clocks, AB will set it to idle clocks, which show up as stock clocks. The windoze power options and the profiles must both be set, because either can change the power state given the load it senses.
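In other words, it's a sampling race: SystemInfo reads the clocks at one instant, and whatever power state the card happens to be in at that instant is what gets logged. A toy model of that (the MHz numbers are illustrative, not from any particular card):

```python
# Toy model of the race described above: Futuremark SystemInfo (FSI) samples
# the clocks once; if Afterburner's 2D profile has already dropped the card
# to idle at that moment, the benchmark logs idle/stock clocks even though
# the run itself used the overclock. Numbers below are illustrative only.

IDLE_CLOCK = 300   # MHz, assumed 2D/idle clock
OC_CLOCK = 1190    # MHz, the overclock actually used during the run

def sampled_clock(load_state, profiles_enabled):
    """What FSI would record at sampling time."""
    if profiles_enabled and load_state == "idle":
        return IDLE_CLOCK   # 2D profile kicked in -> misleading reading
    return OC_CLOCK         # clocks held -> reading matches the real OC

# With 2D/3D profiles disabled (and Windows on High Performance),
# the clocks are held, so the sample matches the real overclock:
print(sampled_clock("idle", profiles_enabled=False))  # 1190
```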





I have 4, was thinking of selling one. It's a unique one though, with a silly high IMC/memory IC: 1750 memory speeds.
The 3930 (26 items) · Junior's 3930 (22 items) · DATA/HTPC (20 items)