Originally Posted by Blameless
I've been blaming PowerTune for performance issues since the release of the 6000 series. It's one of the most annoying features imaginable.
Throttling is bad, TDP metering is bad.
Build parts with robust VRMs, and spend more on coolers, thanks. I'll happily pay the extra 10 dollars it adds to the BoM.
AMD has a better reference VRM than Nvidia. It's just that AMD reused the HD 7970 GHz Edition VRM for the R9 290 series, and while the two cards have similar PCIe-rated TDPs, they don't have the same actual power consumption, heat output, or heat density (per mm²).
The AMD VRM is on par with the Galaxy GTX 780 HOF's. That's why the AMD reference 7970 isn't all that bad.
TDP metering is fine as long as it isn't the only limit; the power draw in watts determines the heat load. AMD's idea of using fan speed as another limit was a good one, it's just that the cooler is crap.
Honestly, I think nobody cares about fan % or RPM, just dB (and sound characteristics) at load, plus temps. If you know the tolerable dB of a non-reference cooler, extrapolate the corresponding fan speed and set it as the max fan speed (along with a thermal limit); then PowerTune is a great idea. The problem with using % fan speed is that manufacturers use fans with different speed ranges, so one may max out at 4000 RPM and another at 6000 RPM; at the same 40% setting they would spin at 1600 and 2400 RPM, a difference of 800 RPM.
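The % vs. RPM mismatch above can be sketched with a quick calculation. This is a minimal illustration assuming an idealized linear duty-to-speed response (real fans are only roughly linear, and the fan models here are hypothetical):

```python
# Two hypothetical fans with different maximum speeds, as in the post:
# the same PWM duty percentage maps to very different rotational speeds.

def duty_to_rpm(duty_pct: float, max_rpm: int) -> float:
    """Approximate RPM for a PWM duty cycle, assuming a linear response."""
    return max_rpm * duty_pct / 100.0

FAN_A_MAX = 4000  # hypothetical fan A maximum RPM
FAN_B_MAX = 6000  # hypothetical fan B maximum RPM

for duty in (40, 70, 100):
    rpm_a = duty_to_rpm(duty, FAN_A_MAX)
    rpm_b = duty_to_rpm(duty, FAN_B_MAX)
    print(f"{duty:3d}% duty: fan A {rpm_a:.0f} RPM, fan B {rpm_b:.0f} RPM, "
          f"difference {rpm_b - rpm_a:.0f} RPM")
# At 40% duty this gives 1600 vs. 2400 RPM, the 800 RPM gap the post describes.
```

This is why capping the fan by absolute RPM (derived from a measured dB ceiling) is more consistent across coolers than capping by percentage.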
The problem is that the R9 290 series is inherently a hotter chip due to its higher power use (a GTX 780 draws ~250 W max at stock). It also has less die area, and the stock cooler is massively underpowered for that heat load. Only aftermarket cooling brings out the best in it.
Also, vapor chambers only spread heat; they don't dissipate it. Heatpipes, on the other hand, are massively effective, since they incorporate phase-change cooling and increase the surface area the way fins do. A vapor chamber is like a flattened heatpipe that transfers heat from the die to the fin stack and to the heatpipes (if the design isn't using direct-contact heatpipes).
We can expect aftermarket-cooled R9 290 cards to be awesome (below 80°C even under FurMark), provided the partners don't jack up prices or use cheap VRMs (XFX...).
The official explanation for the numbers we reported in our R9 290X and GeForce GTX 780 Ti reviews is that the Radeon R9 290X uses PWM to control fan speed. The fan it uses purportedly has a ±300 RPM margin, so 40% does not necessarily mean the same rotational speed on every card. But AMD claims it was not prepared for variance as severe as what we saw.
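The ±300 RPM tolerance quoted above implies a 600 RPM-wide band of possible speeds for cards set to the same duty cycle. A minimal sketch of that arithmetic (the 2200 RPM nominal figure is taken from the driver-update test quoted in this thread; the tolerance value is AMD's claimed spec, not a measurement):

```python
# Compute the speed range implied by a symmetric RPM tolerance around a
# nominal fan speed, as with the ±300 RPM margin AMD cited.

def rpm_band(nominal_rpm: float, tolerance_rpm: float = 300.0) -> tuple:
    """Return the (low, high) RPM range for a symmetric tolerance."""
    return (nominal_rpm - tolerance_rpm, nominal_rpm + tolerance_rpm)

low, high = rpm_band(2200)
print(f"Nominal 2200 RPM with +/-300 RPM margin: {low:.0f}-{high:.0f} RPM")
# Two cards at the same duty cycle could differ by up to 600 RPM,
# which is audible and affects PowerTune's thermal headroom.
```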
With the exception of the press card shooting up about 60 RPM for no clear reason, AMD's new driver successfully gets all three cards running around 2200 RPM. I don't have the same mic setup as Igor here in my lab, so I can't give you a good noise comparison. However, the card is noticeably louder than it was at 2050 RPM.
R9 280X cooler on R9 290
Update: Based on your feedback, I took the IceQ X2 cooler off the HIS Radeon R9 280X and mounted it on our R9 290 sample. Cooling improved dramatically: the FurMark stress test maxed out at 76°C, and the card never exceeded 63°C in Crysis 3 or Battlefield 4. So, as expected, it seems the board partners will be able to solve the reference card's heat issues.
Arctic Accelero Xtreme III at 7 V = ~80°C; see http://www.computerbase.de/artikel/grafikkarten/2013/amd-radeon-r9-290-im-test/10/
Stock clocks and fan speeds:
85°C in Heaven on the stock cooler
55°C in Heaven on the MK-26 cooler
Overclocked to 1200 MHz core + 1.4 V:
stock cooler: 85°C at 100% fan speed
MK-26 cooler: 72°C with silent fans
Gelid Icy Vision Rev 2 (~$40)
Max temp in 3DMark 11: 66°C
It's knocked ~40°C off load temps on my 290X, and the fans are inaudible now.
I installed the EK copper waterblock on my HIS 290X yesterday and tested the overclock:
- on air, at 100% fan and stock volts, the card could run every benchmark at 1130/1650 MHz (Elpida memory modules). Games were stable at 1100/6400 MHz.
- under water:
- stock volts: 1130/1650 MHz. No overclocking improvement, but what silence!
- 1410 mV: 1260/1650 MHz, stable in every benchmark, but I have several remarks:
- I suspect the GPU Tweak voltage reading is inaccurate. I check real-time temperatures, voltages, and frequencies on the Logitech G19 display via AIDA64: at a set 1410 mV, the actual GPU voltage is around 1.28-1.29 V and fluctuates depending on the displayed scene. I think the AIDA64 voltages are accurate, since GPU and VRM temperatures are not very high: 43°C and 62°C respectively in 3DMark Vantage.
- at 1260 MHz, whatever the RAM frequency, the display is corrupted rather than artifacted: it fluctuates between normal and blurred. If I lower the GPU frequency to 1230 MHz, I have the same problem. I tested every benchmark and Crysis 3: characters are blurry and unreadable, as if drawn at 640x480. If I reboot the computer, the image is crisp again. I also tested every benchmark and Crysis 3 without overclocking: no problem. I moved the driver from 13.11 beta 6 to beta 7, with no change. Since changing the GPU frequency does not solve the problem, I assume the memory or the memory controller is responsible. Would it be possible to increase the memory voltage? Otherwise, does anyone have an idea?
I flashed my HIS card with the ASUS BIOS without any problems, which is strange since my card has Elpida modules whereas the ASUS cards are said to use Hynix memory.
http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/1530#post_21081290
Edited by AlphaC - 11/8/13 at 11:53am
Got a 3DMark 11 Performance run in. I need more voltage! This ASUS ROM maxes out at 1410 mV.
This score was done at 1228/1568 (6276) MHz at 1.4 V; temps maxed out at 47°C. The CPU is an 8350 at 4.9 GHz.