Overclock.net › Forums › Industry News › Rumors and Unconfirmed Articles › [videocardz] AMD to launch 300W GPU with High-Bandwidth-Memory

[videocardz] AMD to launch 300W GPU with High-Bandwidth-Memory - Page 13

post #121 of 772
Quote:
Originally Posted by Chargeit View Post

Once again, heat and noise levels.

doh.gif

Buy a mid range
post #122 of 772
Could it be that the 300 W figure comes from a 28 nm 380X, while the 390X could still be 20 nm?

I run an R9 290 water-cooled with a full Kryographics plate, front and back.
This has given me superbly low temps across the board, never reaching 70 °C on the GPU or 75 °C on the VRMs
at 1200/1600 clocks, and that in a system with an FX-8350 overclocked to 4.4 GHz for 24/7 use,
cooled by an XSPC D5 Photon AX 240 plus a 140 mm rad.

Long story short... 300 W doesn't scare me one bit.

I bet an R9 290/290X at 1200/1600 draws over 400 W.
Edited by Levys - 1/13/15 at 10:06am
post #124 of 772
Quote:
Originally Posted by sugarhell View Post

Buy a mid range

Look at an overclocked GTX 980 --- 280 W in a torture test, yet it pulls about 180 W in gaming, which is pretty impressive.

The 7870 was also very impressive at launch: around 110 W in gaming...

Before you pass final judgment on a product, wait for the release.
Edited by Themisseble - 1/13/15 at 10:07am
post #125 of 772
Quote:
Originally Posted by EniGma1987 View Post

It is interesting to see Nvidia's strategy: to reduce the need for high bandwidth as much as possible while optimizing efficiency all around. Then to see AMD's strategy: brute force it all through the highest bandwidth capable memory ever and design the PCB and other components to handle even more heat to keep pushing forward. Once again completely opposite directions of design.

Complete opposite of the early days of Fermi too when Nvidia was the one being criticized for the nuclear power plants known as the 480 and 470. At this point, they should consider releasing high end GPUs with a cord that plugs directly into the wall.
post #126 of 772
Quote:
Originally Posted by PureBlackFire View Post

TDP is not a strict measure of power consumption; it is more related to heat (though the two are linked). AMD has, for the past three generations, labelled its flagship GPU with a 250 W+ TDP, but the fact is the 6970 doesn't consume anywhere close to that: its power consumption is only a few watts more than Nvidia's GTX 560 Ti, both well under 200 W. Nvidia, in the past (before Kepler/Maxwell), was very generous, to say the least, with how it labelled its cards' TDPs, with the higher-end models' actual gaming power consumption exceeding the rated TDP. I said "pretty much" because, in typical fashion, things progress (we can hope) and this is no longer true, but we cannot ignore recent history altogether. The 7970, also with its 250 W TDP, consumes under 200 W, only around 15 W more than the GTX 680. Now, the GE (GHz Edition) is another story: a small overclock bumps heat and power consumption by a relatively fat margin compared to the increase in clocks and performance, in my opinion. This is where AMD is stuck right now in the GPU market (GCN): they have a pretty big efficiency problem that, as of now, is just getting worse. The point was that TDP ≠ power consumption, and AMD very often estimates high while Nvidia more often estimates low. As MadRabbit pointed out, you get an idea if you look at power consumption vs. TDP on the GTX 970. The HD 7950 has a TDP of 200 W, the HD 7870 was 175 W, the 7850 130 W, etc. The 970's stock power consumption is around the same as the 7950's, yet it has a TDP lower than both the 7870 and the 7950. The 7850 consumes around 86 W at load with a 130 W TDP. Even within Maxwell and Kepler, it is rare to see that kind of power-consumption-vs-rated-TDP gap on Nvidia GPUs.
OK... I think (if I have it straight as you explained it). Right after I posted, I did consider the difference between power drawn and heat dissipated, both of which are measured in watts. Now, since the heat dissipated is "wasted energy", then, as you said yourself, if AMD is having an efficiency problem, wouldn't it be entirely reasonable for their TDPs to be higher than Nvidia's?
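The TDP-vs-draw gap being discussed is easy to put in numbers. A quick sketch using the 7850 figures quoted above; the GTX 970 draw figure here is an assumption for illustration, not a measured value:

```python
# Measured gaming draw vs. rated TDP for a couple of cards.
# HD 7850 figures come from the quoted post; the GTX 970 draw
# value is assumed for illustration only.
cards = {
    "HD 7850": {"draw_w": 86.0,  "tdp_w": 130.0},
    "GTX 970": {"draw_w": 160.0, "tdp_w": 145.0},  # draw assumed
}

for name, c in cards.items():
    ratio = c["draw_w"] / c["tdp_w"]
    print(f"{name}: drawing {ratio:.0%} of rated TDP")
```

Anything well under 100% means the vendor rated the TDP conservatively; over 100% means actual draw exceeds the label.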


Quote:
Originally Posted by Imouto View Post

What about reading the testing methodology?
Quote:
Originally Posted by TPU 
we measure the power consumption of only the graphics card via the PCI-Express power connector(s) and PCI-Express bus slot. A Keithley Integra 2700 digital multimeter with 6.5-digit resolution is used for all measurements. Again, the values here only reflect the card's power consumption as measured at its DC inputs, not that of the whole system.

Is there something wrong with isolating the power draw of just the card? That would be more accurate than measuring the whole system, no?
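Card-only measurement like TPU's just sums the DC power delivered at each of the card's inputs (the PCIe slot rails plus each external PCIe connector) and ignores the rest of the system. A minimal sketch; the rail names and readings here are made-up illustrative values, not TPU data:

```python
# Card-only power: sum V * I over every DC input feeding the card
# (PCIe slot 12 V / 3.3 V rails plus each external PCIe connector).
# Values below are hypothetical, for illustration only.
readings = {              # (volts, amps) per input
    "slot_12v":  (12.0, 5.2),
    "slot_3v3":  (3.3,  0.9),
    "pcie_8pin": (12.1, 10.4),
}

card_watts = sum(v * a for v, a in readings.values())
print(f"card-only draw: {card_watts:.1f} W")
```

Because nothing from the CPU, drives, or PSU inefficiency enters the sum, this is exactly the isolation the post is asking about.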
post #127 of 772
Quote:
Originally Posted by Nvidia Fanboy View Post

Complete opposite of the early days of Fermi too when Nvidia was the one being criticized for the nuclear power plants known as the 480 and 470. At this point, they should consider releasing high end GPUs with a cord that plugs directly into the wall.

Yep, but that was only after it was benchmarked...

If the "R9 390X" is only 25% faster than the GTX 980 and draws over 250 W in gaming, then... yep...

The GTX 980 needs about 180 W. But if the R9 390X is 50% faster, then 250 W in gaming would be reasonable; it would actually have better efficiency than the GTX 980.
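That efficiency claim checks out with simple performance-per-watt arithmetic, using the 180 W and 250 W figures from this post and normalizing the GTX 980's performance to 1.0:

```python
# Performance per watt, with GTX 980 performance normalized to 1.0.
# Wattages are the figures cited in the post; the 50% uplift is
# the hypothetical scenario being discussed.
gtx980_perf, gtx980_watts = 1.0, 180.0
r390x_perf, r390x_watts = 1.5, 250.0   # "50% faster" scenario

eff_980 = gtx980_perf / gtx980_watts
eff_390x = r390x_perf / r390x_watts

print(f"390X efficiency relative to 980: {eff_390x / eff_980:.0%}")
```

A 50% performance lead at 250 W works out to roughly 8% better perf-per-watt than the 980 at 180 W, so the post's conclusion holds under its own assumptions.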

Also, Nvidia's next GPU will likely be similar: around 250 W TDP for a 40-50% boost.
post #128 of 772
There is something I don't get. Why would their mid-range card pull 300 W? We are talking about the 380X, right? Where does that leave the 390X? A 400 W monster? And if this is their high-end GPU, why compare it to Nvidia's mid-range 980? GM200 will likely pull over 250 W too, so for those whining, I hope you are not waiting for that card either. The 290X with a custom cooler did just fine, so I would not be worried; all that matters is the performance.
post #129 of 772
Did I miss the release date??
post #130 of 772
God I am soooooo sick of 28nm!!!