
[TT] NVIDIA should launch its next-gen Pascal GPUs with HBM2 in 2H 2016 - Page 19

post #181 of 724
Quote:
Originally Posted by criminal View Post

Great video. Thanks for posting.

Another reason it's in your best interest to back AMD. I'm not saying they haven't made poor decisions that put them where they are, but giving Nvidia your money because they put out a few more FPS (which this video shows in a lot of games isn't even real) is bad for you. The little guy is going to be the one that innovates. Why? Because they have to. Take AMD's response to GameWorks: completely open source.

Take a look at HBM, a game changer for GPUs. Has Nvidia been working on it? You bet they have. Have they released a card with HBM? Nope, because they haven't needed to. They've basically been sitting on it till they need a boost to put their cards past AMD. This is why these best-performance-per-$$$ videos all around YouTube are bad. It would be the equivalent of deciding whether to buy the next Battlefield or CoD based solely on which plays at a higher framerate on your PC.
post #182 of 724
Quote:
Originally Posted by magnek View Post

I wouldn't say nVidia "can't", more like they have 0 financial incentive to do so, and thus "won't".

If AMD pulled a miracle and solved their DX11 CPU overhead issue overnight, and made the 290X on par with the 980 at every resolution (but especially 1080p), you bet your ass nVidia would scramble night and day to tweak the last iota of performance out of Maxwell AND Kepler.

Maxwell certainly; I doubt Kepler would get as much love, at least straight away. But you may well be right.

It's still a valid point, I think, to ask what will happen to GCN in the months after Polaris' release.
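
As an aside, the "DX11 CPU overhead issue" magnek mentions is easy to make concrete with a toy model. Everything below is hypothetical Python with invented numbers; it just shows how per-draw-call driver cost caps framerate when the GPU isn't the bottleneck:

```python
# Toy model: frame time is whichever is slower, the GPU rendering the frame
# or the CPU pushing its draw calls through the driver. Numbers are invented.
def fps(draw_calls, driver_cost_per_call_us, gpu_frame_ms):
    cpu_ms = draw_calls * driver_cost_per_call_us / 1000.0
    return 1000.0 / max(cpu_ms, gpu_frame_ms)

for label, cost_us in [("slow driver", 4.0), ("fast driver", 2.0)]:
    print(label,
          "| 1080p:", round(fps(5000, cost_us, 8.0), 1), "fps",   # light GPU load -> CPU-bound
          "| 4K:",    round(fps(5000, cost_us, 25.0), 1), "fps")  # heavy GPU load -> GPU-bound
```

Halving the driver cost doubles the 1080p result and changes nothing at 4K, which is roughly the shape of the 290X-vs-980 situation.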
post #183 of 724
Quote:
Originally Posted by Slink3Slyde View Post

Well, technically Kepler didn't lose any performance; we just didn't get included as much in the game optimizations. Although it did suck that it happened in games literally the week after Maxwell was released. I looked into it quite a bit at the time with numbers from TPU's bench suite: the reference 970 was 10% ahead of the 780 in older games and 20% ahead in the newer ones, with a roughly even split between games released before and after Maxwell. Apparently they've done something about it since, but it's hard to tell. We're talking about 5-10% performance tweaking here, and they're not breaking their necks getting Kepler cards up to date for the newest games, going by what I've seen recently. That's not to say they're unsupported at all.

Kepler performance relative to the competition fell. At the GTX 980 / GTX 970 launch the 780 competed with the 290, and the 780 OC was on par with the R9 290 OC.

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_980/26.html

Today the 290 is clearly the superior card.

http://www.techpowerup.com/reviews/Sapphire/R9_390_Nitro/23.html
Quote:
It would also, though, be fair to wonder, with AMD's more limited resources and the time between their driver releases as it is: how are all the GCN cards going to fare after Polaris' release? Can they afford to support two architectures fully if Nvidia can't?

Polaris is 4th-generation GCN. So even though there are architectural enhancements to improve shader efficiency, it's still based on the GCN architecture. By the way, it's not that Nvidia cannot support two architectures; Nvidia has in the past supported its GPU architectures for longer periods than AMD (e.g. Fermi). But Nvidia has now chosen to push sales of its latest gen by neglecting driver improvements for the previous gen. GameWorks is part of that strategy, as Nvidia's previous-gen cards get hammered in GameWorks titles compared to the latest gen.
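
For what it's worth, the before/after comparison Slink3Slyde describes is simple to redo yourself. A minimal Python sketch; the game names and FPS numbers below are made-up placeholders, not figures from the linked reviews:

```python
# Average lead of one card over another across two sets of games
# (released before and after Maxwell). All FPS values are hypothetical.
pre_maxwell  = {"GameA": (62.0, 55.0), "GameB": (70.0, 64.0)}  # (970 fps, 780 fps)
post_maxwell = {"GameC": (58.0, 48.0), "GameD": (66.0, 55.0)}

def avg_lead(games):
    """Mean percentage lead of the first card over the second."""
    leads = [(a / b - 1.0) * 100.0 for a, b in games.values()]
    return sum(leads) / len(leads)

print(f"970 lead, pre-Maxwell games:  {avg_lead(pre_maxwell):.1f}%")   # ~11%
print(f"970 lead, post-Maxwell games: {avg_lead(post_maxwell):.1f}%")  # ~20%
```

Run over TPU's real per-game tables, a widening gap like that is what the "neglected optimization" claim rests on.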
post #184 of 724
Quote:
Originally Posted by Bryst View Post

Another reason it's in your best interest to back AMD.

AMD is a bad idea. Mantle is a huge fail. I was all aboard the Mantle Train and the performance gains. Then nobody was using it except DAI and BF. Why? Because Nvidia PAYS $$$ for games to use GameWorks. AMD is broke and can't afford the bribes. If AMD forked over the moolah for devs to implement Mantle into their games, then it would be worth it. Therefore we are stuck with GameWorks. Why GIMP yourself with an AMD card during the GameWorks era?
post #185 of 724
Quote:
Originally Posted by gamervivek View Post

Pascal is just a Maxwell die shrink with HBM + compute improvements, certainly not a bigger change than Kepler to Maxwell. Heck, Pascal didn't even exist a couple of years back.

It is quite the opposite. Pascal is a total redesign of the chip architecture, since NV is targeting an emerging market with it that is much bigger than graphics. Think about self-driving cars, robots, vision, and artificial intelligence based on a recent breakthrough called deep learning. To achieve this, Pascal has very advanced support for floating-point arithmetic in 64-bit, 32-bit, and 16-bit. This is likely a reconfigurable architecture with graphics processing as a special case. For non-specialists, the sign of how much Pascal will differ from Maxwell is that in Pascal 64-bit arithmetic will have a prominent role, like in earlier generations of chips, and new short 16-bit arithmetic will be added.
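
For the curious, the trade-off behind that 16/32/64-bit support is easy to demonstrate on the CPU. A quick NumPy sketch, nothing Pascal-specific, with an arbitrarily chosen value:

```python
# Illustrating the float16 / float32 / float64 trade-off with plain NumPy:
# halving the width halves memory per value but costs precision.
import numpy as np

x = 1.0001  # a value binary16 cannot distinguish from 1.0
for dtype in (np.float64, np.float32, np.float16):
    v = dtype(x)
    print(f"{np.dtype(dtype).name}: {v!r} ({np.dtype(dtype).itemsize} bytes per value)")

# float16 rounds 1.0001 to 1.0: tolerable for deep-learning weights and
# activations, useless for double-precision HPC work.
```

Half the storage also means twice the values moved per unit of memory bandwidth, which is exactly why FP16 is attractive for deep learning.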
post #186 of 724
Quote:
Originally Posted by BeerPowered View Post

AMD is a bad idea. Mantle is a huge fail. I was all aboard the Mantle Train and the performance gains. Then nobody was using it except DAI and BF. Why? Because Nvidia PAYS $$$ for games to use GameWorks. AMD is broke and can't afford the bribes. If AMD forked over the moolah for devs to implement Mantle into their games, then it would be worth it. Therefore we are stuck with GameWorks. Why GIMP yourself with an AMD card during the GameWorks era?

Someone with your post count should be more educated about what Mantle did.

Unless you mean that Mantle failed to remain a proprietary AMD benefit, in which case yeah, I see your point.
post #187 of 724
Quote:
Originally Posted by raghu78 View Post

Kepler performance relative to the competition fell. At the GTX 980 / GTX 970 launch the 780 competed with the 290, and the 780 OC was on par with the R9 290 OC.

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_980/26.html

Today the 290 is clearly the superior card.

http://www.techpowerup.com/reviews/Sapphire/R9_390_Nitro/23.html
Polaris is 4th-generation GCN. So even though there are architectural enhancements to improve shader efficiency, it's still based on the GCN architecture. By the way, it's not that Nvidia cannot support two architectures; Nvidia has in the past supported its GPU architectures for longer periods than AMD (e.g. Fermi). But Nvidia has now chosen to push sales of its latest gen by neglecting driver improvements for the previous gen. GameWorks is part of that strategy, as Nvidia's previous-gen cards get hammered in GameWorks titles compared to the latest gen.

You can't blame Nvidia for AMD improving their performance, and you know it.

Some people make out that they were deliberately gimped in drivers and lost performance. They weren't; they were simply neglected in optimization for newer games in favour of newer cards. From what I can tell, anyway. There are plenty of deniers about who will say nothing at all went on, but I still believe that's what happened, from analyzing review numbers.

It's hard to say with Fermi; it was so far behind little old GK104 that 5-10% probably went unnoticed, tbh. I remember the 580 performing a bit behind a 670, or around old 7950 non-boost levels, a few years ago. Now I believe it's somewhere around a GTX 660 on a good day, so there's that.

I didn't realize Polaris was still an iteration of GCN; I guess that means they probably won't have that PR problem then. It's going to be a great year for video cards!
post #188 of 724
Quote:
Originally Posted by raghu78 View Post

Kepler performance relative to the competition fell. At the GTX 980 / GTX 970 launch the 780 competed with the 290, and the 780 OC was on par with the R9 290 OC.

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_980/26.html

Today the 290 is clearly the superior card.

http://www.techpowerup.com/reviews/Sapphire/R9_390_Nitro/23.html
Polaris is 4th-generation GCN. So even though there are architectural enhancements to improve shader efficiency, it's still based on the GCN architecture. By the way, it's not that Nvidia cannot support two architectures; Nvidia has in the past supported its GPU architectures for longer periods than AMD (e.g. Fermi). But Nvidia has now chosen to push sales of its latest gen by neglecting driver improvements for the previous gen. GameWorks is part of that strategy, as Nvidia's previous-gen cards get hammered in GameWorks titles compared to the latest gen.

I think this actually highlights one of AMD's most notorious issues perfectly: terrible launch drivers that leave so much performance on the table. Imagine if AMD had launched Hawaii with the drivers they have now; instead of just trading blows with the original Titan and losing to the 780 Ti, it would have traded blows with the 780 Ti and prevented nVidia from declaring they'd retaken the performance crown. Yeah, sure, those filthy miner scum would still have caused issues, but it would at least have been a really helpful PR boost for AMD.
post #189 of 724
Quote:
Originally Posted by BeerPowered View Post

AMD is a bad idea. Mantle is a huge fail. I was all aboard the Mantle Train and the performance gains. Then nobody was using it except DAI and BF. Why? Because Nvidia PAYS $$$ for games to use GameWorks. AMD is broke and can't afford the bribes. If AMD forked over the moolah for devs to implement Mantle into their games, then it would be worth it. Therefore we are stuck with GameWorks. Why GIMP yourself with an AMD card during the GameWorks era?

You're only arguing my point. GPU manufacturers shouldn't need to bribe devs to use stuff like Mantle/GameWorks. If it helps their game perform or look better, they should WANT to use it.
post #190 of 724
Would you say the Fury X is superior to the 980 Ti if it manages to beat it in 2017?
Quote:
Originally Posted by Bryst View Post

You're only arguing my point. GPU manufacturers shouldn't need to bribe devs to use stuff like Mantle/GameWorks. If it helps their game perform or look better, they should WANT to use it.
whaaat