Overclock.net › Forums › Industry News › Hardware News › [Various] GTX Titan X Maxwell reviews

[Various] GTX Titan X Maxwell reviews - Page 129  

post #1281 of 1345
Quote:
Originally Posted by GorillaSceptre View Post

5820k at the least? You will hardly notice a difference in games, if at all.

As for the PCI-E gens, with current GPUs there won't be a difference in games between 2.0 and 3.0, unless you use some GPU compute app that can take advantage of the extra bandwidth.

If he has the budget to build a new system and go SLI Titans then that would be the obvious choice, but if the choice is between SLI Titans on the 2500k or upgrading his whole build for a single Titan, the 2500k SLI build would destroy the latter.

Again, depending entirely on the resolution he is gaming at...

And there can be a significant performance difference between PCI-e 2.0 x8 and PCI-e 3.0 x16 in SLI as Guru3D's testing showed. It would seem that PCI-e 2.0 x16 (3.0 x8) is the minimum for optimal performance.
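
For anyone wondering why 2.0 x16 and 3.0 x8 come out roughly the same, here's a quick back-of-the-envelope sketch (per-lane rates and encoding overheads are from the PCIe specs; the script itself is just my illustration):

```python
# Effective per-lane PCIe bandwidth in MB/s after line-code overhead
# (8b/10b for gen 1.x/2.0, 128b/130b for gen 3.0).
LANE_MBPS = {
    "1.1": 2500 * 8 / 10 / 8,     # 250 MB/s
    "2.0": 5000 * 8 / 10 / 8,     # 500 MB/s
    "3.0": 8000 * 128 / 130 / 8,  # ~985 MB/s
}

def link_gbps(gen, lanes):
    """One-way bandwidth of a PCIe link in GB/s."""
    return LANE_MBPS[gen] * lanes / 1000

for gen, lanes in [("2.0", 8), ("1.1", 16), ("2.0", 16), ("3.0", 8), ("3.0", 16)]:
    print(f"PCIe {gen} x{lanes}: {link_gbps(gen, lanes):.2f} GB/s")
# 2.0 x8 and 1.1 x16 both land on 4 GB/s;
# 2.0 x16 (8 GB/s) and 3.0 x8 (~7.9 GB/s) are near enough equal.
```

So when a reviewer drops a card to gen 1.1 x16 they're simulating, bandwidth-wise, exactly what a gen 2.0 board running x8 gives you.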
Glorious 4K (23 items)
CPU: Intel Core i7-5930k @ 4.625 GHz
Motherboard: ASUS X99-A USB 3.1
Graphics: EVGA GTX 1080 Ti @ 1950/12200
RAM: Corsair Vengeance LPX 32GB DDR4-2666 C15
Hard Drives: OCZ Agility 3 240GB / Seagate 600 Pro 240GB / WD Black 4TB
Cooling: Corsair H110, Noctua NF-A14 x2, Corsair SP140 x2, Corsair AF120 x1, Corsair SP120 x4, EVGA Titan X Hybrid AIO
OS: Windows 10 Professional x64
Monitor: Sharp Aquos 4K LC-60UD27U
Keyboard: Logitech K520
Power: Corsair AX1200
Case: Corsair Obsidian 450D
Mouse: Logitech Wireless Gaming Mouse G700
Audio: Sound Blaster Z Sound Card, Logitech Z906 Speakers, Razer Chimaera 5.1 Headphones
Other: Sunbeam Rheosmart 6 fan controller
post #1282 of 1345
Quote:
Originally Posted by BigMack70 View Post

Again, depending entirely on the resolution he is gaming at...

And there can be a significant performance difference between PCI-e 2.0 x8 and PCI-e 3.0 x16 in SLI as Guru3D's testing showed. It would seem that PCI-e 2.0 x16 (3.0 x8) is the minimum for optimal performance.

If he's buying TWO Titan X's, I think it's pretty safe to assume he's not gaming at 1080p..

I haven't seen any significant performance hits between the 2 gens; whether a few fps is worth the cost of upgrading his CPU and mobo is entirely up to him.
post #1283 of 1345
my main question is whether an FX 9590 bottlenecks this card straight out of the box?


I am just thinking of price/performance in using an Asrock 970 Performance/9590 and a Titan together in one setup.
    
CPU: Athlon X4 950
Motherboard: Asrock A320M-HDV
Graphics: Asus GT 1030
RAM: 8GB Samsung DDR4 2133
Hard Drives: SanDisk SSD Plus 120GB / Western Digital IntelliPower 1TB
OS: Windows 10 X64 Enterprise LTSB
Monitor: Dell 32 Ultra-Wide IPS Monitor
Case: Rosewill SCM-01
    
post #1284 of 1345
Quote:
Originally Posted by dlee7283 View Post

my main question is whether an FX 9590 bottlenecks this card straight out of the box?


I am just thinking of price/performance in using an Asrock 970 Performance/9590 and a Titan together in one setup.

oh man, now you've opened Pandora's box.. lol
post #1285 of 1345
Quote:
Originally Posted by GorillaSceptre View Post

If he's buying TWO Titan X's, I think it's pretty safe to assume he's not gaming at 1080p..

I haven't seen any significant performance hits between the 2 gens; whether a few fps is worth the cost of upgrading his CPU and mobo is entirely up to him.

It depends on the game, but yes BigMack is right in that Guru3D found that going from PCIe 1.1 x16 (the same as PCIe 2.0 x8) to PCIe 3.0 x16 with 2x 980, FPS increased by 25% in Thief.

[Guru3D chart: Thief, 2x GTX 980 PCIe scaling]

Alien Isolation also showed a 10% increase, but here you could argue at 131 FPS any increase is pretty much superfluous.

[Guru3D chart: Alien Isolation, 2x GTX 980 PCIe scaling]
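
Just to put those percentages in concrete terms (illustrative FPS numbers of my own, not Guru3D's actual bars):

```python
def pct_gain(slow_fps, fast_fps):
    """Percent FPS gained going from the slower link to the faster one."""
    return (fast_fps - slow_fps) / slow_fps * 100

# A 25% gain is the difference between e.g. 48 and 60 FPS --
# i.e. dropping below a 60 Hz refresh vs. holding it.
print(pct_gain(48, 60))  # 25.0
```

At 131 FPS a 10% swing is academic, but at Thief-level frame rates 25% can be the difference between smooth and not.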
post #1286 of 1345
Quote:
Originally Posted by magnek View Post

It depends on the game, but yes BigMack is right in that Guru3D found that going from PCIe 1.1 x16 (the same as PCIe 2.0 x8) to PCIe 3.0 x16 with 2x 980, FPS increased by 25% in Thief.

[Guru3D chart: Thief, 2x GTX 980 PCIe scaling]

Alien Isolation also showed a 10% increase, but here you could argue at 131 FPS any increase is pretty much superfluous.

Straight from their article:

"Concerning SLI, look in-between PCIe Gen 2.0 and 3.0; you are looking at 3 to 5% performance differences. That is still significant, but not such a huge number to the extent that you should even be slightly worried at all. Remember, if after reading this review PCIe Gen 3.0 seems appealing to you then our advice is simple, investing in a faster graphics card will bring you more FPS opposed to upgrading your processor and motherboard. Remember this as well, to achieve Generation 3.0 bandwidth you will need the magic three combo, a 3.0 compatible processor, a 3.0 compatible motherboard and of course a gen 3.0 compatible graphics card"

That pretty much sums up what I've been saying..
post #1287 of 1345
Not sure if this is as relevant, but I'm not sure I buy into this "PCIe/CPU is king" stuff myself.

Cut from another post I made:
So I was trying to test whether AMD 290x triple 4K Eyefinity could compare to Nvidia triple 4K Surround. I hadn't seen much of anything to compare against until I saw the Titan X review thread. Apparently TPU did a 4K Surround benchmark; simple as it may be, it's something I can try to compare with. I've only run one test so far and it has me wondering what to think. The Titan X is obviously a fast card and meant to be. However, the GTX 980 is somewhere around a 290x? This isn't an apples-to-apples comparison but it's something tangible.

Source for TPU's bench

CPU: Intel Core i7 5820K processor w/Corsair H110 cooler
GIGABYTE X99 Gaming G1 Wi-Fi
16GB Corsair Vengeance 2666MHz DDR4
Windows 7 Ultimate x64




My setup
EVGA SR2 - PCIE 2.0
R9 290x Reference x2 CF
AMD 15.3 beta drivers
Intel Xeon X5650 OC 4.0/4.4
24GB DDR3, @1600mhz
Win7 Pro 64bit




My setup is considerably older, but the resolution should make it more GPU dependent. Even so, there should be a small percentage increase from PCIe 3.0, and possibly from the newer, faster CPU/RAM/platform. I could run the CPUs at stock but I doubt it will make much difference at that resolution. I don't think I missed anything, but if I did, correct me. Anyways, enjoy it for what it is.
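
If anyone wants to line their own runs up against TPU's, normalizing to a percentage of the reference run is the easy way (sketch with placeholder FPS values, not TPU's data):

```python
def relative_pct(my_fps, reference_fps):
    """My result expressed as a percentage of the reference system's result."""
    return my_fps / reference_fps * 100

# Placeholder numbers: if a 290x CF run got 54 FPS where the reference
# Titan X run got 60 FPS, the CF setup sits at 90% of the reference.
print(f"{relative_pct(54, 60):.0f}%")
```

That way the older CPU/RAM/platform and the different resolutions all wash out into a single comparable number per game.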
post #1288 of 1345
Quote:
Originally Posted by GorillaSceptre View Post

Straight from their article:

"Concerning SLI, look in-between PCIe Gen 2.0 and 3.0; you are looking at 3 to 5% performance differences. That is still significant, but not such a huge number to the extent that you should even be slightly worried at all. Remember, if after reading this review PCIe Gen 3.0 seems appealing to you then our advice is simple, investing in a faster graphics card will bring you more FPS opposed to upgrading your processor and motherboard. Remember this as well, to achieve Generation 3.0 bandwidth you will need the magic three combo, a 3.0 compatible processor, a 3.0 compatible motherboard and of course a gen 3.0 compatible graphics card"

That pretty much sums up what I've been saying..

That's gen 2 x16 Guru3D is referring to. Here we're talking about gen 2 x8, which is equivalent to gen 1.1 x16, and the difference is most definitely larger than 3-5% -- it's actually 25% for that particular game.

In any case, my point is simply that for "most games" you "will probably" be fine. But if you want to cover those edge cases and worst case scenarios, then gen 2 x8 simply won't cut it anymore.
post #1289 of 1345
Quote:
Originally Posted by magnek View Post

That's gen 2 x16 Guru3D is referring to. Here we're talking about gen 2 x8, which is equivalent to gen 1.1 x16, and the difference is most definitely larger than 3-5% -- it's actually 25% for that particular game.

In any case, my point is simply that for "most games" you "will probably" be fine. But if you want to cover those edge cases and worst case scenarios, then gen 2 x8 simply won't cut it anymore.

I missed that. Not denying there are differences, but you also can't directly say that gen 1.1 x16 = gen 2.0 x8; the per-lane bandwidth has doubled, and there are differences in latency, which may skew the results in some games. In any case, I would pick a 2500k gen 2 system with 2 Titans any day. But if he can afford to build a new system and go SLI Titans, then that's the best case.
post #1290 of 1345
Quote:
Originally Posted by magnek View Post

That's gen 2 x16 Guru3D is referring to. Here we're talking about gen 2 x8, which is equivalent to gen 1.1 x16, and the difference is most definitely larger than 3-5% -- it's actually 25% for that particular game.

In any case, my point is simply that for "most games" you "will probably" be fine. But if you want to cover those edge cases and worst case scenarios, then gen 2 x8 simply won't cut it anymore.

lol. It's an edge case and the performance is still above refresh rate... I would not buy 2 Titan X's on a 2500k, but then again I would not get two Titans to begin with.

CF 290s with a 6-core Haswell, a new case and mobo, an SSD, a full watercooling loop, and some pocket change is $2000 better used than two Titan X's that will be obsolete in a year when 16nm and HBM hit the market.
This thread is locked  