
[PCPer] Frame Rating: GeForce GTX Titan, GeForce GTX 690, Radeon HD 7990 (HD 7970 CrossFire) - Page 20

post #191 of 297
Thread Starter 
^^
@ Brutuz
no... the 192-bit interface restricts the full usage of 3GB..
post #192 of 297
Quote:
Originally Posted by malmental View Post

^^
@ Brutuz
no... the 192-bit interface restricts the full usage of 3GB..

No, it doesn't... I did some more maths, and here's what happens: nVidia has a 192-bit bus. Nominally, with equally sized memory chips, you'd have 768MB, 1.5GB or 3GB. What nVidia has done is take the normal chips they used and add some double-sized ones to reach 2GB. (Similar to how I'm running 4 sticks of RAM on 2 channels; if I had a quad-channel board and put 5 sticks in, I'd be doing the same thing as the 660 Ti is.)

Diagram of the vRAM for the 660Ti: GK104Memory.png

The 550 Ti is a great example; I suggest you read up on this issue.

If you're saying the GPU isn't strong enough to push those frames... then we're done here. That myth only applies to the GT 640s that come with 2-4GB of vRAM. The 660 Ti is about as fast as (or just behind) an HD 7950 at stock, and the HD 7950 can certainly make use of its 3GB easily.
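If it helps to see the arithmetic, here's a quick back-of-envelope sketch of those layouts (Python, purely illustrative; the per-controller capacities are my assumption of how the chips are split, not anything nVidia has confirmed):

```python
# Back-of-envelope: vRAM layouts on a 192-bit bus (3 x 64-bit memory controllers).
# Per-controller capacities below are assumptions for illustration only.

def total_mb(per_controller_mb):
    """Total vRAM in MB given each 64-bit controller's capacity."""
    return sum(per_controller_mb)

# Symmetric layouts -> every access can interleave across all 3 controllers:
print(total_mb([256, 256, 256]))     # 768 MB
print(total_mb([512, 512, 512]))     # 1536 MB (1.5GB)
print(total_mb([1024, 1024, 1024]))  # 3072 MB (3GB)

# The 2GB 660 Ti: one controller carries double the memory, so only the
# first 1.5GB can interleave across all three controllers; the last
# 512MB sits behind a single 64-bit controller.
print(total_mb([512, 512, 1024]))    # 2048 MB (2GB), asymmetric
```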
post #193 of 297
Thread Starter 
Quote:
Originally Posted by Brutuz View Post

Quote:
Originally Posted by malmental View Post

^^
@ Brutuz
no... the 192-bit interface restricts the full usage of 3GB..
No, it doesn't... I did some more maths, and here's what happens: nVidia has a 192-bit bus. Nominally, with equally sized memory chips, you'd have 768MB, 1.5GB or 3GB. What nVidia has done is take the normal chips they used and add some double-sized ones to reach 2GB. (Similar to how I'm running 4 sticks of RAM on 2 channels; if I had a quad-channel board and put 5 sticks in, I'd be doing the same thing as the 660 Ti is.)

Diagram of the vRAM for the 660Ti: GK104Memory.png

The 550 Ti is a great example; I suggest you read up on this issue.

If you're saying the GPU isn't strong enough to push those frames... then we're done here. That myth only applies to the GT 640s that come with 2-4GB of vRAM. The 660 Ti is about as fast as (or just behind) an HD 7950 at stock, and the HD 7950 can certainly make use of its 3GB easily.
Funny... using your link, one of the first things I read is this.
So whether your math is correct or not, you're off on the reason I said what I said.
Quote:
The best case scenario is always going to be that the entire 192bit bus is in use by interleaving a memory operation across all 3 controllers, giving the card 144GB/sec of memory bandwidth (192bit * 6GHz / 8). But that can only be done at up to 1.5GB of memory; the final 512MB of memory is attached to a single memory controller. This invokes the worst case scenario, where only 1 64-bit memory controller is in use and thereby reducing memory bandwidth to a much more modest 48GB/sec.

How NVIDIA spreads out memory accesses will have a great deal of impact on when we hit these scenarios. In the past we’ve tried to divine how NVIDIA is accomplishing this, but even with the compute capability of CUDA memory appears to be too far abstracted for us to test any specific theories. And because NVIDIA is continuing to label the internal details of their memory bus a competitive advantage, they’re unwilling to share the details of its operation with us. Thus we’re largely dealing with a black box here, one where poking and prodding doesn’t produce much in the way of meaningful results.

As with the GTX 550 Ti, all we can really say at this time is that the performance we get in our benchmarks is the performance we get. Our best guess remains that NVIDIA is interleaving the lower 1.5GB of address while pushing the last 512MB of address space into the larger memory bank, but we don’t have any hard data to back it up. For most users this shouldn’t be a problem (especially since GK104 is so wishy-washy at compute), but it remains that there’s always a downside to an asymmetrical memory design. With any luck one day we’ll find that downside and be able to better understand the GTX 660 Ti’s performance in the process.
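For reference, the bandwidth figures in that quote are just bus width times data rate divided by eight; a quick sanity check (Python, numbers taken from the quote):

```python
def peak_bandwidth_gb_s(bus_width_bits, data_rate_ghz):
    """Peak memory bandwidth in GB/s: bus width (bits) * effective data rate / 8."""
    return bus_width_bits * data_rate_ghz / 8

# GTX 660 Ti with 6GHz-effective GDDR5:
print(peak_bandwidth_gb_s(192, 6))  # 144.0 GB/s - best case, all three controllers interleaved
print(peak_bandwidth_gb_s(64, 6))   # 48.0 GB/s  - worst case, the last 512MB on one controller
```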
post #194 of 297
Quote:
Originally Posted by malmental View Post

Funny... using your link, one of the first things I read is this.
So whether your math is correct or not, you're off on the reason I said what I said.

They are talking about the 2GB card, which is the strange configuration on the 192-bit bus. Theoretically, the 3GB card runs just fine on the 192-bit bus - although no one seems to know for sure how they do the memory management on the 3GB card.
post #195 of 297
Thread Starter 
Testing For Memory Interface Limitations: 1920x1080
Quote:
We’d also like to say a few words about minimum frame rates in this benchmark. Nvidia’s GeForce cards lose the race big time, no matter what anti-aliasing settings we use. Subjectively, a single GeForce GTX 660 Ti under the effects of 8x MSAA is worse than the already-bad CrossFire setup. That's right: Nvidia's stuttering under those settings is more annoying than the micro-stuttering typical of many CrossFire arrays. And this is in spite of our efforts to pick settings that yield playable frame rates on all cards at every setting. We really can’t recommend Nvidia's GeForce GTX 660 Ti if you plan to use 4x or 8x MSAA; even two Radeon HD 7750s are a better choice.

Testing For Memory Interface Limitations: 2560x1440
Quote:
Originally, we wanted to use a multi-monitor setup. But we figured out that just wouldn't make sense after running a few tests. There are some things the GeForce GTX 660 Ti can’t handle, and Surround is one of them. So, we set up our biggest monitor, which, based on the previous page, we suspect will cause the card some trouble anyway. Most people shopping for a $300 graphics card are probably playing at 1920x1080, so 2560x1440 is more of a theoretical exercise anyway.
Quote:
The GeForce GTX 670 holds onto its performance crown at 4x MSAA by one frame per second. However, AMD's Radeon HD 7870 beats the GeForce GTX 660 Ti decisively.
Quote:
At 8x MSAA, Batman: Arkham City is AMD's game. The Radeon HD 7950 and 7870 come in first and second place. Nvidia's GeForce GTX 660 Ti’s minimum frame rates, subjectively speaking, affect this title's experience negatively at these settings.

Bottom line: the GTX 660 Ti is not capable of pushing out 3GB of VRAM...
The 192-bit interface is a detriment at higher resolutions, and high levels of AA are death.
I feel like it's déjà vu around here today...

Edit:
And the 3GB version is slower than the 2GB version...
Quote:
The gap between the 2 and 3 GB versions of Nvidia's GeForce GTX 660 Ti is even larger when we apply 8x MSAA. Lesson learned: spending extra on 3 GB is pointless when capacity isn't the problem.



Brutuz, you need to stick to Radeon, dude...
It's a good thing for everyone that Radeon is better at Folding, but for you specifically it's a good thing,
because you, brother, have no clue about what's going on here...
Edited by malmental - 3/31/13 at 6:26pm
post #196 of 297
Quote:
Originally Posted by malmental View Post

Funny... using your link, one of the first things I read is this.
So whether your math is correct or not, you're off on the reason I said what I said.
...Did you even read it?

A 192-bit card's native RAM (i.e. with an equal number of equally sized chips on each memory controller) is 384MB, 768MB, 1.5GB or 3GB. nVidia doubled up on some of the RAM chips to get 2GB on the 660 Ti, which means that without any interleaving the first 1.5GB runs at full speed but the last 512MB runs at the speed of a single memory controller. A 3GB card would have an equal number of equally sized chips on each controller, so all of it runs at full speed; a 3GB 660 Ti would probably be a little faster than a 2GB one, assuming all else is equal. It just wouldn't make sense in nVidia's lineup, though (it'd have more vRAM than all but the most expensive GTX 680s/670s, or force them to lower those cards' performance by giving them an uneven amount of vRAM to match it).
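To spell out what that would mean in practice, here's a toy sketch of which part of the address space gets the full bus and which part falls back to a single controller, under the same simple mapping AnandTech guesses at (this is an assumption; nVidia hasn't confirmed how the addressing actually works):

```python
# Toy model: effective bus width serving an access at a given offset into vRAM,
# assuming the symmetric portion is fully interleaved and the remainder lives
# entirely on the oversized controller (an assumption, not confirmed by nVidia).

def effective_bus_bits(per_controller_mb, offset_mb):
    """Bus width (bits) assumed to serve an access at offset_mb into vRAM."""
    symmetric_mb = min(per_controller_mb) * len(per_controller_mb)
    if offset_mb < symmetric_mb:
        return 64 * len(per_controller_mb)  # fully interleaved region
    return 64                               # overflow region on one controller

gtx660ti_2gb = [512, 512, 1024]     # asymmetric 2GB layout
gtx660ti_3gb = [1024, 1024, 1024]   # symmetric 3GB layout

print(effective_bus_bits(gtx660ti_2gb, 1000))  # 192 - inside the first 1.5GB
print(effective_bus_bits(gtx660ti_2gb, 1800))  # 64  - in the last 512MB
print(effective_bus_bits(gtx660ti_3gb, 2800))  # 192 - symmetric card is always full width
```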
Quote:
Originally Posted by malmental View Post

Testing For Memory Interface Limitations: 1920x1080
Testing For Memory Interface Limitations: 2560x1440


Bottom line: the GTX 660 Ti is not capable of pushing out 3GB of VRAM...
The 192-bit interface is a detriment at higher resolutions, and high levels of AA are death.
I feel like it's déjà vu around here today...

Edit:
And the 3GB version is slower than the 2GB version...


Brutuz, you need to stick to Radeon, dude...
It's a good thing for everyone that Radeon is better at Folding, but for you specifically it's a good thing,
because you, brother, have no clue about what's going on here...

Firstly, the 660 Ti is memory-interface limited... but the fact remains it's still about as fast as an HD 7950, which can use its extra vRAM. The 660 Ti could use extra vRAM too; it would just still lose performance.
Secondly, running out of vRAM while the core is still hungry means the card is going to be loading from the much slower HDD. The 660 Ti would still be able to pull playable FPS, especially with a good vRAM OC, while benefiting from the extra vRAM. (See Skyrim with lots of texture mods: it doesn't load them all at once, but a lot of HDD/SSD reads are saved by preloading textures into vRAM even when they're not being used yet, then just grabbing them when necessary.)
Thirdly, nVidia will have optimisations in the drivers to make up for 512MB of the 660 Ti's vRAM running at 64-bit bus speeds. Those tricks shouldn't be needed on the Galaxy 3GB card, which has 1024MB on each memory controller, versus the 2GB cards, which have 512MB on two MCs and 1024MB on the third.
And finally, even if nVidia does allow for 3GB 660 Tis without those optimisations (I doubt it), what are the actual RAM chips onboard? What's the betting Galaxy has used cheap chips that run looser timings in order to keep their margins on that card decent?

You're arguing against common sense here... You don't see many 3GB 256-bit cards because it's not easily done at maximum performance: without optimisation (which would hurt performance for a 4GB version of the same card), you're going to get 2GB at full speed and 1GB at half speed.
"Because NVIDIA has disabled a ROP partition on GK104 in order to make the GTX 660 Ti, they’re dropping from a power-of-two 256bit bus to an off-size 192bit bus. Under normal circumstances this means that they’d need to either reduce the amount of memory on the card from 2GB to 1.5GB, or double it to 3GB. The former is undesirable for competitive reasons (AMD has 2GB cards below the 660 Ti and 3GB cards above) not to mention the fact that 1.5GB is too small for a $300 card in 2012. The latter on the other hand incurs the BoM hit as NVIDIA moves from 8 memory chips to 12 memory chips, a scenario that the lower margin GTX 660 Ti can’t as easily absorb, not to mention how silly it would be for a GTX 680 to have less memory than a GTX 660 Ti."

"Of course at a low-level it’s more complex than that. In a symmetrical design with an equal amount of RAM on each controller it’s rather easy to interleave memory operations across all of the controllers, which maximizes performance of the memory subsystem as a whole. However complete interleaving requires that kind of a symmetrical design, which means it’s not quite suitable for use on NVIDIA’s asymmetrical memory designs. Instead NVIDIA must start playing tricks. And when tricks are involved, there’s always a downside.

The best case scenario is always going to be that the entire 192bit bus is in use by interleaving a memory operation across all 3 controllers, giving the card 144GB/sec of memory bandwidth (192bit * 6GHz / 8). But that can only be done at up to 1.5GB of memory; the final 512MB of memory is attached to a single memory controller. This invokes the worst case scenario, where only 1 64-bit memory controller is in use and thereby reducing memory bandwidth to a much more modest 48GB/sec."

Read those quotes, please, before you comment again... nVidia will have done optimisations/tricks to make up for the asymmetrical memory, which is why the 3GB card there won't perform as well: the drivers see it's a 660 Ti and run it through nVidia's 660 Ti codepath.
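Putting the quoted numbers together: if a game's working set spills past the symmetric 1.5GB, a crude weighted average shows how the effective bandwidth could fall off (a rough illustration assuming evenly spread accesses, not a measurement of the real card):

```python
def effective_bandwidth_gb_s(working_set_mb, fast_mb=1536, fast_gb_s=144.0, slow_gb_s=48.0):
    """Crude weighted-average bandwidth, assuming accesses are spread evenly
    over the working set (a big simplification of real access patterns)."""
    if working_set_mb <= fast_mb:
        return fast_gb_s
    fast_fraction = fast_mb / working_set_mb
    return fast_fraction * fast_gb_s + (1 - fast_fraction) * slow_gb_s

print(effective_bandwidth_gb_s(1200))  # 144.0 - fits entirely in the interleaved region
print(effective_bandwidth_gb_s(2048))  # 120.0 - a quarter of accesses hit the slow 512MB
```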
post #197 of 297
Quote:
Originally Posted by Aluc13 View Post

My understanding is that single-card solutions are always better than multi-GPU setups. But this has been known for years, has it not? So I don't quite understand the bickering that goes on. People like their respective brands and will protect those brands. It shouldn't matter what brand we all pick; it's like the console wars. I don't like the 360, so I don't buy a 360. Plain and simple. Let's keep it civil, guys.

A true word of wisdom that will sadly fall on deaf ears.
post #198 of 297
Quote:
Originally Posted by SniperTeamTango View Post

It's convenient that NV people will argue price/performance for the 7950 vs the 660 Ti/670, but when it's Titan vs 7990, hell no.

That's probably because the 7990 and the Titan have completely different selling points. People buy them for different reasons. It makes no sense to compare them on price/perf (beyond just acknowledging what the situation is) when NV already has a card that compares much better to the 7990 and has better price/perf than the Titan. The 690 is what the 7990 should be compared to.
 
post #199 of 297
My 7850 games just fine. No stuttering or frame skipping. I sense user error in most of the issues reported.
post #200 of 297
Quote:
Originally Posted by *ka24e* View Post

My 7850 games just fine. No stuttering or frame skipping. I sense user error in most of the issues reported.

I figured you were being sarcastic, but I'll reply nonetheless... Not user error, but more that the user can't perceive it, IMO. There seems to be a line drawn in the sand between those who can see stuttering/micro-stuttering and those who cannot. Only the die-hard zealots are claiming there isn't an issue at all. The majority of us are questioning how many people can actually see the difference to the point that it affects their gameplay. I personally cannot on single-card setups. I have cards from both camps running on the newest drivers and am seriously unable to see this difference. I'm not claiming there isn't a difference, as it is very clear there is one when you look at the data. I simply cannot see it.

The question is how many are like me and how many can perceive it to the point that it ruins their gaming. I feel it has been blown out of proportion, but I may be in the minority; the ones who can see this difference may in fact be the majority. Either way, AMD does need to address the issue and at least give people a slider that allows adjustment: one that lets people like me keep the higher FPS/worse frame times, and lets others slide it to their desired balance of reduced overall frame rate vs a lower average frame time. Hopefully that is what they decide to do, if it is even possible.
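For what it's worth, the idea behind such a slider is basically frame pacing: hold back frames that finish too early so frame times even out, at the cost of average FPS. A toy sketch of the trade-off (this is just the concept, not AMD's driver logic; render_frame is a hypothetical callback):

```python
import time

def present_paced(render_frame, target_frame_time_s, pacing=1.0):
    """Render one frame, then sleep away part of the remaining frame budget.
    pacing=0.0 -> present immediately (max FPS, uneven frame times);
    pacing=1.0 -> hold frames to the target interval (smoother, lower FPS).
    Toy model of the FPS-vs-frame-time trade-off, not AMD's implementation."""
    start = time.perf_counter()
    frame = render_frame()              # hypothetical game/driver render call
    elapsed = time.perf_counter() - start
    slack = target_frame_time_s - elapsed
    if slack > 0:
        time.sleep(pacing * slack)      # the "slider" scales how much smoothing is applied
    return frame
```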