Overclock.net › Forums › Graphics Cards › NVIDIA › GTX 780 or GTX 660TI in SLI

GTX 780 or GTX 660TI in SLI - Page 4

Poll results: 660 Ti SLI, GTX 780, or keep my 570 SLI setup?

  • 660 Ti SLI - 20% (27 votes)
  • GTX 780 - 66% (88 votes)
  • Keep my 570 SLI setup - 12% (17 votes)

132 total votes
post #31 of 132
Or you can get two used 660 Tis for $220ish, pocket the rest. biggrin.gif
post #32 of 132
Man, no doubt the new Kepler is better, but it costs more for the same or lower performance. If you intend to buy another card down the line, go for the 700 series; if you're tight on budget and need the performance now, go for the 660 Tis. I have two ASUS 660 Tis, and once I had them set up they went through games and benchmarks like a machete through butter: no issues, no stuttering. (Well, I had it in one game, but it was easily fixed with adaptive v-sync, or v-sync plus triple buffering.) Honestly, if you're not clued up and don't know how to set up your cards, it's not SLI's fault; you just need to tweak your global 3D settings, and once you do that you're good to go. SLI is not the "crap" it used to be. I can't vouch for other cards, but in my personal experience the 660 Ti scales very, very well. Not to mention you have tools like NVIDIA Inspector or even GeForce Experience, made for all gamers alike: easy to use, and they help you tweak settings for each game you're playing so you avoid unwanted problems. Those problems are not only SLI-related; every time I read about one, everyone wants to blame SLI, when it's really some unoptimized AA mode, or ubersampling, or a particular map that eats GPU power even on the most extreme single GPUs, or driver issues, or the state of the game's development. I tested my 660 Tis on a number of popular titles and benchmarks, and without even overclocking them I got the same benchmark scores as a Titan and better frames at 1080p in most games. I feel bad for the people who bought SLI, past or present, and had a bad experience; for me, SLI is way better and much, much cheaper.

Conclusion: if you're a lazy guy with tons of cash to spare and want to future-proof your rig by buying a second card down the road, slap that 700-series card in the motherboard, no questions asked (though it can still have issues with some games, like every other card).

If you have spare time, love to tweak, are on a budget, and want the same performance or better, with only minor tweaking needed to reach roughly 100% usage/scaling, and you still want a decently future-proof rig, then the 660 Tis are just that good.

For people who complain about low GPU usage with SLI:


Simple facts and solutions - global 3D settings:

- DRIVERS: clean out the old ones and install the latest, or whichever version works best for what you're running.

- Change your SLI rendering mode (force alternate frame rendering 2, or another mode depending on the setup) and check what works best for you.

- Disable v-sync when you're benchmarking or testing; usage will run at 100% or close to it.

- The rough facts about v-sync and why your SLI (or single-GPU) usage isn't 100%: say you're running high settings and your output is 60 fps, but it keeps dropping and you still see low GPU usage. Why? With plain double-buffered v-sync, the moment the game drops below 60 fps the frame rate snaps to the next divisor of the refresh rate, which is 30 (even if the GPU could do a bit more), and if it can't hold 30 it drops to the next step down, and so on. V-sync prevents tearing, but 30 fps will at least feel choppy until the GPU can hold 60 again. (This depends on your monitor's refresh rate: v-sync at 60 Hz caps at 60 fps, at 120 Hz at 120 fps, and it also depends on certain settings and card features like PhysX.) Since the card doesn't need full power to hold 30 fps, usage reads low; only when it climbs back to the 60 Hz cap, or when v-sync is off, will it show its real maximum usage (and in most cases your first GPU will run a higher percentage than the second).

- If usage isn't near 100% with v-sync off, you're not putting enough stress on the GPU or not fully utilizing its features; adjusting your MSAA/AA settings usually does the trick.

- Adaptive v-sync avoids the hard drop: it keeps v-sync on at the refresh rate and turns it off whenever the frame rate falls below it, so instead of snapping from 60 to 30 you get 55 or so (with some tearing possible while below refresh). That's why it's called adaptive.

- If you still have problems and want to completely eliminate stuttering, use triple buffering. Note that the driver's triple-buffering option only applies to OpenGL games and benchmarks; for Direct3D (which most of today's games use) you can force it with something like RivaTuner's D3DOverrider.

If stuttering or tearing is still visible after all that, I can't help you: lower your settings or ask Google.
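The v-sync stepping described above can be sketched in a few lines. This is a toy model of classic double-buffered v-sync, not how any real driver is implemented: the presented rate snaps to the highest divisor of the refresh rate the GPU can still sustain.

```python
# Toy model: double-buffered v-sync quantizes frame rate to refresh/n.
# With no triple buffering, a frame that misses the refresh deadline
# waits for the next one, so 59 fps of raw throughput presents as 30.
def vsynced_fps(raw_fps, refresh=60):
    if raw_fps >= refresh:
        return refresh
    n = 2
    while refresh / n > raw_fps:  # find the first step the GPU can hold
        n += 1
    return refresh / n

print(vsynced_fps(59))  # 30.0: just missing 60 costs half the rate
print(vsynced_fps(25))  # 20.0: the next step below 30
```

This is also why GPU usage reads low with v-sync on: a card that could render 59 fps but is forced to present 30 spends almost half its time idle.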
Edited by d3subz - 6/8/13 at 6:59am
post #33 of 132
Do you get 95% GPU scaling in BF3? Also, the stutter is there.

There's a reason the 780 goes for $650 and two 660 Tis go for $400...
post #34 of 132
Quote:
Originally Posted by d3subz View Post

*snip*

Nice writeup, should help some people out
SKYnet (17 items)
CPU: Intel i7-4930K @ 4.6 GHz
Motherboard: ASUS Rampage IV Extreme
Graphics: EVGA GTX 970 x2
RAM: G.Skill Ripjaws X 4x4 GB
Storage: 1.5 TB WD Caviar Green SATA, Samsung 840 EVO 250 GB SSD
Optical drive: HP USB DVD
Cooling: Antec Kuhler 650, Thermalright True Spirit 140 Power
OS: Windows 10 Ultimate 64-bit
Monitor: Samsung 240 HD TOC
Keyboard: Razer Lycosa
PSU: PC Power and Cooling Silencer 910
Case: NZXT Switch 810 Matte Black
Mouse: It clicks..............
Mouse pad: IKEA $1 pad
post #35 of 132
Ty JTHMfreak. Sorry, admin/OP, I kinda got mad at single-GPU users who don't use SLI effectively and cursed them out a bit too much in my previous post. I'll behave.
post #36 of 132
Quote:
Originally Posted by d3subz View Post

Ty JTHMfreak. Sorry, admin/OP, I kinda got mad at single-GPU users who don't use SLI effectively and cursed them out a bit too much in my previous post. I'll behave.

Some of them need to be ripped a new one though; you get too many people who don't have everything set up right and then go and give SLI a bad name. It's not like I have anything to gain from people using SLI or not, I just can't stand ignorant people.
post #37 of 132
Jodiuh, I don't know what you mean by scaling, usage or SLI scaling, but I think you mean GPU usage. I just ran BF3 single player to see how poorly this game is optimized. I don't play it, but I've heard it's so poorly optimized for SLI that it gives SLI a bad reputation; I don't know about single GPUs, but I'd guess they're not far behind if the game wasn't optimized for them either. For comparison, I've run titles like Crysis 3, Metro 2033, Metro: Last Light, PlanetSide 2, Hawken, Tomb Raider, Dead Space 3, StarCraft 2, DmC: Devil May Cry, Skyrim, Batman: Arkham City, The Witcher 2, and so on, and they all ran smoothly with great frames and high or decent GPU usage, with the exception of the Metro: Last Light benchmark, which stuttered until I changed a couple of things in the 3D settings; after that it worked fine.

First off, understand that BF3 out of the box is not a benchmark (it could later be tuned into one). You will not see 100% usage all the time because the frame-rate volatility is huge: with v-sync off, 40 to 200 fps on maxed-out ultra. So running something like adaptive v-sync plus triple buffering, plus lowering settings, is highly recommended for smoother gameplay. It's also a D3D game, and I have not seen a D3D game hold 100% usage all the time with v-sync off. I've seen much more stable games than BF3; heck, Minesweeper is more stable. Based on what I saw when testing, SLI rendering problems can show up with v-sync off, or with edge smoothing (AA/SMAA). Honestly, OpenGL games and benchmarks give a much more accurate picture when stress-testing your card, and a game isn't designed to push your card to its max every second; some just aren't built for that. When it comes to GPU usage and behavior, something as minor as the driver version makes a huge difference (not only for SLI but for single GPUs as well; older drivers might even work better with some games), and so do locations, maps, how the game is programmed, how much voltage your card gets, and so on. Plus it's Kepler, which is not your average "set the max clock and run with it" card: the boost clocks are dynamic and the min/max boost varies. There are so many variables behind a GPU-usage percentage in a given game that chasing it is just silly.

Most people only started talking about GPU usage when SLI appeared, for the sake of performance scaling. Apparently everybody thinks their single GPU somehow works at 100% all the time, and that if you have two cards and they're not at 100% then something is wrong. That's just wrong in so many ways, because it's exactly the same with a single GPU; the only difference is how frames are rendered. It's also the wrong approach with new cards that boost dynamically when testing some D3D games and benches. For instance, I run the Metro: Last Light benchmark, which is Direct3D, and I sit around 65-85% GPU usage at max settings (if I lowered or tweaked things a bit I could hit 100%, but that wouldn't be max settings), while on Unigine Heaven, which is OpenGL, I'm within maybe 2-4% of 100%. And I know single-GPU users get similar results. Why? I honestly don't know; if someone can explain it, that would be great. I can only guess: it could be driver- or frequency-related (I've seen more volatility in D3D than in OpenGL), it could be how the game is programmed, a bottleneck on the card itself, memory or bus limits when texture sizes change with resolution, the CPU, or the motherboard's PCIe bus. I could speculate all day, but overall I think it's more a software-to-GPU thing than a hardware thing: how the software is written against the card's architecture and at what settings you run it. In a year your 660 Ti will get 20 more frames on an updated D3D benchmark and you'll wonder how that happened.

Now, a simple test of what a poorly optimized game looks like on 660 Ti SLI

(maxed out: ultra settings, 1080p)

TEST 1:

changes to global 3D settings:
v-sync: off (deliberately stressed the GPUs with quick gameplay and mouse movement)
SLI rendering: force alternate frame rendering 2

min-max fps while moving: 40-200
min-max usage of both cards while moving: 55-75% <-- obviously lower GPU usage than the MLL benchmark (crappy programming/rendering/drivers, or my two cards just aren't powerful enough at these settings; pick your fruit)

other observations: volatile low/high clocks
no stuttering
tearing obvious when fps drops to around 40

fps while standing still: 200
usage of both cards while standing still: 93-96%

TEST 2:

adaptive v-sync: on
SLI rendering: force alternate frame rendering 2

min-max fps while moving: 40-60 (capped by my 60 Hz monitor)
fps while standing still: 60

a bit smoother gameplay, but nothing significant

GPU usage while moving and standing: more or less 40-50% on both cards
low/high clock volatility is the same

IMPORTANT EDIT (fix for BF3's low GPU usage):

TEST 3:

Going from 4xMSAA to 2xMSAA drastically increased my performance: both GPUs sat at 95%+ load and I averaged 100-120 fps with spikes up to 150-180, with great gameplay and no obvious stuttering or tearing, and that's without adaptive v-sync or anything else added. If you still have multiplayer issues, read my first post and play around with your global 3D settings.
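A rough back-of-the-envelope for why dropping from 4xMSAA to 2xMSAA helps so much on a 2 GB card: MSAA stores several color and depth samples per pixel, so render-target size (and the bandwidth to fill and resolve it) scales roughly with the sample count. This sketch ignores driver-side compression, HDR formats, and the extra G-buffer targets a deferred engine like BF3's uses, so treat it as an estimate only:

```python
# Rough MSAA render-target size: N samples of color + depth per pixel.
def msaa_buffer_mb(width, height, samples, bytes_color=4, bytes_depth=4):
    per_pixel = samples * (bytes_color + bytes_depth)
    return width * height * per_pixel / (1024 * 1024)

print(msaa_buffer_mb(1920, 1080, 4))  # ~63.3 MB
print(msaa_buffer_mb(1920, 1080, 2))  # ~31.6 MB
```

Halving the sample count halves both the buffer and the resolve bandwidth, which is one reason heavy MSAA gets called out so often on the 660 Ti's narrower 192-bit memory bus.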

Solution: if I wanted to play BF3 as smoothly as possible (glad I don't), the most logical thing would be lowering my settings until I get a constant 60 fps on every map in every situation (this is based on one map, though; another might already give 60 minimum). I could also disable SLI and see what happens.

Example of poor optimization in Crysis 3: the first map gives really bad frames while the others are just great, and that's regardless of single or multi-GPU.

Conclusion: this is just funny. 40 fps minimum? I get 60 in Crysis 3 maxed out. So you see some games just aren't well optimized for single GPUs, and especially not for multi-GPU, and the problems differ, with different bottlenecks in software or hardware, though most likely it's the software you're trying to run. People will say go single-GPU, and I'd agree to a degree, but from my personal experience both Radeon and NVIDIA are getting better with drivers, and it's usually the game designers and engine developers who aren't maxing things out. While most bugs hit single and multi-GPU alike, I admit there are cases like BF3 that are just bad in SLI. Other than that I haven't hit another 2012/2013 game that's bad in SLI (I play Unreal Tournament (1999), which works great at 14 years old; there's some talk about Borderlands 2, but the problems are mostly isolated, plus those two games suck anyway). And if you can live without 4xMSAA, you can play BF3 just fine.
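For numbers like the min/max figures in the tests above, a frame-time log beats eyeballing an overlay. A minimal sketch, assuming you already have the per-frame times in milliseconds parsed into a list (e.g. exported from FRAPS); the function and field names here are illustrative, not any tool's actual output:

```python
# Summarize per-frame times (ms) into the stats that describe smoothness:
# average and worst fps, plus the near-worst-case frame time
# (spikes in the 99th percentile are the stutter you actually feel).
def frametime_stats(times_ms):
    fps = [1000.0 / t for t in times_ms]
    ordered = sorted(times_ms)
    p99 = ordered[int(0.99 * (len(ordered) - 1))]
    return {
        "avg_fps": sum(fps) / len(fps),
        "min_fps": min(fps),
        "p99_frametime_ms": p99,
    }

print(frametime_stats([10.0, 20.0, 10.0, 20.0]))
```

Two runs with the same average fps can feel completely different if one has a fat 99th-percentile frame time, which is why "it looks smooth to me" and a log so often disagree.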
Edited by d3subz - 6/8/13 at 7:10am
post #38 of 132
Quote:
Originally Posted by JTHMfreak View Post

Some of them need to be ripped a new one though; you get too many people who don't have everything set up right and then go and give SLI a bad name. It's not like I have anything to gain from people using SLI or not, I just can't stand ignorant people.

Indeed. I'm writing novels here so people understand these are isolated incidents and a poor excuse not to go SLI. But if I had SLI problems in my favorite game, I'd be pissed as well. Glad I'm not.
post #39 of 132
Quote:
Originally Posted by d3subz View Post

Other than that I haven't hit another 2012/2013 game that's bad in SLI (I play Unreal Tournament (1999), which works great at 14 years old; there's some talk about Borderlands 2, but the problems are mostly isolated, plus those two games suck anyway).

Love me some Unreal Tournament; 2004 was my favorite, I think, though UT3 didn't really grab me. But yes, like you I have no issues with a much older game when it comes to SLI. The biggest problem with BL2 is the completely overdone, over-the-top PhysX that brings GPUs to their knees. To properly run SLI in BL2 you need to dedicate a card to PhysX, or else you get 20-ish fps, which is horribly lame. When I had my 480s I had to use one for PhysX, otherwise the game was unplayable with PhysX on. Heck, even a single 670 FTW had issues with that game on high PhysX.

I haven't played BF3 since my SLI 480s, but at the time I had no issues with the game; I more or less just used it as a comparison tool. I didn't play for more than a few hours with my 480s before getting bored, so I mostly just found a spot on the map that put my GPU usage at 99% and let it sit there for a minute or two to check for a CPU bottleneck.
post #40 of 132
Quote:
Originally Posted by JTHMfreak View Post

Some of them need to be ripped a new one though; you get too many people who don't have everything set up right and then go and give SLI a bad name. It's not like I have anything to gain from people using SLI or not, I just can't stand ignorant people.

Me neither.

It sounds like a lot of you who say SLI is perfect are judging these games with your eyes rather than with monitoring software running alongside. It really boils down to what is and isn't acceptable on an individual basis when you're in a game. Plenty of people are perfectly fine getting 30 fps in a fast-paced multiplayer first-person shooter, and that's great for them: whether their GPUs are running at 99% or 10%, if they're getting 30 fps they're happy.

There are others who adjust their settings so the minimum frame rate never drops below 60, and some, like myself on a 144 Hz monitor, who want a 100 fps minimum. If my goal is a 100 fps minimum, my graphical detail is set to low across BF3, and in game my CPU usage is 70%, GPU usage is vacillating between 40 and 70%, and frames are dipping into the 50s and 60s, then yeah, I'm going to blame poor SLI scaling. And before you say I haven't tinkered with all the different v-sync types, or raised my detail settings to put more load on the GPUs, or tried any other BF3 tweaks: you're mistaken. I quit enjoying BF3 multiplayer in order to figure this problem out, and all my research pointed to poor scaling. (I even reformatted my system in an effort to cross every possibility off my list!)
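A 100 fps minimum on a 144 Hz panel is really a 10 ms per-frame budget, and a frame-time log makes it easy to check how often a rig misses it. A small sketch, again assuming the frame times are already parsed into a list of milliseconds (nothing here is any monitoring tool's real API):

```python
# What fraction of frames blow the budget for a target minimum fps?
# A 100 fps target means every frame must finish within 10 ms.
def frames_over_budget(times_ms, target_fps):
    budget_ms = 1000.0 / target_fps
    misses = sum(1 for t in times_ms if t > budget_ms)
    return misses / len(times_ms)

print(frames_over_budget([8.0, 9.0, 12.0, 11.0], 100))  # 0.5: half the frames miss
```

A number like this settles the "is it actually smooth" argument in a way an on-screen fps counter, which averages over the spikes, never will.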

Here are a few recent threads of people discussing low GPU usage in BF3 with SLI on this forum alone:

http://www.overclock.net/t/1377728/low-gpu-usage-in-bf3-multiplayer/0_100

http://www.overclock.net/t/1346071/40-gpu-usage-on-my-gtx-680-sli/0_100

http://www.overclock.net/t/1326623/gtx-680-sli-70-usage/0_100

http://www.overclock.net/t/1354405/sli-670-ftws-stutter-and-low-gpu-usage/0_100

http://www.overclock.net/t/1356898/battlefield-3-sli-gpu-usage-issue/0_100

If turning on adaptive v-sync, or making sure you properly removed your old video card drivers and are using the latest (which is the default answer you get), did in fact fix the problem, this debate wouldn't still be going.

Some games work great with SLI, others don't. I think everyone can agree to that.
Edited by tlbig10 - 6/7/13 at 2:33pm