Overclock.net
Status: Not open for further replies.
1 - 20 of 73 Posts

Registered · 2,044 Posts · Discussion Starter · #1
I might eventually change mobos for an SLI-capable board. Do ALL new games support SLI?
 

Registered · 4,199 Posts
Quote:
Originally Posted by venomblade;13031073
I might eventually change mobos for an SLI-capable board. Do ALL new games support SLI?

Sort of, though not natively in every case. I've been curious about this myself, and it seems that if a game doesn't support it, you can make a game profile that will sort of "force" it to work.

But I don't know for sure. Even if you can get it to function, there could be bugs. I'd love to know myself; I'm gearing up for some awesome games and would like to use SLI, but I want to keep playing my older games that might not "support" it.

Sent From My μATX That Could
 

Premium Member · 11,024 Posts
Quote:
Originally Posted by venomblade
I might eventually change mobos for an SLI-capable board. Do ALL new games support SLI?

Not all, but most games do. The main exceptions are games ported to PC (though even some ports do support it) and indie games, most of which don't support SLI.
 

Registered · 1,853 Posts
SLI is great. During the summer months when it's really hot (I don't have AC) I can take one of my cards out, which greatly reduces the heat from my PSU and the extra video card, and since I mostly play CSS anyway I see no difference in FPS. Even COD games support SLI. Getting it set up with individual games can sometimes be a hassle, but it's worth it.
 

Registered · 2,044 Posts · Discussion Starter · #6
Quote:
Originally Posted by Booty Warrior;13031161
All of the blockbuster titles support multi-gpu setups (if not out of the gates, then usually shortly after launch).

Slightly off topic, but I'd always wanted to ask you: what are your clocks? I want to see if your OC'ing ability is better than mine, since we both have Palit cards.
 

Fortnite Fanatic · 3,103 Posts
Quote:
Originally Posted by venomblade
Slightly off topic, but I'd always wanted to ask you: what are your clocks? I want to see if your OC'ing ability is better than mine, since we both have Palit cards.

Get a room.

 

Registered · 2,869 Posts
Quote:
Originally Posted by kyle7412
SLI is great. During the summer months when it's really hot (I don't have AC) I can take one of my cards out, which greatly reduces the heat from my PSU and the extra video card, and since I mostly play CSS anyway I see no difference in FPS. Even COD games support SLI. Getting it set up with individual games can sometimes be a hassle, but it's worth it.

How exactly is this a good thing? Lol
 

Registered · 1,853 Posts
Quote:
Originally Posted by Stefy
How exactly is this a good thing? Lol


Well, performance-wise it does me no good, but my GTX 460 runs very cool (even two of them run cooler than, e.g., a GTX 580, and perform better...).

But a single GTX 460 runs all the games out there at pretty high settings with good frames. And one by itself for me idles at 20-25°C and loads at 55-60°C.
 

Registered · 2,869 Posts
Quote:
Originally Posted by kyle7412
Well, performance-wise it does me no good, but my GTX 460 runs very cool (even two of them run cooler than, e.g., a GTX 580, and perform better...).

But a single GTX 460 runs all the games out there at pretty high settings with good frames. And one by itself for me idles at 20-25°C and loads at 55-60°C.

Hehe yeah, I get what you're saying.

I had to sig it, because taken out of context it's a nice little sentence.

It's a no-brainer going for a non-SLI/CF board nowadays, imo.
 

Premium Member · 11,112 Posts
Honestly, there are very few reasons to use a CrossFire or SLI configuration. Most people are running 1920x1080 or lower resolutions, and any high-end card can run any game at those resolutions.

I have a 6950 flashed to a 6970 and game at 2560x1600. I play a smattering of new games and a mix of old games. I never have any problems with framerates, and I typically max all in-game settings and add 2x or 4xAA, as at 2560x1600 you don't need much AA.

Anyway, if I'm not having problems running a single card at 2560x1600, then it's especially crazy that I see people with 1920x1080 panels running tri-SLI GTX 470s or something like that. People will sometimes say, "But Crysis," but that's the exception, and even then at 1920x1200 I could run it almost maxed with a 4890.

So ask yourself this: "Self, what resolution will I be gaming at?"

If the answer is 1920x1080 or smaller, then don't bother. Sometimes SLI with two mid-range cards can represent better value; if that's your reason, then sure, go for it. Otherwise, any single high-end card from whatever the current generation is at the time of purchase should be more than adequate.
 

Registered · 1,853 Posts
Quote:
Originally Posted by Stefy
Hehe yeah, I get what you're saying.

I had to sig it, because taken out of context it's a nice little sentence.

It's a no-brainer going for a non-SLI/CF board nowadays, imo.

Haha, I feel honored to be in your sig.
 

Premium Member · 14,323 Posts
If I've said it once, I've said it 1000 times: there's no such thing as a "game supporting SLI." Youra6, I expected you of all people in this thread to know this.

SLI support comes from the driver, not from the game. 99% of games have absolutely zero awareness of the whole concept of multi-GPU.

If the driver you're running has an SLI profile for the particular game, then SLI is "supported" for that game.

There are very few games from the last few years that nV hasn't made an SLI profile for (and many of the older, more popular ones have profiles too).

However, sometimes new games (especially demos released before the game itself) come out and it takes a little while for nV to get around to making a profile available (either through a driver revision or an "SLI enhancement" patch), which can kinda suck in the meanwhile. It took them over a month to make an SLI profile for 3DMark11 after it came out.

And occasionally they never make a profile for some really obscure stuff. However, a lot of the time you can make SLI work in these cases just by renaming the .exe to something that does have SLI support; it's best to rename it to something using the same engine, if possible.
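The renaming trick above can be scripted. This is only a hedged sketch: both filenames are hypothetical placeholders (no real game executables are implied), and it keeps a backup of the original since the driver keys its profile off the executable name and a mismatched engine can still misbehave.

```python
import os
import shutil

def apply_profile_by_rename(game_exe, profiled_name):
    """Rename a game executable so the driver matches it against an
    existing SLI profile (ideally one for a game on the same engine).

    Both arguments are hypothetical placeholders; a backup copy of
    the original executable is kept alongside it.
    """
    backup = game_exe + ".bak"
    shutil.copy2(game_exe, backup)                  # keep the original safe
    target = os.path.join(os.path.dirname(game_exe), profiled_name)
    os.replace(game_exe, target)                    # the actual rename
    return target
```

Restoring is just the rename in reverse (or copying the `.bak` file back), which is why keeping the backup is worth the disk space.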
 

Banned · 145 Posts
Unlike most of the people in this thread, I am an SLI user. However, I do NOT like it and would suggest others stay away from it.

Not only do I have tons of issues with games that do or don't support SLI, but then you have games that don't use SLI very efficiently. Some games barely get one card off idle while the other runs very warm.

You can force SLI in games that don't support it by changing the rendering mode, but then you get lower-than-usual performance or BSODs.

I forced alternate frame rendering in some games and got BELOW the average FPS of one card... just because it's using both doesn't mean it's doing it right.

Now I'm faced with problems of micro stuttering. Whenever I have two cards, my games seem to stutter: high/low FPS, new or old games, warm or cold temps, switching the cards, days of testing and tweaking... for SLI? No thanks.

Buy a better card, use the old one for PhysX, and stop wasting power, time, and energy on a system that has never worked right.

Google micro stutter. And good luck; it will happen to you eventually.
 

RGB Numba Wan! · 23,424 Posts
Games: very few, unless you are running really high resolutions and/or multiple monitors.

Benchmarks: yes... I am a benchmark whore. :P

Btw, SLI has been a joy: super easy to configure and totally geeky to play around with.
 

Premium Member · 14,323 Posts
Quote:
Originally Posted by FnkDctr;13036237
Unlike most of the people in this thread, I am an SLI user. However, I do NOT like it and would suggest others stay away from it.

Not only do I have tons of issues with games that do or don't support SLI, but then you have games that don't use SLI very efficiently. Some games barely get one card off idle while the other runs very warm.

You can force SLI in games that don't support it by changing the rendering mode, but then you get lower-than-usual performance or BSODs.

I forced alternate frame rendering in some games and got BELOW the average FPS of one card... just because it's using both doesn't mean it's doing it right.

Now I'm faced with problems of micro stuttering. Whenever I have two cards, my games seem to stutter: high/low FPS, new or old games, warm or cold temps, switching the cards, days of testing and tweaking... for SLI? No thanks.

Buy a better card, use the old one for PhysX, and stop wasting power, time, and energy on a system that has never worked right.

Google micro stutter. And good luck; it will happen to you eventually.
Not that you asked, but:
1) If one card doesn't come off idle and the other is running much warmer, then SLI ISN'T working.
2) You cannot force SLI to work by "using the rendering." That may be where your confusion is coming from: you think you're forcing SLI on, but you're not. There HAS to be an SLI profile for the game in order for SLI to work, and there's nothing you can change in the settings of NVCP to "force" SLI.
3) SLI already operates using alternate frame rendering by default.

What games are you talking about where SLI doesn't work? I've been running SLI for years now, and there have only been a handful of games (and I play A LOT of games) that didn't have SLI when I bought them. Those that didn't (Prototype and FONV are the only ones I recall, and with Prototype it was only the Steam version that had the no-SLI issue) got SLI support shortly thereafter. So either you're playing a lot of obscure/indie games, or you're not being patient after the more popular games come out.

Microstutter is a real phenomenon; however, it only happens in some games, and it's only perceptible at low FPS. So while in those cases a single-card setup will appear smoother at 30 FPS vs. multi-GPU, once you get up into the ranges one normally wants to play at (i.e. 60 FPS) its effect becomes imperceptible.

Also, the operation of SLI (including reduction of microstutter) has gotten a lot better over the last couple of generations of cards.
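For anyone unfamiliar with the term in point 3: alternate frame rendering (AFR) just means whole frames alternate between the GPUs. A toy sketch of the idea (conceptual only, not driver code):

```python
def afr_schedule(num_frames, num_gpus=2):
    """Toy model of alternate frame rendering (AFR): frame N is
    rendered by GPU N modulo the number of GPUs, so with two cards
    even-numbered frames go to GPU 0 and odd-numbered frames to GPU 1."""
    return [frame % num_gpus for frame in range(num_frames)]
```

So `afr_schedule(6)` yields `[0, 1, 0, 1, 0, 1]`: both cards render full frames in turn, which is also why both should warm up under load when SLI is actually working.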
 

Banned · 145 Posts
Quote:
Originally Posted by brettjv;13038786
Not that you asked, but:
1) If one card doesn't come off idle and the other is running much warmer, then SLI ISN'T working.
2) You cannot force SLI to work by "using the rendering." That may be where your confusion is coming from: you think you're forcing SLI on, but you're not. There HAS to be an SLI profile for the game in order for SLI to work, and there's nothing you can change in the settings of NVCP to "force" SLI.
3) SLI already operates using alternate frame rendering by default.

What games are you talking about where SLI doesn't work? I've been running SLI for years now, and there have only been a handful of games (and I play A LOT of games) that didn't have SLI when I bought them. Those that didn't (Prototype and FONV are the only ones I recall, and with Prototype it was only the Steam version that had the no-SLI issue) got SLI support shortly thereafter. So either you're playing a lot of obscure/indie games, or you're not being patient after the more popular games come out.

Microstutter is a real phenomenon; however, it only happens in some games, and it's only perceptible at low FPS. So while in those cases a single-card setup will appear smoother at 30 FPS vs. multi-GPU, once you get up into the ranges one normally wants to play at (i.e. 60 FPS) its effect becomes imperceptible.

Also, the operation of SLI (including reduction of microstutter) has gotten a lot better over the last couple of generations of cards.
1000 reps and you don't know what you're talking about. This will be fun.

1. Wrong. Go play Sins of a Solar Empire in SLI and tell me both cards are used 100%. SLI does NOT mean that BOTH cards are used for the SAME PURPOSES. The second card only processes minute textures and detail. Go read more if you doubt me.

2. Wrong. Any rendering that is not native or designed into the software is forced. I guess this is new to you? Hold on, this is my favorite line of your entire fail post:

"There HAS to be an SLi profile for the game in order for SLi to work, and there's nothing you can change in the settings of NVCP to 'force' SLi."

LMAO. When I find the screenshots of me forcing SLI in Aion the DAY IT CAME OUT, I will post them and laugh in your face. Again: change the render mode to try different types and you will see SLI temps increase on both cards and FPS nearly double. I don't need to prove my point; go read more.

3 is the only thing you got right. You are a genius! Who'd have thought two cards would alternate rendering! Give him a rep!

I play Crysis 1 and 2, all the Assassin's Creed games, Resident Evil 5, AA3, TF2, CoD MW2, Dragon Age 1 and 2, Guild Wars, Mafia 1 and 2, NFS ProStreet, Test Drive Unlimited, Empire: Total War, and the list of major blockbusters goes on.

Just because a game "supports" SLI doesn't mean both cards are being used fully; watch your temps. Batman never ran SLI even for me. Sins of a Solar Empire never will.

And as for micro stuttering? It happens at all FPS; it has nothing to do with how many FPS you're getting. It's the timing gap in between the frames rendered.

I was just getting 400+ FPS in Guild Wars and still getting micro stutter. Is that above 30?

And it doesn't happen in just some games sometimes; when it happens, it happens in almost all games.

Now I have to go make an NVIDIA profile for every game because I "cheaped out" and went SLI? No thanks. Unless I'm building a $5,000 serious rig, there's no reason to ever dump money and time into SLI again.

You are obviously new to SLI. Go back and read my post, then feel free to prove anything different. Thanks.
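The "timing gap in between the frames" point is the crux of the microstutter debate, and it can be made concrete: two runs can have the same average FPS while one paces its frames far less evenly. A sketch with invented frame times (the numbers are purely illustrative):

```python
def frame_time_stats(frame_times_ms):
    """Average frame time plus a crude pacing metric: the mean
    absolute deviation of the frame times. A high deviation at the
    same average frame time is what microstutter feels like."""
    avg = sum(frame_times_ms) / len(frame_times_ms)
    jitter = sum(abs(t - avg) for t in frame_times_ms) / len(frame_times_ms)
    return avg, jitter

# Invented numbers: both runs average ~16.7 ms per frame (~60 fps),
# but the second alternates short and long gaps between frames.
smooth  = [16.7] * 6
stutter = [8.0, 25.4] * 3
```

Running `frame_time_stats` on both lists gives the same average frame time, but zero jitter for the first and roughly 8.7 ms of jitter for the second; a frame counter reports identical FPS for both, which is why an FPS number alone can't settle the argument either way.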
 