
SLI using one x16 and an x8 expansion slot - Page 2

post #11 of 18
Quote:
Originally Posted by JHex2 View Post
Do these rules apply to dual-GPU cards too? Or does a card like the GTX 295 need x16/x16?
Should still apply.
post #12 of 18
Thread Starter 
Quote:
Originally Posted by windfire View Post
^This.

To get closer to your spec, below is a graph for GTX 480 SLI at 1920x1200. The performance loss is 2%.

Since GTX 480 SLI is more powerful than your GTX 460 SLI, and 1920x1200 is more demanding than your 1080p, the expected performance loss should be 2% at most.


As for your problem of the 2nd card not being recognized in the bottom PCIe x8 slot, I suggest you try a single card installed in that slot and see if everything is OK. I assume you have done the obvious, like cleaning any dust out of the slot, etc. And I suppose you are using the same bridge (long enough?) that has been working OK.

If not, I suggest flashing your motherboard to the latest BIOS (at least to FB, which "enhances PCIe x16/x8 compatibility") if you are still using the FA BIOS.
Link: http://www.gigabyte.com/products/pro...?pid=3449#bios

If even this does not work, then that slot is likely faulty. RMA it.
Yeah, the case is as clean as new, as well as the components. I will try the one card first, and yes, I am using the same bridge as I normally do; it does reach.
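For reference, a quick sketch like the one below (assuming the NVIDIA driver's nvidia-smi tool is installed and on the PATH, which may not be true on every install) would at least confirm whether the driver enumerates both cards before SLI even comes into the picture:

```python
import subprocess

# List the GPUs the NVIDIA driver can see; "nvidia-smi -L" prints one line per GPU.
# If only one card shows up here, the problem is detection/seating, not SLI settings.
try:
    output = subprocess.check_output(["nvidia-smi", "-L"], text=True)
except (OSError, subprocess.CalledProcessError) as exc:
    print(f"Could not query the driver: {exc}")
else:
    gpus = [line for line in output.splitlines() if line.strip()]
    print(f"Driver reports {len(gpus)} GPU(s):")
    for line in gpus:
        print("  " + line)
```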
post #13 of 18
Thread Starter 
Okay, well, here are the results. I tested the card in the x8 slot and it worked. Then I tried SLI again with x16/x8, and again it did not recognize the second card. So I referred back to the manual, and it said that to use SLI I need to use the x16/x16 slots on the motherboard; then again, they could have said that just to make sure people get maximum performance.

So any ideas?
post #14 of 18
If you get it to work, you won't notice a friggin' difference.

I just read a test, on HardOCP I think, where the lower-speed slot actually beat the two x16 slots sometimes. As in, six of one, half a dozen of the other.

I did a test with my 5770s; I did it with the x4 slot.

http://www.overclock.net/ati/773426-...ml#post9957829

So even IF it's a 1-2% loss, as others say, it won't make a lick of difference to your eyes! GO FOR IT!
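As a rough sketch of what that kind of loss actually means in frame rate (the 60 fps baseline here is purely illustrative, not a measured number):

```python
# Back-of-the-envelope: what a ~2% x16/x8 scaling loss looks like in frame rate.
# The 60 fps baseline is assumed for the sake of the arithmetic only.
baseline_fps = 60.0
loss = 0.02  # ~2% loss reported for GTX 480 SLI at 1920x1200

fps_x8 = baseline_fps * (1.0 - loss)
print(f"x16/x16: {baseline_fps:.1f} fps")
print(f"x16/x8 : {fps_x8:.1f} fps (a {baseline_fps - fps_x8:.1f} fps difference)")
```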
    
CPU: AMD X3 720 BE | Motherboard: BIOSTAR TA890FXE | Graphics: 2 x Sapphire 5770 Vapor-X | RAM: 4GB G.Skill 1600 9-9-9-24
Hard Drive: 2 x WD Black 640 SATA 6Gbps | OS: Windows 7 | Power: Seasonic 750 Modular | Case: Antec P183
Mouse: Cooler Master Sentinel
post #15 of 18
Thread Starter 
Quote:
Originally Posted by jonnyrockets View Post
If you get it to work, you won't notice a friggin' difference.

I just read a test, on HardOCP I think, where the lower-speed slot actually beat the two x16 slots sometimes. As in, six of one, half a dozen of the other.

I did a test with my 5770s; I did it with the x4 slot.

http://www.overclock.net/ati/773426-...ml#post9957829

So even IF it's a 1-2% loss, as others say, it won't make a lick of difference to your eyes! GO FOR IT!
Yeah, I wish I could get it to work. I really don't know what's wrong.
post #16 of 18
I have the same mobo as you and have my x16 and x8 slots populated. It runs fine in SLI. The reason I'm not using x16/x16 is that my water-cooling connector is too long, so I needed to move my second card to the x8 slot. I'm not sure why your mobo isn't recognizing your second card.
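If you want to double-check what link width each card is actually negotiating, something like this rough sketch should print it (assuming a driver new enough to support nvidia-smi's --query-gpu fields):

```python
import subprocess

# Query the negotiated PCIe link width for each GPU the driver sees.
# Older drivers may only expose this via plain "nvidia-smi -q" output.
QUERY = "name,pcie.link.width.current,pcie.link.width.max"

try:
    out = subprocess.check_output(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        text=True,
    )
except (OSError, subprocess.CalledProcessError) as exc:
    print(f"Could not query the driver: {exc}")
else:
    for line in out.strip().splitlines():
        name, cur, mx = (field.strip() for field in line.split(","))
        print(f"{name}: running at x{cur} (slot supports up to x{mx})")
```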
Not Bad (15 items)
CPU: i7 930 @ 4.0 GHz, 1.32V (XSPC water block) | Motherboard: Gigabyte X58-UD3R Rev 2.0 | Graphics: Gigabyte GTX 470 (Koolance water block) + Asus GTX 470 (Koolance water block)
RAM: 16GB Mushkin 1600 | Hard Drive: Kingston 128GB SSD / WD 1TB SATA III | Optical Drive: Asus 24X | OS: Windows 7 Premium 64-bit
Monitor: 23" ASUS 120Hz (NVIDIA 3D); 26" Samsung SyncMaster | Keyboard: Razer X7 | Power: 900W 80 Plus Silver Certified | Case: AZZA Full Tower
Mouse: Razer Death Adder | Audio: Logitech 2.1 setup
post #17 of 18
Thread Starter 
Quote:
Originally Posted by Zmanster View Post
I have the same mobo as you and have my x16 and x8 slots populated. It runs fine in SLI. The reason I'm not using x16/x16 is that my water-cooling connector is too long, so I needed to move my second card to the x8 slot. I'm not sure why your mobo isn't recognizing your second card.
It is strange; I have tried twice and still got the same result. And I'm pretty sure there couldn't be anything in the BIOS that could affect this either; it's a simple case of plugging in and connecting up. The x8 slot that you are using, is it the second one at the bottom of the board?
post #18 of 18
Thread Starter 
Alright, so I have found the problem: the clip on the x8 expansion slot is not making the clicking noise when I push the card in. Part of the card is in and the other part is loose. Is there any fix for this, or am I going to have to RMA it?