After posting my wall-mounted computer build, which uses PCIe extenders, I got many questions about whether the extenders impact performance. So for your viewing pleasure, I have tested multiple extender configurations using my spare system.
The lovely people at 3M were very generous and sent me a sample of their very expensive PCIe extender assemblies! (The same ones I'm using on the wall-mount rig.)
Let's meet the contenders!

[PCIe Extenders]

[Test System]
- Processor: Intel Core 2 Quad Q6600 (Stock @ 2.4GHz)
- GPU: AMD Sapphire HD5870 1GB (Stock @ 850/1250 MHz)
- Memory: 2x4GB Mushkin Redlines DDR2-800
- Motherboard: Asus Commando
- Cooling: Freezer 13 Pro
- HDD: WD 640GB
- Power Supply: Corsair TX850M
- Monitor: LG Flatron W2442PA 1080p
- OS: Windows 7 Ultimate x64
[Benchmarks]
- 3DMark - Cloud Gate
- 3DMark - Fire Strike
- Unigine - Heaven
- Unigine - Tropics
Final score will be the average of 3 benchmark runs.

[Test Setup]
- In Slot
- 3M Extender
- 300mm + 200mm Extender
- 300mm Extender
- 200mm Extender

[Result]
*Cloud Gate score average reduced to 1/3 to better match the bar heights of the other test columns
The only hiccup during testing came during the 300mm + 200mm extender test: the system BSODed on the 2nd Heaven benchmark run (BCCode: 124).

[Conclusion]
Based on these benchmark results, I conclude that there is no measurable performance impact when using PCI Express extenders compared to the in-slot configuration. There is also no measurable performance difference between the different types and lengths of extenders tested.
The shielded and unshielded cables performed essentially identically in this setup. However, one BSOD incident was observed while testing with the unshielded cable.

[Remarks]
The unshielded extenders worked quite well in these tests. However, they can cause considerable trouble if they are near EMI sources or if multiple riser cables are stacked on top of one another.
I put the question of why regular cables didn't work for me in a dual-GPU configuration to 3M, and here is their response:
There are a number of contributors to problems. The first would be that with the cables that you are using, they are not impedance matched, and may not be low loss dielectric. Thus, attenuation, cross talk and reflections will close the eyes, due to loss and noise. Reflections are the result of impedance discontinuities. Our assemblies are tuned to 85 ohms, cable and PCB. PCIe Gen3 is 85 ohms. 85 and 100 ohms work interchangeably, but if mixed, you lose some signal to reflection and shorten the maximum possible length. If your performance is on the edge, that's not a good thing. Our cable is tightly coupled and uses silver plated conductors. Lower loss for conductivity, and for radiative losses coupling to ground. And we use a low loss dielectric. Impedance matching is one thing, but dielectrics absorb some of the energy too. The only thing separating these assemblies from what is used in supercomputers is the connector, and we have no control over that. The connectors are geared at being cheap and good enough, by Intel and the industry. By and large, they are.
As for crosstalk, our cable exhibits less than -40 dB of crosstalk. Crosstalk can occur at the termination, since the shielding has been removed. We design the PCB termination and stackup to minimize the effects. Again, the other cable is not shielded, so there could be considerable crosstalk along its length. Between attenuation, crosstalk, and reflections there may be no eye at the far end. So for dual GPU, I think it is simply a cable performance issue. The other cable assembly was probably designed with PCIe Gen 1 in mind. I have seen some that advertise Gen 2, but the longest that I recall there was around 6 inches. And the prices were considerably higher than Gen 1.
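To put rough numbers on the impedance-mismatch and crosstalk figures from the reply above, here is a minimal sketch using the standard transmission-line reflection formula. The function names and printout are my own illustration; only the 85/100-ohm impedances and the -40 dB crosstalk figure come from 3M's response.

```python
# Rough illustration (my own sketch, not 3M's math) of two figures quoted
# above: the reflection when an 85-ohm cable drives a 100-ohm link, and
# what -40 dB of crosstalk means as a voltage ratio.
import math

def reflection_coefficient(z_from: float, z_to: float) -> float:
    """Fraction of the incident voltage wave reflected at an impedance step."""
    return (z_to - z_from) / (z_to + z_from)

def return_loss_db(gamma: float) -> float:
    """Return loss in dB; larger means less energy is reflected."""
    return -20 * math.log10(abs(gamma))

gamma = reflection_coefficient(85, 100)          # 85-ohm cable into 100-ohm PCB
print(f"reflected fraction: {gamma:.3f}")        # ~0.081, i.e. ~8% of the wave
print(f"return loss: {return_loss_db(gamma):.1f} dB")  # ~21.8 dB

crosstalk = 10 ** (-40 / 20)                     # -40 dB as a voltage ratio
print(f"-40 dB crosstalk = {crosstalk:.2%} of the aggressor signal")  # 1.00%
```

So mixing 85- and 100-ohm segments bounces roughly 8% of the signal at each transition, which is exactly the "lose some signal to reflection" effect described, and -40 dB crosstalk leaves only 1% of a neighboring lane's voltage leaking in.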
If you'd like to learn more about why extenders don't affect performance, there is a good discussion thread on Reddit regarding this issue.