Originally Posted by jibesh
Originally Posted by tycoonbob
Oh I'm sure, lol. I'm thinking more like a ~3 ft cable, which still runs about $60. But the HBAs can also be found for ~$50, so it's really not a bad deal if you need the speed between two boxes. Throw in an IB switch and then you should start considering 10Gbit ethernet instead.
Can InfiniBand be deployed without a switch? Or can HBAs be directly connected?
Yes. You can even use dual-port cards: have a fileserver with a dual-port card in it, connected to single-port cards in your workstation and your backup or VM server. Simple layout and cheap to implement.
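One wrinkle with a switchless setup: InfiniBand needs a subnet manager, which normally runs on the switch, so with back-to-back cards one host has to run it in software (opensm). A rough sketch of bringing up IPoIB on such a link, assuming Linux, opensm and infiniband-diags installed, and that the interface appears as ib0 (device names and addresses here are just examples):

```shell
# On ONE host only: start the software subnet manager
# (with no switch on the link, nothing else will bring the fabric up)
sudo opensm -B          # -B runs opensm in the background as a daemon

# On both hosts: check the port has come up (State should read Active)
ibstat

# Load the IP-over-InfiniBand module and address each end of the link
sudo modprobe ib_ipoib
sudo ip addr add 10.0.0.1/24 dev ib0   # use 10.0.0.2/24 on the other host
sudo ip link set ib0 up

# Quick sanity check from the 10.0.0.1 side
ping -c 3 10.0.0.2
```

Bear in mind IPoIB gives you convenience, not full InfiniBand performance; native RDMA transfers will be faster than anything going through the IP stack.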
Whether it's actually worth it for home use is another matter - 99% of the time you won't notice any difference unless you regularly do large sustained transfers and have the hardware at each end to support it all (no point being able to throw things across the network at 500MB/s if your storage can't keep up!). And if you do require large transfers then you might be better placed reconfiguring how your storage is set up in the first place, so you're keeping things where they're used.
InfiniBand has its place, but really that place is connecting SANs, where the low latency really shines (10G Ethernet is poor in comparison for latency). It's by no means trivial to set up and can cause annoying headaches where things stop working for no apparent reason (cheap cables are often the culprit). You also need a spare PCIe x8 slot for most cards, which can be hard to accommodate in all your systems alongside your graphics and RAID cards. That's why I've never bothered...