
The fastest way to connect 2 PC's...

post #1 of 7
Thread Starter 
Question guys,

I have a pretty serious 9-PC render farm and 1 server PC with a nice 6-disk RAID 0 array to deliver the goods in time.

All these computers are connected through high-quality (very expensive) Cat6 cables to a 16-port network switch with a 32 GB/s backplane.

My problem is that my PCs need a lot of data for their scenes; sometimes each node needs a 20 GB dataset before it can render, and that dataset has to be re-sent whenever I make a single change. 20 GB times 9 PCs can become quite taxing for the network.

My server is holding up easily and can output a full 100 megabytes per second up and down simultaneously. The RAID array can handle 700 MB/s, by the way. However, I still find its network speed isn't enough.
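
For a rough sense of what that load means, here is a quick back-of-the-envelope sketch (plain Python; the figures are simply the ones quoted above):
Code:
# Back-of-the-envelope numbers, taken from the figures above:
# ~100 MB/s usable on the single gigabit uplink, a 700 MB/s RAID 0 array,
# and a 20 GB dataset that has to reach each of the 9 render nodes.
DATASET_MB = 20 * 1000   # 20 GB per node
NODES = 9
LINK_MB_S = 100          # observed gigabit throughput
RAID_MB_S = 700          # what the array could deliver if the network kept up

total_mb = DATASET_MB * NODES
print(f"Over the single gigabit uplink: {total_mb / LINK_MB_S / 60:.0f} minutes")
print(f"If only the RAID array limited: {total_mb / RAID_MB_S / 60:.1f} minutes")

That works out to roughly 30 minutes per scene change over the single gigabit link, versus about 4 minutes if the array were the only limit.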

My question to you awesome network pros is whether there is a way to connect my PCs (maybe through daisy-chaining to eliminate the network switch) at a speed greater than what a normal gigabit network can deliver.

Is there perhaps an even higher speed an Ethernet port can run at? And if there is no such thing, what is the next best thing? Are there PCI cards that can transfer data faster over some other kind of cable?

My Cat6 links feel underpowered because the network speed NEVER drops below the maximum gigabit spec; they are pegged at full speed the whole time.

Reps for people who can think along. :)
post #2 of 7
A few more details on your current hardware may be helpful. Are you using 10Gb network adapters?

dunx
post #3 of 7
Thread Starter 
Quote:
Originally Posted by 2002dunx

A few more details on your current hardware may be helpful. Are you using 10Gb network adapters?
dunx

No sir, I am using the onboard gigabit ports. Each motherboard has 2 of them.

I am using the Gigabyte X58-UD5 for the slaves and server.

Is there such a thing as 10Gb network adapters?

Thanks in advance.
post #4 of 7
@OP

Your options are quite simple.

1) Channel bonding. This combines 2 or more Ethernet ports per machine so that the combined throughput is a percentage increase above that of a single port. While 2 GigE ports will not necessarily give you 200 MB/s, they should easily give you 150% of what you're getting now per node. You will need a) a switch that supports LACP and b) two decent Ethernet NICs per machine - I recommend a 2-port Intel server NIC per node.

2) 10Gb Ethernet. A single adapter will give ~930MB/sec. However, you will need a 10Gb Ethernet switch, and right now these are hideously expensive.

3) Infiniband. You will need Infiniband adapters, cables and an Infiniband switch. You should be able to get some bargains on eBay, but be warned: Infiniband is not as easy to set up as Ethernet. If you choose Infiniband, do your research before you spend any money.
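
If it helps with that research, below is a minimal point-to-point throughput check you could run between the server and one node before spending anything. iperf/iperf3 is the usual tool for this; this is only a quick standalone Python sketch, and the port number and transfer size are arbitrary assumptions:
Code:
# Minimal LAN throughput check between two machines (a rough stand-in for iperf).
# Run "python3 netcheck.py server" on one box, then
# "python3 netcheck.py client <server-ip>" on the other.
import socket
import sys
import time

PORT = 5001                 # arbitrary test port (assumption)
CHUNK = 4 * 1024 * 1024     # 4 MiB send buffer
TOTAL = 2 * 1024**3         # push 2 GiB per test run

def server():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("0.0.0.0", PORT))
        srv.listen(1)
        conn, addr = srv.accept()
        with conn:
            received = 0
            start = time.time()
            while True:
                data = conn.recv(CHUNK)
                if not data:
                    break
                received += len(data)
            elapsed = time.time() - start
            print(f"{received / 1e6:.0f} MB from {addr[0]} in {elapsed:.1f} s "
                  f"-> {received / elapsed / 1e6:.0f} MB/s")

def client(host):
    payload = b"\x00" * CHUNK
    sent = 0
    start = time.time()
    with socket.create_connection((host, PORT)) as sock:
        while sent < TOTAL:
            sock.sendall(payload)
            sent += len(payload)
    elapsed = time.time() - start
    print(f"sent {sent / 1e6:.0f} MB in {elapsed:.1f} s "
          f"-> {sent / elapsed / 1e6:.0f} MB/s")

if __name__ == "__main__":
    if sys.argv[1] == "server":
        server()
    else:
        client(sys.argv[2])

On a healthy gigabit link you should see numbers in the 110-118 MB/s range; anything much lower points at the disks or the protocol rather than the wire.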
post #5 of 7
Thread Starter 
Quote:
Originally Posted by parityboy

@OP
Your options are quite simple.
1) Channel bonding. This combines 2 or more Ethernet ports per machine so that the combined throughput is a percentage increase above that of a single port. While 2 GigE ports will not necessarily give you 200 MB/s, they should easily give you 150% of what you're getting now per node. You will need a) a switch that supports LACP and b) two decent Ethernet NICs per machine - I recommend a 2-port Intel server NIC per node.
2) 10Gb Ethernet. A single adapter will give ~930MB/sec. However, you will need a 10Gb Ethernet switch, and right now these are hideously expensive.
3) Infiniband. You will need Infiniband adapters, cables and an Infiniband switch. You should be able to get some bargains on eBay, but be warned: Infiniband is not as easy to set up as Ethernet. If you choose Infiniband, do your research before you spend any money.

Thanks a LOT for the reply! This certainly helps.

While debating with myself over what to do, another option came to mind:

Outfit the server with a load of PCI 4-port gigabit network cards. The MikroTik ones run for $80 apiece, and the UD5 can handle 4 of them, which gives me a total of 16 gigabit ports on 1 server.
[image: rb44gv.jpg - the MikroTik quad-port card]

I could then connect each node with 2 cables and have ~150 MB/s to each.

This solution would only cost me $320, with no additional costs and no need for a switch. In essence, the server becomes the switch.

This eliminates the bandwidth clog-up when all machines get their jobs at once, and I utilize my RAID 0 to its max.

Could this be done or am I totally wrong here?

Thanks again for the help!
post #6 of 7
Remember that the standard PCI bus (which the specific card you posted an image of uses) only has about 133 MB/s of shared bandwidth.

I'd look for a gigabit switch that has a 10-gigabit uplink port on it.
Then at least your clients on gigabit links could concurrently pull more than 1 Gbit/s in aggregate from the server.

I'd be kind of surprised if you could find one in the few hundred dollar range, though.

Alternatively, as stated, you could get some PCIe dual/quad-port NICs and team those, but you would also need a managed gigabit switch.
Quote:
20GB times 9 pc's can become quite taxing for the network.

You stated you already have enough backplane; it is simply too taxing for your single gigabit uplink's worth of bandwidth. :P
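
To put rough numbers on the options discussed so far, here is a hedged sketch using the figures quoted in the thread (the throughput values are approximations, and the PCI note assumes the quad-port cards sit on a standard shared PCI bus):
Code:
# Hedged comparison of the options discussed, using the thread's own figures.
# The server can never feed data faster than its 700 MB/s RAID 0 array,
# so every option is capped by that number.
DATASET_MB = 20 * 1000   # 20 GB per node
NODES = 9
RAID_MB_S = 700

options = {
    "single gigabit uplink (today)":   100,  # ~100 MB/s observed
    "2x gigabit bonded on the server": 150,  # ~150% estimate from post #4
    "10GbE uplink from the server":    930,  # ~930 MB/s per adapter, per post #4
}
# Note: the quad-port legacy-PCI cards proposed above would be limited to
# roughly 133 MB/s per PCI bus segment, regardless of how many ports they have.
for name, uplink_mb_s in options.items():
    effective = min(uplink_mb_s, RAID_MB_S)      # capped by the array
    minutes = DATASET_MB * NODES / effective / 60
    print(f"{name:34s} ~{minutes:5.1f} min to refresh all 9 nodes")

Roughly: ~30 minutes today, ~20 minutes with a bonded server uplink, and ~4 minutes with a 10GbE uplink, at which point the RAID array becomes the limit.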
post #7 of 7
@OP

As beers stated, that card is limited by the PCI bus bandwidth. A PCI Express version would suit you better, but obviously will cost you more money.