
Wouldn't it be nice to split workloads?

post #1 of 33
Thread Starter 
I wish there was a way to split the workload of a bigadv unit over, let's say, 4 PCs.

If you roughly take my current TPF, which is 31 min 42 s, and have that WU's load split over 4 x i7 rigs over the network (or however), then you could get roughly 215K PPD. Is no one in the process of hacking this into action?

Just a thought, but it would be crazy.
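For anyone wondering where a number like that comes from: with the quick-return bonus, points scale with the square root of how far ahead of the deadline you finish, so PPD grows faster than linearly as TPF drops. Here's a rough sketch of that math, assuming the usual bonus formula (final = base * sqrt(k * deadline / elapsed), as I understand it); the base points, k factor and deadline below are illustrative placeholders, not actual project values.

```python
# Rough sketch of why quartering the TPF would more than quadruple PPD.
# Assumes the quick-return bonus formula: final = base * sqrt(k * deadline / elapsed).
# base_points, k and deadline_days are illustrative placeholders, not real project values.
import math

def ppd(tpf_minutes, base_points=8955.0, k=26.4, deadline_days=6.0, frames=100):
    """Estimate points per day for a bonus-eligible WU from its TPF."""
    wu_time_days = tpf_minutes * frames / (60.0 * 24.0)   # time to finish one WU, in days
    bonus = math.sqrt(k * deadline_days / wu_time_days)   # sqrt bonus multiplier
    points_per_wu = base_points * bonus
    return points_per_wu / wu_time_days                   # WUs per day * points per WU

single = ppd(31.7)        # one rig at TPF ~31m42s
split  = ppd(31.7 / 4.0)  # hypothetical 4-way split -> TPF ~7m55s
print(f"single rig: {single:,.0f} PPD, 4-way split: {split:,.0f} PPD "
      f"(vs {4 * single:,.0f} PPD for 4 rigs folding separately)")
```

With the sqrt bonus, PPD scales roughly with TPF^-1.5, so quartering the TPF multiplies PPD by about 8, which is why one split WU would beat four rigs folding their own units.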
post #2 of 33
I doubt you can, but if you have 4 i7s available to you, that's 100K PPD, which is great!
post #3 of 33
If it were possible, that'd be awesome. The only ways I know of to do that are either building a server with 4+ CPUs or cluster computing, but I'm not sure how the latter works.
post #4 of 33
Thread Starter 
I want the source code for the Linux SMP client.
post #5 of 33
That would be awesome.
post #6 of 33
Couldn't you run them in a Beowulf cluster?
 
post #7 of 33
Thread Starter 
Ooooh, that might be possible...
post #8 of 33
Quote:
Originally Posted by DullBoi
I wish there was a way to split the workload of a bigadv unit process over, let's say, 4 PCs.
Fixed... and there you have it: Folding@home.

It is possible to split workloads among computers, but you'd most likely need to rewrite the core. And since the sources of the Folding@home cores are not available (for obvious reasons), this is probably not something we'll be seeing soon.
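Right: for a single WU to span machines, the core would have to be written for distributed memory, i.e. the simulation split into pieces that are kept in sync with message passing (MPI) over the network, which is what real clusters do. Here's a toy sketch of that pattern using mpi4py; it's only an illustration of the idea, not anything the actual fahcore does.

```python
# Toy sketch of distributed-memory work splitting with MPI (the pattern a
# cross-machine core would need), not anything the real fahcore does.
# Run with e.g.:  mpirun -np 4 python split_demo.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

N = 1_000_000                      # total amount of "work" (e.g. particles)
chunk = N // size                  # each node/rank owns a slice of the problem
local = np.arange(rank * chunk, (rank + 1) * chunk, dtype=np.float64)

local_sum = np.sum(np.sqrt(local))                    # do this rank's share
total = comm.reduce(local_sum, op=MPI.SUM, root=0)    # combine partial results

if rank == 0:
    print(f"{size} ranks each did {chunk} elements; combined result = {total:.3e}")
```

The catch is that an MD core has to exchange boundary data every timestep, so over ordinary gigabit Ethernet the communication would likely eat most of the gain anyway.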
post #9 of 33
Thread Starter 
OK, I'm going to test LinuxPMI when I get home.
post #10 of 33
Thread Starter 
Since the WU is a process in the Linux kernel I'm using on the VM, if I have two VMs, both with the LinuxPMI patches... darn... I've got guests tonight.
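Before sinking an evening into it, one caveat: as far as I remember, LinuxPMI (like openMosix before it) can only migrate fairly plain processes, and anything using threads or shared memory stays put. The SMP core is exactly that kind of tightly coupled thing (depending on the core it's either one multi-threaded process or several local MPI processes). Here's a quick way to see how the core is laid out on a box by reading /proc; the "FahCore" name fragment is an assumption, adjust it for your client.

```python
# Quick check of how the fahcore is structured on a box, by reading /proc.
# "FahCore" as the process-name fragment is an assumption; adjust for your client.
import os

def fahcore_layout(name_fragment="FahCore"):
    """List PIDs whose comm contains name_fragment, with their thread counts."""
    for pid in filter(str.isdigit, os.listdir("/proc")):
        try:
            with open(f"/proc/{pid}/comm") as f:
                comm = f.read().strip()
            if name_fragment.lower() not in comm.lower():
                continue
            with open(f"/proc/{pid}/status") as f:
                threads = next(line for line in f if line.startswith("Threads:")).split()[1]
            print(f"pid {pid}: {comm} running {threads} thread(s)")
        except (FileNotFoundError, StopIteration):
            continue  # process exited mid-scan, or status had no Threads line

fahcore_layout()
```

If that reports more than one thread (or several FahCore processes), transparent migration probably won't split the work the way you're hoping.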