
Where to buy blade chassis? - Page 2

post #11 of 27
Quote:
Originally Posted by kweechy View Post
For whatever reason, I was under the impression that 'blades' were simply high-density computing installations whose goal is high-performance applications and processing speed (as opposed to a rack of servers dedicated to hosting VMware) for things like data processing. I didn't think it mattered what the case designs were like or who made them... as long as you had as much computing power as possible per square inch.

Our render farm machines at the studio aren't OC'd, of course, because that's pretty much impossible in a high-density setting like that; for the scale of what I'm doing, though, I think I could pull it off. I doubt I'd go higher than 10 machines for personal work.

The chassis we have here in our farm aren't really 1U, which is why I'm asking about these things. You know how blade enclosures might total 24U, but internally they're subdivided into non-1U divisions? It's sort of like that. Each of these trays isn't much wider than the motherboard itself.

For clarification, though, and to avoid looking foolish in the future... are blades not simply a set of high-performance, high-density computers? I don't understand what happens between a tiny chassis holding high-power CPUs and the offerings from Dell/HP that makes the latter into blades and the former into... well, whatever you'd call it.
Blades are a specific design where the entire system is a plug-in "blade"; there is an entire supporting infrastructure (enclosure, power, networking) required for them.

Everything else is basically just rackmount or standalone.

With overclocking servers... have you considered the cost-benefit? i.e., 20% faster render times, but a greater risk of failure that might wipe out days' worth of processing?
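To put rough numbers on it (a back-of-the-envelope sketch; the speedup, failure rate, and full-restart penalty are all assumptions, not measurements):

Code:
# Back-of-the-envelope expected render time, overclocked vs. stock.
# Assumes a failure forces a full restart of the job; all numbers
# here are illustrative.

def expected_hours(job_hours, speedup, p_fail):
    # With per-job failure probability p and full restarts, the
    # expected number of attempts is 1 / (1 - p), so:
    # E[time] = (job_hours / speedup) / (1 - p_fail)
    return job_hours / speedup / (1.0 - p_fail)

print(expected_hours(100, 1.0, 0.00))  # stock:       100.0 h
print(expected_hours(100, 1.2, 0.15))  # overclocked: ~98.0 h

Even a 15% chance of losing a run nearly erases the 20% clock gain.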
Edited by DuckieHo - 6/7/11 at 8:08am
post #12 of 27
A blade enclosure will typically occupy 5 to 8U of rack space but allow for the insertion of 10-20 'blades', which, as Duckie points out, are highly proprietary pieces of hardware.

If you want to pack a decent, overclocked farm into a rack, I recommend 2-3U server enclosures inserted into a portable (or otherwise manageable) short rack.

Bonus points for dedicating some of the rack space to a combined watercooling loop for the systems.
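
For a rough sense of the density trade-off (illustrative numbers only, not any specific product):

Code:
# Rough nodes-per-U math: blade enclosure vs. DIY 2U boxes.
# Enclosure size and blade count are illustrative assumptions.

enclosure_u, blades = 8, 16   # e.g. 16 half-height blades in an 8U enclosure
diy_u_per_node = 2            # one overclockable node per 2U chassis

print(blades / enclosure_u)   # 2.0 nodes per U (blades)
print(1 / diy_u_per_node)     # 0.5 nodes per U (2U servers)

The blades win on density by roughly 4x, but you pay for it in proprietary parts and give up overclocking.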
post #13 of 27
Quote:
Originally Posted by kweechy View Post
For whatever reason, I was under the impression that 'blades' were simply high-density computing installations whose goal is high-performance applications and processing speed (as opposed to a rack of servers dedicated to hosting VMware) for things like data processing. I didn't think it mattered what the case designs were like or who made them... as long as you had as much computing power as possible per square inch.
Well, you are about 50/50 on that. Blade enclosures are all about cramming the most power efficiently into a small space for enterprise solutions. Some of the new half-height single-slot blades now hold up to 4 X5600-series Xeons; these are actually two blades in one, like the HP BL2x220c. In short, though, blades make for a good small-footprint way to run a small business.

To me there is a lot of wasted space in a blade enclosure. You can have anywhere from 8-12 huge, long power supplies and 10 turbine fans that are noisy as hell, and then the whole back of the enclosure is wasted for a user like us. The back of an HP c7000 enclosure houses your iLO/KVM card (for remote management, which requires a license), switches, Fibre Channel and SAS cards, and your OA card (which controls the enclosure and power to all the peripherals).

In short, IF you can find a STEAL of a deal on, say, an HP c3000 or c7000 enclosure, then jump on it and invest in some older G5/G6 blades. However, in my honest opinion you would be better off building cheap, overclockable 2U servers.

Deeeebs out...

EDIT: Just to give you an idea, this is what I'm working with... Also, the big c7000 enclosure is large enough for a Labrador to sleep inside if it were a dog house...


Edited by Deeeebs - 6/8/11 at 6:19am
post #14 of 27
Not to mention the fact that the c-Class enclosures weigh a ton empty!

Yeah, blades are more about maximizing space savings while providing better manageability than 10 or more separate systems. Obviously they are great for a shared datacenter, or a small datacenter where rack space is at a premium; for a home user, though, that sort of thing isn't a big deal.

You could go the route some home-rendering folks did and just assemble a farm out of file cabinets and cheap ATX systems. You could easily build 2 or more quad-core machines for the price of a single blade or 1U server. The quality and reliability aren't the same, but it is a viable option.
post #15 of 27
Quote:
Originally Posted by trueg50 View Post
You could go the route some home-rendering folks did and just assemble a farm out of file cabinets and cheap ATX systems. You could easily build 2 or more quad-core machines for the price of a single blade or 1U server. The quality and reliability aren't the same, but it is a viable option.
Helmer (the render farm built into an IKEA Helmer drawer cabinet)
post #16 of 27
Quote:
Originally Posted by the_beast View Post
Yup, that would be the one I was thinking of. There were a few others that were a bit more professionally done, but it is quite possible to get a powerhouse cluster for a good price; you just need to do your research, have some ingenuity, and some patience.
post #17 of 27
You could also look into something like This.

Just released; I don't know pricing info, but it looks to be a great way to cram some compute power into a small space.

I've also had great luck with the SuperMicro blade centers, and they're pretty reasonable. I saw one on eBay fully loaded (4 PSUs, 8 blades, dual procs per blade, 16GB RAM per blade, two Ethernet switches, and a KVM) go for just over $12,000.
post #18 of 27
Quote:
Originally Posted by DuckieHo View Post
i.e. 20% faster time... but greater risk of failure that might wipe out days worth of processing?
Pretty sure bucket rendering doesn't work like that, as each frame is chopped up into bits. The final gather/soft lighting (or whatever you want to call it) is calculated for the whole frame, but the models/refraction/shadowing are calculated in bits and pieces.

At least that's my understanding of it. I've thought about dabbling with distributed/bucket rendering, but I couldn't justify it because I hardly render anymore.
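
Roughly, the decomposition works like this (a minimal sketch; the 64px bucket size is just an example):

Code:
# Minimal sketch of bucket (tile) decomposition: the frame is split
# into small rectangles that render independently, so a crashed node
# only loses the buckets it was holding, not days of work.

def buckets(frame_w, frame_h, tile=64):
    """Yield (x, y, w, h) tiles covering the frame."""
    for y in range(0, frame_h, tile):
        for x in range(0, frame_w, tile):
            yield (x, y, min(tile, frame_w - x), min(tile, frame_h - y))

# A 1920x1080 frame at 64px buckets -> 30 * 17 = 510 independent units
print(sum(1 for _ in buckets(1920, 1080)))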
post #19 of 27
Quote:
Originally Posted by mbreitba View Post
You could also look into something like This

Just released, don't know pricing info, but looks to be a great way to cram some compute power into a small space.
Interesting mix of blade-server and standalone system configurations: independent systems and networking, but they share the PSUs.

The microblades, however, are only single-socket, and they only support low-end Xeons. But the power costs would be lower (one or two PSUs vs. eight), and you won't have to worry about managing eight separate systems. Cost would be the real problem, though SuperMicro systems are usually pretty reasonable.

What is your budget?
post #20 of 27
Thread Starter 
Quote:
Originally Posted by trueg50 View Post
Not to mention the fact that the c-Class enclosures weigh a ton empty!

Yeah, blades are more about maximizing space savings while providing better manageability than 10 or more separate systems. Obviously they are great for a shared datacenter, or a small datacenter where rack space is at a premium; for a home user, though, that sort of thing isn't a big deal.

You could go the route some home-rendering folks did and just assemble a farm out of file cabinets and cheap ATX systems. You could easily build 2 or more quad-core machines for the price of a single blade or 1U server. The quality and reliability aren't the same, but it is a viable option.
This is basically what I'm doing right now, and it's working out well other than space. I'm wondering now if my logic is sound, though:

Right now my thinking is that if I'm spending $320 on a 2600K, $60 on an HDD, $60 on a PSU, $300 on RAM, $100 on Win7, and at least $50 on a case and $150 on a mobo... I may as well bite the bullet, buy an $80 H70 loop, upgrade the mobo and case slightly, and have a computer that runs 50% faster than stock at 5GHz.
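
Running the numbers (a rough sketch using the prices above; the 1.5x figure assumes the 50% overclock pans out):

Code:
# Per-node cost math using the prices quoted above (2011 prices).

base = 320 + 60 + 60 + 300 + 100 + 50 + 150   # CPU, HDD, PSU, RAM,
print(base)                                   # Win7, case, mobo -> $1040

oc = base + 80                                # add the H70 loop -> $1120
speedup = 1.5                                 # ~5GHz, 50% faster than stock

# Dollars per "stock node's worth" of render throughput:
print(round(oc / speedup))                    # ~$747 vs. $1040 -- the OC
                                              # node is much cheaper per
                                              # unit of work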

Unfortunately, right now that also means using mid-tower cases to house this stuff.

I think for my purposes it's totally acceptable to have 5-10 ATX cases lying around the house; I just liked the elegant solution you can get with these small server chassis.

So my understanding so far from this thread (which has been really informative, and is why I love this forum) is that a "blade" is a high-density computing solution that goes hand in hand with the blade enclosure: the blade actually plugs into it, rather than being ghetto-wired up from the back like you would with 1U machines.

Also, one of the beautiful things about my workflow is that I don't split up my jobs; all machines crunch on the same frame at the same time. This may cost me 5-10% in rendering speed, but the benefits are huge... As long as my main computer is solid and running the job, slaves can fail, crash, lose network connections, and whatever other nightmare things you can think of; I simply reboot that machine and it syncs back up with the rest and begins assisting in rendering again.

For small renders this isn't so important, but I have many jobs that are several hours per frame even WITH four Sandy Bridge machines at 5GHz all crunching together. If I were to lose one of those frames to a crash, it could blow deadlines.
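
The recovery behaviour is basically just a work queue that hands buckets back out when a slave drops. A simplified sketch (function names are hypothetical, and a real master services many slaves concurrently):

Code:
# Simplified sketch of the fail-and-rejoin behaviour described above:
# the master owns the queue of buckets; if a slave dies mid-bucket,
# that bucket goes back on the queue for the next available node.

from queue import Queue

def master_loop(pending, render_on_slave):
    while not pending.empty():
        bucket = pending.get()
        try:
            render_on_slave(bucket)   # blocks until the slave returns
        except ConnectionError:
            pending.put(bucket)       # slave crashed or dropped off the
                                      # network: requeue, nothing is lost

# Demo with a stand-in slave that fails once, then recovers:
q = Queue()
q.put((0, 0, 64, 64))
attempts = []
def slave(bucket):
    attempts.append(bucket)
    if len(attempts) == 1:
        raise ConnectionError("slave rebooted")
master_loop(q, slave)
print(len(attempts))   # 2 -- the bucket was requeued and re-rendered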