Overclock.net › Forums › Specialty Builds › Servers › Big WHS Build - Need some pro advice

Big WHS Build - Need some pro advice

post #1 of 15
Thread Starter 
OK, so with WHS 2011 coming I have two problems:

1) My current setup (an E8500, 4GB RAM, 4x 2TB drives) has passed the point of being able to allow duplication (not enough space left).
2) With WHS 2011 I will need RAID, as I do not want multiple shares, etc.

First issue: the case. I don't want a rack-mount server. I'm interested in the Mountain Mods U2-UFO [url]http://www.mountainm...d-top-p-79.html[/url], since I can fit 18 hard drives internally with additional brackets [url]http://www.mountainm...lack-p-325.html[/url] and I can use it as my printer stand. The price also doesn't seem bad when you look at rack-mount solutions that can handle that many drives, plus the added cost of 2U PSUs.

My hard drive plan is as follows:

System drive: 2 drives in RAID 1, something reliable and cheap; I figured I would run this off the motherboard's built-in controller. I have 4x 7200rpm Seagate Barracuda 7200.10 320GB drives I used for about a month. If I use 2, I have 2 spares for the future.

I currently have 4x Hitachi Deskstar 5K3000 2TB drives; they are cheap, fast, and I have had no issues with them.
[url]http://www.newegg.co...N82E16822145475[/url]

So I figured I would add 12 more, for a total of 16 2TB drives.

Next, a RAID card to support it...

It would seem these are my choices
[url]http://www.newegg.co...ING&PageSize=20[/url]

I don't know much about the current state of RAID, but I used tons of RAID in the past, so I'm confident I can figure it out. Looking for some suggestions on the RAID card.

RAID config questions: any issues with 18 total drives?
I was thinking about doing RAID 10. I'm very much into complete redundancy and never having to worry, so with RAID 10 I would get 16TB of usable space. Anyone have another solution that would provide a similar level of redundancy with more usable space?

I know it's a lot, but I do not want to lose my data. I have 2 cable modems load-balanced with a 4G modem for failover, to give you an idea of how much I hate something not working. (Mental issues, maybe...)

For the rest of the system I was thinking something like a 2500K and a basic Asus motherboard.
PSU... I've never even worried about the power usage of my hard drives before. What would I need for a Sandy Bridge system with the RAID card and 18 drives?
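For a rough ballpark, here is my back-of-envelope math so far (all the wattage figures below are generic datasheet-style estimates, not measured numbers, so sanity-check them against the actual drive specs):

```python
# Rough PSU sizing sketch for a Sandy Bridge box with 18 drives.
# Wattage constants are typical estimates, not measurements.

DRIVE_IDLE_W = 6        # ~5-8 W typical for a 3.5" 7200rpm drive at idle
DRIVE_SPINUP_W = 25     # ~2 A on the 12 V rail during spin-up
CPU_W = 95              # i5-2500K TDP
BOARD_RAM_FANS_W = 50   # motherboard, RAM, fans, misc
RAID_CARD_W = 20        # typical hardware RAID card

def steady_state_watts(n_drives):
    # Normal running load with all drives spun up and idle/active
    return CPU_W + BOARD_RAM_FANS_W + RAID_CARD_W + n_drives * DRIVE_IDLE_W

def worst_case_spinup_watts(n_drives):
    # Worst case: every drive spins up at once (no staggered spin-up)
    return CPU_W + BOARD_RAM_FANS_W + RAID_CARD_W + n_drives * DRIVE_SPINUP_W

print(steady_state_watts(18))      # 273 W steady state
print(worst_case_spinup_watts(18)) # 615 W if all 18 spin up together
```

So the spin-up surge, not the steady-state load, is what sizes the PSU; a RAID card with staggered spin-up brings that peak way down.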
Gaming Rig (13 items)
CPU: i7 980X @ 4.5GHz | Motherboard: Asus Rampage III | Graphics: EVGA GTX 580 Tri-SLI | RAM: Super Talent 2133 CL8
Hard Drive: 3x Intel X25-M G2 RAID 0 | Optical Drive: Blu-ray | OS: Win 7 Ult x64 | Monitor: Dell 3007
Keyboard: Logitech G19 | Power: Silverstone 1500 | Case: Danger Den Double Wide | Mouse: Logitech G9x
post #2 of 15
Mountain Mods can only hold 18 drives?

Anywho..

The Marvell 9128 ports on the Asus boards will handle port multiplying, but that is only 2 ports; if performance is not a big deal, that is an easy way to get at least 8 drives hooked up.

The 6 other ports are on the Intel controller and do not support port multiplication except in hardware RAID configs.

For example, I reviewed the Convoy 425XL recently. It accepts 4 2.5" drives in a single 5.25" bay. When plugged into the Marvell ports on the P8P67 Deluxe, I had the option of using the built-in HW RAID controller it is designed with, just like when plugged into the Intel ports. However, the Marvell controller also saw the drives plugged into the Convoy individually, so I could use Marvell RAID or just have separate drives assigned.

The Convoy is based on the JM393 controller, and I never tested it with 4 drives, let alone 8. I only have SSDs in that format, and 2 was a waste on it.

EDIT: I will be following this thread though, as storage performance and adaptability are important to me. Which is why I would not use the last-gen WHS, but maybe the new gen. Getting rid of the drive extender stuff makes the OS more interesting to me.
Edited by Neur0mancer - 3/5/11 at 5:56pm
24/7 (13 items)
CPU: i7 920 D0 | Motherboard: Rampage III Formula | Graphics: Asus EAH4890 | RAM: OCZ Reapers
Hard Drive: Corsair SSD | Optical Drive: SATA cheapie | OS: 7 x64 | Monitor: 28" 22"
Keyboard: see mouse | Power: Corsair 750W | Case: Rosewill midtower | Mouse: MS Entertainment Desktop 8000
Mouse Pad: Taped gel pad thing
post #3 of 15
Thread Starter 
Quote:
Originally Posted by Neur0mancer;12625824 
Mountain Mods can only hold 18 drives?

Anywho..

The Marvell 9128 ports on the Asus boards will handle port multiplying, but that is only 2 ports; if performance is not a big deal, that is an easy way to get at least 8 drives hooked up.

The 6 other ports are on the Intel controller and do not support port multiplication except in hardware RAID configs.

For example, I reviewed the Convoy 425XL recently. It accepts 4 2.5" drives in a single 5.25" bay. When plugged into the Marvell ports on the P8P67 Deluxe, I had the option of using the built-in HW RAID controller it is designed with, just like when plugged into the Intel ports. However, the Marvell controller also saw the drives plugged into the Convoy individually, so I could use Marvell RAID or just have separate drives assigned.

The Convoy is based on the JM393 controller, and I never tested it with 4 drives, let alone 8. I only have SSDs in that format, and 2 was a waste on it.

EDIT: I will be following this thread though, as storage performance and adaptability are important to me. Which is why I would not use the last-gen WHS, but maybe the new gen. Getting rid of the drive extender stuff makes the OS more interesting to me.

It has 18 spots with standard drive bays. I'm sure you can add more in other ways (mods, etc.).

For performance, etc., I would rather use a high-end RAID controller like the one I linked. For the OS I will use the onboard RAID; would you suggest the Intel or the Marvell? Just doing RAID 1.
post #4 of 15
Have you heard of Drive Bender? It is an application to replace Drive Extender that will work with any Windows OS. I am looking into it to see if you can set up the app on one machine and then transfer the drives and the data to another machine, not requiring RAID. If that will work, I would consider getting one or 2 of this SATA card. It has 8 ports and requires a 4x slot. It doesn't come with cables, but they are another $20 apiece from Newegg.
BENDER (13 items)
CPU: i7 920 D0 | Motherboard: EVGA SLI 758 A1 | Graphics: Sapphire 4890 Vapor | RAM: 6GB G.Skill DDR3 1600
Hard Drive: 2x WD 250GB RAID 0 | Optical Drive: Plextor DVD-RW | OS: Windows 7 Professional | Monitor: HP 2207
Keyboard: Logitech G15 | Power: Corsair HX750 | Case: HAF-X | Mouse: Logitech G500
post #5 of 15
Thread Starter 
Quote:
Originally Posted by ounderfla69;12628100 
Have you heard of Drive Bender? It is an application to replace Drive Extender that will work with any Windows OS. I am looking into it to see if you can set up the app on one machine and then transfer the drives and the data to another machine, not requiring RAID. If that will work, I would consider getting one or 2 of this SATA card. It has 8 ports and requires a 4x slot. It doesn't come with cables, but they are another $20 apiece from Newegg.

I have. I'm going to stick with normal RAID this time. If WHS blows up or I decide to switch to another solution, my data is all safe and portable. More money, but I should get better performance and my data should be safer.
post #6 of 15
The problem with large RAID arrays is that your data *won't* be safer.

If you have 16 drives in a RAID10 and 1 of them fails, you now have a single point of failure for the whole lot; if that drive's mirror partner also fails before the array rebuilds, your whole array becomes unrecoverable. A much better bet for large arrays (especially media storage, as I assume this is) is something like unRAID or FlexRAID, where you have separate drives protected by one (or more) parity drives. Under these systems, if a single drive fails it can be recovered, but if multiple drives fail (and the parity protection fails as a result), you only lose the data on the failed drives. This means you only lose a part of your collection - with full RAID you would have lost the lot.

You also get important advantages like being able to use different drives in your array - so you can still use your current 2TB drives, but add in 3 and 4TB drives (and beyond) as they become better value in the future. With RAID you are really better off buying all your hardware at once so it matches and you don't have to mess with time-consuming and potentially problematic array expansions - but that means a lot of up-front hardware cost.

If you do decide to stick with hardware RAID, then for multiple large drives RAID6 is better than RAID10, as you aren't left with a single point of failure when a drive dies. But I really wouldn't recommend it, especially with 2TB drives in such a large array.

Can't see your hardware costs (and no real time to search) as your links don't work. But I suspect you will spend more money on that case than you would on a more appropriate rackmount solution with the bays built in. The Norco 20-bay cases are cheap, or even the Supermicro cases with PSUs and expanders built in become an option. For this many drives, really the only connection method worth considering is a SAS-expander-based approach - couple your choice of 4-port RAID card to either the Supermicro backplane or the HP expander and you get up to 32 drives for much less money than a card capable of natively supporting 18 drives would cost. Again, be warned that few RAID cards are certified for use with Advanced Format (4K) drives - and this makes a huge difference to your data security (and the compatibility jumper doesn't help either).

Regarding hardware - there is no reason to upgrade your current 4GB E8xxx system. You really won't get any benefit. Put your money into your drive subsystem instead and stick with what you have already.
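For reference, here is the usable-capacity math behind the RAID10 vs RAID6 comparison (a simple sketch; hot spares are not counted, and real arrays lose a little more to formatting overhead):

```python
def usable_tb(n_drives, drive_tb, level):
    # Usable capacity for a few common RAID levels.
    if level == "raid10":
        return (n_drives // 2) * drive_tb   # half the drives are mirrors
    if level == "raid6":
        return (n_drives - 2) * drive_tb    # two drives' worth of parity
    if level == "raid5":
        return (n_drives - 1) * drive_tb    # one drive's worth of parity
    raise ValueError("unknown RAID level: " + level)

print(usable_tb(16, 2, "raid10"))  # 16 TB - survives 1 guaranteed failure
print(usable_tb(16, 2, "raid6"))   # 28 TB - survives any 2 failures
```

So RAID6 gives you 12TB more usable space than RAID10 on the same 16 drives, and it tolerates any two simultaneous failures rather than just the "lucky" ones.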
post #7 of 15
Yeesh. 18 drives. And I thought I was crazy...
ESXi Host 1 (15 items)
CPU: 2x Intel Xeon E5520 | Motherboard: Dell onboard | Graphics: Matrox G200 | RAM: 24GB DDR3, 12x2GB UDIMMs (18 slots total)
Hard Drives: PERC6 RAID50, Intel 730 480GB, Intel 320 300GB, Synology DS414 iSCSI SAN
OS: VMware vSphere 5 Enterprise Plus | Monitor: Dell iDRAC6 Remote Management [KVM-over-IP] | Keyboard: Dell iDRAC6 KVM | Power: Dell hot-swap redundant 1100W
Case: Dell PowerEdge T710 stock | Mouse: Dell iDRAC6 KVM
post #8 of 15
Thread Starter 
Quote:
Originally Posted by the_beast View Post
The problem with large RAID arrays is that your data *won't* be safer.

If you have 16 drives in a RAID10 and 1 of them fails, you now have a single point of failure for the whole lot; if that drive's mirror partner also fails before the array rebuilds, your whole array becomes unrecoverable. A much better bet for large arrays (especially media storage, as I assume this is) is something like unRAID or FlexRAID, where you have separate drives protected by one (or more) parity drives. Under these systems, if a single drive fails it can be recovered, but if multiple drives fail (and the parity protection fails as a result), you only lose the data on the failed drives. This means you only lose a part of your collection - with full RAID you would have lost the lot. You also get important advantages like being able to use different drives in your array - so you can still use your current 2TB drives, but add in 3 and 4TB drives (and beyond) as they become better value in the future. With RAID you are really better off buying all your hardware at once so it matches and you don't have to mess with time-consuming and potentially problematic array expansions - but that means a lot of up-front hardware cost.

If you do decide to stick with hardware RAID, then for multiple large drives RAID6 is better than RAID10, as you aren't left with a single point of failure when a drive dies. But I really wouldn't recommend it, especially with 2TB drives in such a large array.

Can't see your hardware costs (and no real time to search) as your links don't work. But I suspect you will spend more money on that case than you would on a more appropriate rackmount solution with the bays built in. The Norco 20-bay cases are cheap, or even the Supermicro cases with PSUs and expanders built in become an option. For this many drives, really the only connection method worth considering is a SAS-expander-based approach - couple your choice of 4-port RAID card to either the Supermicro backplane or the HP expander and you get up to 32 drives for much less money than a card capable of natively supporting 18 drives would cost. Again, be warned that few RAID cards are certified for use with Advanced Format (4K) drives - and this makes a huge difference to your data security (and the compatibility jumper doesn't help either).

Regarding hardware - there is no reason to upgrade your current 4GB E8xxx system. You really won't get any benefit. Put your money into your drive subsystem instead and stick with what you have already.
The case will be about $400
http://www.mountainmods.com/u2ufo-mi...-top-p-79.html

I don't have a basement; I live in an apartment in NYC. One of the added features of the Mountain Mods case is that it can function as my printer stand.

I don't understand the HP expander thing - how does that impact performance? I do plan on going RAID 6 based on advice from another thread.

Plan is OS on 2 drives in RAID 1+1 (I have 4 Seagate 320GB drives I used for about 30 days before I upgraded to SSDs).

Data would be RAID 6 with 16 drives and 1 hot spare (maybe even 2 if possible).

Data Drives
http://www.newegg.com/Product/Produc...82E16822145475

Raid Card
http://www.newegg.com/Product/Produc...82E16816151035
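Quick math on what that plan yields with 2TB drives (it depends on whether the hot spare counts as one of the 16, so both readings are shown):

```python
DRIVE_TB = 2  # Hitachi 5K3000 2TB data drives

def raid6_usable_tb(array_drives, drive_tb=DRIVE_TB):
    # RAID 6 loses two drives' worth of capacity to parity;
    # hot spares sit outside the array and contribute nothing.
    return (array_drives - 2) * drive_tb

print(raid6_usable_tb(16))  # 28 TB if the spare is a 17th drive
print(raid6_usable_tb(15))  # 26 TB if the spare comes out of the 16
```

Either way that comfortably beats the 16TB RAID 10 would give from the same drives.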

The reasons for the new build are:
1) The current WHS is running on a Dell XPS 420, and I've found in most cases that trying to reuse Dell stuff causes me headaches.
2) I don't want to risk my data during the upgrade, so this is more of a migration. I may run "Vail" on the new server, copy the data over, and with hardware RAID just reinstall the OS when WHS 2011 is final.
3) I find that with the E8500, if I'm streaming a Blu-ray while a backup is running and some downloads are writing, I get some stuttering in playback.
4) I also plan on adding additional tasks to the server.
5) I need this server to handle media as close to flawlessly as it can. I want to dump cable service, and every time it has a problem my girlfriend balks at turning cable off. I pay something like $100-$120 a month for cable; the ROI on the system could be under 2 years (maybe even 1 if you count what she buys in PPV).
6) I tend to go overboard on the hardware, but I would rather have more than I need than less.
post #9 of 15
It sounds like you already know what you want and what you are doing.

I love RAID for the speed, high availability, and definitely the portability.
I switched from RAID to WHS over a year ago because I had some power supply issues that caused me to lose 5TB of data. I may or may not go back to RAID.

- For just an HBA, the Supermicro SASLP is an awesome thing. It's what I use in my WHS v1.

- The HP expander will take a single SAS port and basically expand it.
You could buy a 4-port RAID card, use the HP expander, and run a 24-drive array.
If you have 24 drives expanded off a single SAS port on a RAID card, yes, the performance will be cut. Will you notice it or care? Probably not. Each SAS lane has 3Gbps of bandwidth, I believe, and each SAS port has 4 lanes. There should be no problem with speed.
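The rough math on that (assuming 3Gbps SAS-1 lanes with 8b/10b line encoding, a 4-lane wide port, and the worst case of all 24 drives streaming at once):

```python
# Back-of-envelope bandwidth check for 24 drives behind one wide SAS port.
GBPS_PER_LANE = 3.0      # SAS-1 line rate per lane
LANES_PER_PORT = 4       # a "wide" SAS port is 4 lanes
SAS_ENCODING = 8 / 10    # 8b/10b encoding: 8 data bits per 10 line bits

port_gbps = GBPS_PER_LANE * LANES_PER_PORT * SAS_ENCODING  # usable Gbps
port_mbps = port_gbps * 1000 / 8                           # in MB/s

drives = 24
per_drive_mbs = port_mbps / drives

print(round(port_mbps))     # 1200 MB/s total through the port
print(round(per_drive_mbs)) # 50 MB/s per drive if all 24 stream at once
```

50 MB/s per drive is still several Blu-ray streams each, and in practice all 24 drives are almost never saturated at the same time, which is why the cut rarely matters for a media server.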

- I actually have a RAID card for sale. If you aren't looking to spend $900, check it out.
http://www.overclock.net/other-compo...raid-card.html
Used for about 1 month. Retail box everything included.
 
File Server (8 items)
CPU: i7 6700HQ | Graphics: Intel HD530, AMD R450, nVidia GTX1070 Mini
RAM: 16GB LP-DDR3 | Hard Drive: 512GB NVMe | OS: macOS | Monitor: 3x QNIX QX2710
Keyboard: Mechanical Keyboard Disco | Power: 87W USB-C Apple Adapter | Case: Unibody Aluminum | Mouse: Zowie EC2 eVo
Mouse Pad: SteelSeries QcK 18" | Audio: Audio-Technica ATH-AD700X, Rode Procaster, Mackie ProFX8v2
CPU: Dual Intel E5520 (4c/8t) | Motherboard: Dell OEM | Graphics: Intel onboard | RAM: 24GB DDR3
Hard Drive: 10x 8TB Seagate Archive SMR | OS: Ubuntu 16.04 LTS | Power: Dual 750W | Case: Dell C2100
 
post #10 of 15
Thread Starter 
Quote:
Originally Posted by Bonz™ View Post
It sounds like you already know what you want and what you are doing.

I love RAID for the speed, high availability, and definitely the portability.
I switched from RAID to WHS over a year ago because I had some power supply issues that caused me to lose 5TB of data. I may or may not go back to RAID.

- For just an HBA, the Supermicro SASLP is an awesome thing. It's what I use in my WHS v1.

- The HP expander will take a single SAS port and basically expand it.
You could buy a 4-port RAID card, use the HP expander, and run a 24-drive array.
If you have 24 drives expanded off a single SAS port on a RAID card, yes, the performance will be cut. Will you notice it or care? Probably not. Each SAS lane has 3Gbps of bandwidth, I believe, and each SAS port has 4 lanes. There should be no problem with speed.

- I actually have a RAID card for sale. If you aren't looking to spend $900, check it out.
http://www.overclock.net/other-compo...raid-card.html
Used for about 1 month. Retail box everything included.
I assume with a SAS expander you need SAS drives? If so, they are a boatload more money than the SATA drives I'm looking at. Also, I can't see your RAID card - I assume I don't have access?

Edit: never mind, I can see it.