Overclock.net › Forums › Specialty Builds › Servers › Post Your Server!!!

Post Your Server!!! - Page 376

post #3751 of 4324
Quote:
Originally Posted by loud681 View Post

I tend to prefer having a dedicated physical computer as a DC, since it's the core of the domain. But since this isn't a production environment, I would run a VM DC on each server for failover... just a thought

Seriously? At work all our DCs run on VMware with vMotion/DRS... running a DC in a VM isn't going to slow it down if you properly prioritize the resources allocated to it.
Edited by EvilMonk - 7/8/16 at 4:27pm
Core i7-4790k
(16 items)
  • CPU: 2x Intel Xeon X5670
  • Motherboard: Apple Mac Pro 5,1
  • Graphics: GeForce GTX 770
  • RAM: 48GB RDIMM ECC DDR3 1333
  • Hard Drive: 2x Apricorn Velocity Duo X2 + 2x Crucial M550 512GB
  • Optical Drive: 1x Apple Superdrive / 1x Pioneer BR-208D
  • OS: OS X 10.10.4
  • Monitor: 2x Apple LED Cinema 27"
  • Keyboard: Apple BT Keyboard
  • Power: 980W Apple
  • Case: Apple Mac Pro
  • Mouse: Apple Magic Mouse BT
  • Mouse Pad: Apple Magic TrackPad

  • CPU: Intel Xeon E5-1680 v2
  • Motherboard: Apple Mac Pro 6,1
  • Graphics: 2x AMD FirePro D700 6GB GDDR5
  • RAM: 32GB DDR3 1866 ECC Registered Quad Channel OWC
  • Hard Drive: Apple 512GB SSD
  • Optical Drive: Asus USB 3.0 BD-RW
  • OS: Mac OS X 10.10.2
  • Monitor: Apple 27" Thunderbolt LED Display
  • Keyboard: Apple Bluetooth Keyboard
  • Mouse: Apple Magic Mouse
  • Mouse Pad: Corsair MM200 Wide
  • Audio: Bose Companion 20 Multimedia
  • Other: Wacom Intuos Pen & Touch Medium Tablet
post #3752 of 4324
Quote:
Originally Posted by loud681 View Post

I've seen more VMs fail than dedicated boxes...

Well, those VMs were probably built by someone who doesn't know what he's doing with servers / server OSes and has never heard of vMotion / DRS / high-availability architectures...
post #3753 of 4324
Quote:
Originally Posted by EvilMonk View Post

Well, those VMs were probably built by someone who doesn't know what he's doing with servers / server OSes and has never heard of vMotion / DRS / high-availability architectures...

Some people don't want to spend the money. I worked for a customer that had HA but no DRS. Their cluster was so tight on RAM that they kept a spreadsheet listing each VM, its memory, the total available memory on each host, and which VMs belonged to which host, so if a host went down, that's where you put them back. I asked what they did about VMware updates, since you need to restart hosts to apply them. They hadn't updated VMware since 5.0, and at that point 5.5 had been out for over a year...
post #3754 of 4324

I was going to post photos of my server, but I've had to send everything back. I couldn't get the board, CPU, and RAM to work together, so after trying several different sticks (registered and unbuffered) and even two CPUs (X3430 and G6950), I'm getting a refund and going with an AMD AM3 CPU and an Asus board. The FreeNAS forums can claim AMD is no good all they like, but the one stick of unbuffered RAM works just fine with my FX-8320 and not with the Intel system, so after two weeks of fighting with it I'm going with an AMD system. Since I intend to run RAID 5, I might just install Win 8.1 and run Storage Spaces with parity.

The girlfriend.
(15 items)
The Mistress
(13 items)
Media Server
(11 items)
  • CPU: A8-6410
  • Motherboard: Lenovo Lancer 4B2 K16.3
  • Graphics: R5 128 Shaders/M230
  • RAM: Hynix 8GB DDR3 1600
  • Hard Drive: Samsung 840 120GB SSD
  • Hard Drive: Seagate Momentus 1TB 5400rpm
  • OS: Win 8.1
  • Monitor: CMN1487 TN LED 14" 1366x768
  • Keyboard: Lenovo AccuType
  • Power: 2900mAh/41Wh
  • Mouse: Elan Trackpad/Logitech M90
  • Mouse Pad: Super Flower
  • Audio: AMD Avalon (Conexant)
post #3755 of 4324
I've updated my servers / am in the process of upgrading them. I just moved into a new place that doesn't have room for my 48U rack... I'm kind of bummed about it. That being said, this setup will work for now.



So let's get started, from the bottom up:

Dell PowerEdge R710
  • 2x Quad Core Xeon
  • 64 GB DDR3 *work in progress*
  • 6 x 600 GB 15k 3.5" SAS drives *work in progress*
  • ESXi 6

IBM System X3690 X5
  • 2x Intel Xeon E7-2803 6-core @ 1.73GHz
  • 24x 8 GB DDR3 ECC (Total 192 GB)
  • IBM ServeRaid M1015
  • 8x 146 GB 15k RPM 2.5" Hard Drives *work in progress, have 2*

Dell PowerEdge 2850
  • 2x Dual Core Xeon
  • 16 GB DDR2 ECC *work in progress, have 8 GB*
  • 34x 300 GB 10k RPM 3.5" SCSI (6 in the 2850, 14x in each Powervault 220s)

Powervault 220s

Imaging Station

Going from the bottom up: there is currently no OS installed on the PowerEdge R710. I don't have its drives yet, so I have no way to run anything; I'm hoping that after I finish paying some medical bills I'll be able to get back to my toys. The System X3690 runs ESXi 6 with several VMs: my domain runs on a couple of Windows Server 2012 R2 installs, an Ubuntu server authenticates through AD, and there's a Minecraft server set up for now. Every VM has its own physical network connection, and I've barely touched the system resources on this beast. The PowerEdge 2850 runs FreeNAS and is directly connected to the X3690, serving the storage over 2 gigabit network connections. I'm running a Cisco 24-port managed switch w/ PoE and a Meraki MR12 PoE AP.

My imaging station is used to rebuild workstations for my clients. It's very basic now since I don't have my normal setup anymore, but it's still functional. I'm hoping to get my 8-port KVM set up again so I can rack them all. That's all for now. If y'all have any suggestions or
post #3756 of 4324
Quote:
Originally Posted by vaeron View Post

I've updated my servers / in the process of upgrading servers. *snip*

I don't understand the fascination with ESXi.... I find Proxmox to be much better.
post #3757 of 4324
Description / Usage: Backups, Media & Network Storage

OS: Ubuntu Server 14.04 LTS (headless)
Case: Phobia Open-Air Bench Case
CPU: X5660 (not oc'd yet)
Motherboard: Asus P6X58D-E
Memory: 24GB (6x4GB) DDR3
PSU: Thermaltake Toughpower 750Watt
OS SSD: Samsung 850 Evo 250GB
SAS/HBA: LSI 9211-8i Host Bus Adapter (2x SAS = 8x SATA 6Gbps)
Storage HDD(s):
Currently have a SnapRAID pool set up with 1 parity disk (soon to be 2) and 5 data disks (one isn't in the system right now):
  1. Samsung 840 Evo 250GB (For VM Vdisks)
  2. 2TB WD Blue (Data)
  3. 2TB WD Black (Data)
  4. 3TB WD Blue (Data)
  5. 2TB WD Green (Data)
  6. 4TB WD Black (Parity)
  7. 4TB WD Red (Data - currently being filled up from a mate's server)
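For anyone curious what a layout like that looks like in practice, a SnapRAID pool boils down to a small config file. This is a minimal illustrative sketch, not the poster's actual config; every path and mount point here is an assumption:

```
# /etc/snapraid.conf -- illustrative sketch; all paths are assumed
# Parity file lives on the dedicated parity disk
parity /mnt/parity1/snapraid.parity

# Content files (the array's metadata); keep copies on more than one disk
content /var/snapraid.content
content /mnt/data1/snapraid.content

# Data disks, one "data NAME DIR" line each
data d1 /mnt/data1/
data d2 /mnt/data2/
data d3 /mnt/data3/
data d4 /mnt/data4/
```

Parity is computed on demand with `snapraid sync`, and moving to double parity later is just one extra `2-parity /mnt/parity2/snapraid.2-parity` line pointing at the second parity disk.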

Server Manufacturer: Me (repurposing an old, but still very capable, rig)

You can see in one of the pics there's a 2-port HP gigabit NIC with an x1 PCIe connector that will go in the last free PCIe slot on the board, for a total of 3 gigabit ports. I'll probably dedicate one to the VM, one to the server for a direct connection to the net, and one for an always-on VPN connection that certain traffic is routed through.
Alternatively, I'll let the VM share the NIC with the host and aggregate the two ports on the HP NIC to get full bandwidth to multiple machines from the server. (If anyone has any tips or tricks for this on Ubuntu Server 14.04 LTS, my ears are open.)
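Since the poster asked: on Ubuntu 14.04 LTS, aggregating two ports is normally done with the `ifenslave` package and a bonding stanza in `/etc/network/interfaces`. A minimal sketch, assuming the HP NIC's ports come up as eth1/eth2 and the switch supports LACP; the interface names and addresses are placeholders, not the poster's actual setup:

```
# /etc/network/interfaces -- bonding sketch (needs: apt-get install ifenslave)
# eth1/eth2, the address, and the gateway are assumed for illustration.

auto eth1
iface eth1 inet manual
    bond-master bond0

auto eth2
iface eth2 inet manual
    bond-master bond0

auto bond0
iface bond0 inet static
    address 192.168.1.10
    netmask 255.255.255.0
    gateway 192.168.1.1
    bond-mode 802.3ad      # LACP; use balance-alb if the switch can't do LACP
    bond-miimon 100        # link-check interval in ms
    bond-slaves none       # slaves are declared via bond-master above
```

One caveat worth knowing: 802.3ad doesn't give a single client 2 Gbps; it balances separate flows across the links, so the gain shows up with multiple machines hitting the server at once.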

Next plan is to set up a 6-disk ZFS array (3 two-disk mirrors striped together), which is apparently the fastest layout as well as extremely reliable, for another backup server and fast network-attached storage. (SnapRAID performance is the same as a single disk, since there are no on-the-fly parity calculations and data is not striped across disks, and I'd like some faster, secure, large-volume storage than that if possible.)
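The "three mirrored pairs striped together" layout described above maps to a single zpool command, since ZFS automatically stripes writes across the top-level mirror vdevs. A sketch assuming six drives named sdb through sdg; the pool name and device names are placeholders (by-id paths are safer in practice):

```
# Pool of three 2-disk mirrors; ZFS stripes data across the mirror vdevs.
# "tank" and /dev/sdb../dev/sdg are assumed names for illustration.
zpool create tank \
  mirror /dev/sdb /dev/sdc \
  mirror /dev/sdd /dev/sde \
  mirror /dev/sdf /dev/sdg

zpool status tank   # verify the mirror layout
```

This layout gives the capacity of three disks, survives one failure per pair, and resilvers much faster than RAID-Z since only the surviving half of one mirror needs to be read.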
Edited by spinFX - 7/11/16 at 9:31pm
Ol'Faithful
(11 items)
  • CPU: Intel Xeon X5660
  • Motherboard: Asus P6X58D-E
  • Graphics: XFX AMD Radeon R9 280X
  • RAM: Corsair Vengeance 24GB Triple Channel (6x4GB)
  • Optical Drive: Lite-On DVD-RW
  • Cooling: Noctua NH-D14 CPU Cooler
  • OS: Windows 7 Pro x64
  • Monitor: Samsung 24" LED 1080p 5ms DVI (S23A300B)
  • Power: Thermaltake Toughpower 750W Gold
  • Mouse: Corsair M95
  • Audio: Onboard :S
post #3758 of 4324
Quote:
Originally Posted by nexxusty View Post

I don't understand the fascination with ESXi.... I find Proxmox to be much better.

I've used ESXi for years and it's never let me down. Never used Proxmox. What do you find that is better about it?
post #3759 of 4324
Quote:
Originally Posted by vaeron View Post

I've used ESXi for years and it's never let me down. Never used Proxmox. What do you find that is better about it?

Same here, but for Proxmox: I haven't used ESXi, as Proxmox has served me well.

I just don't understand everyone using ESXi for a VM box instead of Proxmox. Is ESXi free?
post #3760 of 4324
Yes it is. It also came embedded on all of my servers and just works.