Overclock.net › Forums › Components › Hard Drives & Storage › RAID Controllers and Software › PERC 5/i RAID Card: Tips and Benchmarks

PERC 5/i RAID Card: Tips and Benchmarks - Page 464

post #4631 of 7150
Quote:
Originally Posted by binormalkilla View Post
Porpoisehork, did you connect the drives via SATA to enable TLER, or were you able to do it via SAS?
I had to connect them individually to SATA ports with RAID disabled, or the utility wouldn't see them.

While you are doing TLER you should also run WDIDLE3, the Western Digital utility that lets you set the idle (head-parking) timeout. The load-cycle rating (the number of times a hard disk is rated to park its heads) is usually around 300,000 for a standard desktop drive. I didn't run it on my original WD15EARS array. I was poking around in HD Tune (Health tab, with the drive plugged in individually) and noticed that in 5 months the drive had already gone through 60% of its rated parks.

I've attached a zip of both utilities. It includes a virtual floppy disk image you can use to create a bootable floppy or CD-ROM.

Here are links about S.M.A.R.T. (Load Cycle Time), WDIDLE3 and WDTLER.
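To see why the load-cycle count matters, here's a quick back-of-the-envelope sketch using the numbers from the post above (300,000 rated parks, ~60% used in 5 months). The figures are taken from the post, not measured; your drive's rate will differ.

```python
# Head-park budget estimate, assuming a 300,000 load-cycle rating
# and the ~60%-used-in-5-months figure reported above.
RATED_CYCLES = 300_000          # typical desktop-drive rating
used_fraction = 0.60            # observed via HD Tune after 5 months
months_elapsed = 5

cycles_used = RATED_CYCLES * used_fraction
cycles_per_month = cycles_used / months_elapsed
months_remaining = (RATED_CYCLES - cycles_used) / cycles_per_month

print(f"~{cycles_per_month:,.0f} parks/month")       # ~36,000 parks/month
print(f"~{months_remaining:.1f} months to exhaust the rating")  # ~3.3 months
```

At that pace the rating would be exhausted well inside the first year, which is why raising the WDIDLE3 timeout on green drives behind a RAID card is worth the hassle.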
post #4632 of 7150
Quote:
Originally Posted by porpoisehork View Post
I had to connect them individually to SATA ports with RAID disabled, or the utility wouldn't see them.

While you are doing TLER you should also run WDIDLE3, the Western Digital utility that lets you set the idle (head-parking) timeout. The load-cycle rating (the number of times a hard disk is rated to park its heads) is usually around 300,000 for a standard desktop drive. I didn't run it on my original WD15EARS array. I was poking around in HD Tune (Health tab, with the drive plugged in individually) and noticed that in 5 months the drive had already gone through 60% of its rated parks.

I've attached a zip of both utilities. It includes a virtual floppy disk image you can use to create a bootable floppy or CD-ROM.

Here are links about S.M.A.R.T. (Load Cycle Time), WDIDLE3 and WDTLER.

Thanks, I'll have to do that sometime soon... I have an electromagnetics test coming up Thursday, so I'm a bit busy ATM. It's going to take some time for me to remove the drives from their enclosures. I'm not looking forward to it...
post #4633 of 7150
Quote:
Originally Posted by binormalkilla View Post
It's going to take some time for me to remove the drives from their enclosures. I'm not looking forward to it.....
Before you remove them all, make sure you test one first. I removed all my EARS drives and found out TLER didn't work, so it was a complete waste since I didn't know about WDIDLE at the time. WDIDLE does seem to work on all drives, but you never know.
post #4634 of 7150
Quote:
Originally Posted by porpoisehork View Post
Before you remove them all, make sure you test one first. I removed all my EARS drives and found out TLER didn't work, so it was a complete waste since I didn't know about WDIDLE at the time. WDIDLE does seem to work on all drives, but you never know.
I may just wait and see how it goes, since I haven't dropped to degraded once since I reformatted to ext4 (plus only boot to Win7 for gaming)........

If there is a chance that they all won't support it then I'm not sure I want to spend my time on that. Add that to the fact that I have at least 3 different batches of WD Caviars......
post #4635 of 7150
Quote:
Originally Posted by binormalkilla View Post
It looks like you're taking quite a hit on access time. Maybe you should see how RAID5 performs and compare the two.
Well, I'm just confused where the poor performance is coming from.

If I benchmark across the entire RAID 5 or 50, the average access time is ~20 ms for the 6-drive 10/8 TB EADS array. If I partition it up into 5/4 equal partitions, the worst average access time is only 12.5 ms.

My WD15EARS array is 10ms no matter how I test it (full, partitioned, ...).

I guess I should focus on more real-world testing, though my RAID 50 EADS array was getting about half the read speed of my RAID 5 EARS array.

All settings were the same between the two arrays, and they are in identical PCs, with the same BIOS and RAID firmware. Both have backup batteries as well. If anyone has any ideas I'd be glad to hear them.
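One plausible contributor to the full-array vs. partitioned gap (an assumption on my part, not something established in the thread) is short-stroking: a benchmark confined to a small partition only seeks within that slice of the platter, so the average stroke length shrinks. A toy model with purely illustrative numbers:

```python
# Toy model: mean seek distance between two random positions inside a
# tested span (expressed as a fraction 0..1 of the full disk).
# Benchmarking 1/5 of the array should cut the mean stroke by ~5x.
# Illustrative only; real seek time is not linear in distance.
import random

def avg_seek_fraction(span, trials=100_000, seed=1):
    """Mean |a - b| for two uniform random positions in [0, span]."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        a, b = rng.random() * span, rng.random() * span
        total += abs(a - b)
    return total / trials

full = avg_seek_fraction(1.0)    # whole-array benchmark, ~1/3 of the platter
fifth = avg_seek_fraction(0.2)   # one of five equal partitions, ~1/15
print(f"full-array mean stroke  ~ {full:.3f}")
print(f"1/5-partition mean stroke ~ {fifth:.3f}")
```

This would explain lower access times on small partitions of either array, but not why the EADS arrays are slower than the EARS array overall.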
post #4636 of 7150
Quote:
Originally Posted by porpoisehork View Post
Well, I'm just confused where the poor performance is coming from.

If I benchmark across the entire RAID 5 or 50, the average access time is ~20 ms for the 6-drive 10/8 TB EADS array. If I partition it up into 5/4 equal partitions, the worst average access time is only 12.5 ms.

My WD15EARS array is 10ms no matter how I test it (full, partitioned, ...).

I guess I should focus on more real-world testing, though my RAID 50 EADS array was getting about half the read speed of my RAID 5 EARS array.

All settings were the same between the two arrays, and they are in identical PCs, with the same BIOS and RAID firmware. Both have backup batteries as well. If anyone has any ideas I'd be glad to hear them.
Now that I think about it, the doubled access time makes sense... there is an extra layer of delay (propagation delay) with the nested RAID level. I didn't really know about nested RAID levels until recently, but if their delay is accounted for like any other digital component, then the doubled access time makes sense. It seems you sacrifice access time in order to gain read throughput.


Edited by binormalkilla - 10/12/10 at 10:58pm
post #4637 of 7150
Quote:
Originally Posted by binormalkilla View Post
Now that I think about it, the doubled access time makes sense... there is an extra layer of delay (propagation delay) with the nested RAID level. I didn't really know about nested RAID levels until recently, but if their delay is accounted for like any other digital component, then the doubled access time makes sense. It seems you sacrifice access time in order to gain read throughput.
I'd agree, given the diagrams and what I read. The part I'm confused about is why my RAID5 has the same 20ms access time ... the diagram makes me believe RAID 50 would be 20ms and RAID 5 would be 10ms, or basically 10ms per RAID layer ... but I guess not ...

I'm reformatting the array to do some real world file tests tomorrow. Hopefully that may fix my issues, or help me narrow it down.

Thanks for the help!

Edit: The poor performance turned out to be a motherboard issue; swapping it out for a new one solved it. The disks still show a 15 ms average access time, but real-world performance is equivalent to the WD15EARS and the other WD20EADS array I tested against.
Edited by porpoisehork - 10/16/10 at 12:03pm
post #4638 of 7150
Quote:
Originally Posted by porpoisehork View Post
I'd agree, given the diagrams and what I read. The part I'm confused about is why my RAID5 has the same 20ms access time ... the diagram makes me believe RAID 50 would be 20ms and RAID 5 would be 10ms, or basically 10ms per RAID layer ... but I guess not ...

I'm reformatting the array to do some real world file tests tomorrow. Hopefully that may fix my issues, or help me narrow it down.

Thanks for the help!
Oh I see, your RAID 5 also had a 20 ms access time... well, that's weird. Something is definitely not right there.
post #4639 of 7150
Hi

I'm running 8x 2TB Samsung F3 5400RPM drives in RAID 5 on my PERC 5/i, but I think the performance is not what it should be. With 4 disks I had around 300 MB/s read and write, so I guess these speeds are kind of low for 8 disks.



I think I've tried all the different settings (No Read Ahead, Always Read Ahead, Adaptive, etc.) but I can't get any better result.
What could it be?
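As a rough sanity check on what 8 disks "should" do: in an idealized RAID 5, sequential reads scale with the number of data spindles (n - 1 for single parity). This naive model assumes ~100 MB/s per disk (inferred from the 4-disk figure above) and ignores controller and PCIe bottlenecks, which on a PERC 5/i can cap throughput well below the ideal:

```python
# Naive RAID 5 sequential-read scaling: (disks - 1) data spindles.
# Per-disk speed is an assumption inferred from the 4-disk result.
def raid5_seq_read(disks, per_disk_mb_s=100):
    """Idealized RAID 5 sequential read throughput in MB/s."""
    return (disks - 1) * per_disk_mb_s

print(raid5_seq_read(4))  # 300 -- matches the reported 4-disk result
print(raid5_seq_read(8))  # 700 -- ideal ceiling for 8 disks
```

If the 8-disk array lands far under that ceiling with every read-ahead policy, the bottleneck is more likely the controller or stripe/alignment settings than the disks themselves.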
post #4640 of 7150
Quote:
Originally Posted by diehardfan View Post
hey guys,

I just got my PERC 6/i with a battery backup unit. I am about to configure it with 4x 2TB Hitachi drives in RAID 10. I will be using this setup mainly to save backup images of computers, pictures, home videos, and important documents.

So what stripe size should I use? Also, what other settings should I enable/disable/change for optimum performance and stability? I am new to RAID.

Cheers
Anyone?

Thanks