
Registered · 90 Posts · Discussion Starter #1
Hi everyone (skip ahead for the question). I'm currently running an experiment on how defragmentation and short stroking reduce access times and improve read/write speeds. One of the things I want to test is whether the performance boost from short stroking can still be retained by storing the operating system files on a partition of, let's say, 20 GB and putting all of the other files into the remaining space on a second partition. Of course there will be some overhead from having to access the files in the slower partition, but I was thinking that a good defragmentation could compensate for that slight performance decrease.
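
For reference, the layout I have in mind would be set up with something like the diskpart script below - just a sketch; the disk number, partition sizes and drive letters are placeholders, and "clean" wipes the selected disk:

Code:
rem shortstroke.txt  -  run with:  diskpart /s shortstroke.txt
rem WARNING: "clean" erases the selected disk; disk number, sizes and letters are placeholders
select disk 1
clean
rem ~20 GB partition at the start of the disk (outer tracks) for the OS files
create partition primary size=20480
format fs=ntfs quick label="OS"
assign letter=S
rem the remainder of the ~600 GB drive for everything else
create partition primary
format fs=ntfs quick label="DATA"
assign letter=T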

My question is: does anyone know of a good way, or a program, to cause a significant amount of fragmentation on a hard drive? I'd also like to control how much space the fragmented files take up (20 GB, 50 GB, etc.) up to the full capacity of the drive (around 600 GB), so I can also measure performance as the drive fills up. I can't think of a good way to do this besides copying a lot of files into random folders, which would take a really long time (and I'm not sure it would cause significant fragmentation either, since they're written as whole chunks when copied).
 

Overclock Failed... · 13,565 Posts
Quote:
Originally Posted by Eccentric
I also want to be able to set the size of the fragmentation of the files like 20 GB, 50 GB, etc

Could you explain this a little better?
Fragmentation doesn't have a size.
 

Registered · 90 Posts · Discussion Starter #3
Quote:
Originally Posted by billbartuska
Could you explain this a little better?
Fragmentation doesn't have a size.

I meant using up space on the hard drive by adding more files. I'd like a program that can write data onto the drive while letting me specify how much space to use (e.g. 150 GB, 300 GB, etc.). But as the data is added, I also want to make sure it ends up fragmented. If I just copied a 1 GB file over and over into the same folder, it wouldn't be as fragmented as copying 1,000 MP3s into different places.
 

Overclocker · 867 Posts
Copying files into random folders will cause no fragmentation whatsoever. Fragmentation happens when the drive saves a file in the first vacant space it finds; if that space is not big enough for the whole file, the rest is stored in the next vacant space, and if that isn't big enough either, the remainder goes into yet another space, and so on. As for a program that fragments the drive for you, I don't think you'll find one.
Short stroking is pretty straightforward: all you are doing is using the outside of the platter, because the surface speed past the heads is higher at a larger diameter. Any time the head has to read near the centre, the read rate is slower whether or not the files are fragmented; if the files are fragmented, the read is of course slower still, as the head has to jump back and forth.
Anyway, Vista and Win7 are set by default to defragment the drive once a week, so fragmentation shouldn't be much of a problem unless you turn it off.
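
If you want to check how fragmented the drive already is and switch that weekly schedule off for the test, something along these lines from an elevated command prompt should do it - the task path here is the Win7 one and may differ on other versions:

Code:
rem analyse fragmentation on C: without defragmenting (Win7 syntax; Vista uses "defrag c: -a")
defrag C: /A

rem view the built-in weekly defrag task (task path as on Win7)
schtasks /Query /TN "\Microsoft\Windows\Defrag\ScheduledDefrag"

rem disable it while the experiment runs
schtasks /Change /TN "\Microsoft\Windows\Defrag\ScheduledDefrag" /DISABLE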
 

Premium Member · 4,496 Posts
Write a .bat file that does a dir /s c:\ > file, and copy that file to a unique file name in c:\1 and c:\2 repeatedly. When done, erase c:\1, and 'type' that file over and over into one large file.
Make sense? I've done this, but it takes some time..
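
Roughly like this, as a sketch - the folder names and the repeat count are placeholders, and raising COUNT controls how much space gets used:

Code:
@echo off
rem fragment.bat - rough sketch of the approach above; paths and COUNT are placeholders
set COUNT=5000

rem 1) build a filler file from a recursive directory listing
dir /s c:\ > c:\filler.txt

rem 2) copy it alternately into two folders so the copies interleave on disk
mkdir c:\frag1 c:\frag2 2>nul
for /l %%i in (1,1,%COUNT%) do (
    copy /y c:\filler.txt c:\frag1\fill_%%i.txt >nul
    copy /y c:\filler.txt c:\frag2\fill_%%i.txt >nul
)

rem 3) delete one set of copies, leaving small holes scattered across the drive
del /q c:\frag1\*.txt

rem 4) append the filler into one large file - it has to fragment to fill the holes
for /l %%i in (1,1,%COUNT%) do type c:\filler.txt >> c:\bigfile.dat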
..a
 

Premium Member · 5,330 Posts
The problem caused by fragmentation is that the HDD heads have to perform an additional seek (or multiple seeks) while reading each file, adding the time taken for those seeks on to the time required to process the read request (on average, an extra delay equal to the drive's average access time for each additional fragment). For example, on a drive with a ~13 ms average access time, a file split into four fragments picks up roughly three extra seeks, or around 40 ms of added latency.

By storing files away from the main OS files - which are accessed frequently - you force the heads to track back and forth between your partitions, effectively adding a full access time every time the heads move from one partition to the other.

Basically what you are doing is creating one problem to solve another. You would be better off just running your defragmenter more often to avoid fragmentation.

If you want to short-stroke, my advice is to do it properly and understand exactly why you are doing it. By keeping all of the files on your disk together you minimise the amount of time taken for each seek (as the maximum seek distance is shorter). Creating a setup where you increase the number of long seeks deliberately is counter-productive, and you would likely end up with worse performance than if you just left your disks completely unpartitioned.
 