Overclock.net forum thread · 37 posts

Techie007 (Windows Wrangler) · Discussion Starter · #1
Introduction
Several years ago, I first noticed that defragmenters vary in the improvement they bring to a system. For instance, back in the XP days, BootVis would always give a significantly faster boot time than Norton SpeedDisk on my computer, and re-defragmenting with SpeedDisk would slow boots down again. I read up on what BootVis does and concluded that SpeedDisk's poor showing was due to an inferior file layout. At that point, I tried several defragmenters on my system, including Diskeeper, PerfectDisk, UltimateDefrag, JkDefrag and Auslogics defrag; while PerfectDisk seemed to do the best, it still did not get boot times as fast as BootVis did. So I changed some settings in SpeedDisk and gave it a list of 250 system files (in order) to place first. It promptly made a mess of my system, creating some 300,000 fragments and then locking up; starting it again only made the problem worse. That's when I started looking into the defragmentation APIs, and ended up writing my own defragmenter. I've been using it exclusively for a while now, but whenever I'm asked to recommend a defragmenter, I've been at a loss, because I really didn't know how the other defragmenters compared in terms of actual computer performance, and mine isn't really mature yet.
So I finally ran this test (it took me about a month, working on it off and on), and I was surprised at how poorly the majority of the defragmenters did, and I only tested the popular ones! There's a list of not-so-popular defragmenters that's just about as long (I did run a quick virtual-machine test on many of those, but none were really worth mentioning). I was also surprised at how high the built-in Windows XP defragmenter scored (however, Microsoft rewrote its defragmenter for Windows Vista, and I don't know whether they made it better or worse).

Summary
If your system boots from a HDD, you should definitely keep that HDD defragmented. As of this writing (June 2013), MyDefrag appears to be the best available tool for the task.

Considerations
What is the main objective of defragmentation? Improving your computer's responsiveness, boot speed and application launch speeds. In addition, a defragmented disk (SSDs included) greatly increases the chances of successful file recovery in the event of filesystem corruption or accidental file deletion. However, this should not replace a proper backup.

That said, let's explore in theory the three main things that affect the performance of your disks (and thus the responsiveness of your computer):

#1: Seek time
This is the amount of time it takes for the heads on the HDD to move to another spot in order to read data that is physically stored in a different place than the previous read/write. SSDs and other flash media (e.g. SD cards) are not affected by this; they are only affected by access time (see #3 below). On a HDD, the seek time is generally proportional to the seek distance. For a defragmenter, this means that in addition to removing fragments (so that the HDD doesn't need to seek multiple times to read a single file), files that are frequently accessed in sequence should be placed in access order, so that the whole group of files can be read sequentially. Very few defragmenters do this; all of the defragmenters that made the top of my benchmark did it to varying degrees.
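If you want to picture what "placing files in access sequence" means, here's a minimal Python sketch. The file names and timestamps are made up, and a real defragmenter would build the trace from prefetch data or its own monitoring:

```python
# Hypothetical access trace: (file name, seconds since boot of first access).
access_trace = [
    ("ntoskrnl.exe", 0.01),
    ("hal.dll", 0.02),
    ("drivers/disk.sys", 0.05),
    ("explorer.exe", 8.40),
]

def placement_order(trace):
    """Return file names sorted by first-access time (earliest first).

    Laying files out on disk in this order lets the boot sequence be read
    with mostly short, forward seeks instead of jumping around the platter.
    """
    first_seen = {}
    for name, t in trace:
        first_seen.setdefault(name, t)  # keep only the first access
    return sorted(first_seen, key=first_seen.get)

print(placement_order(access_trace))
```

The same idea extends to application launches: any group of files that is always read together is a candidate for being stored together, in that order.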
Also, there is a technique called "short-stroking" that takes advantage of this phenomenon. If you create a small partition at the beginning of the disk and install Windows (and your programs) there, all the seeks for Windows (and your programs) will be confined to that area, thus reducing the maximum seek distance. However, a good defragmenter will get better system performance than short-stroking alone, so short-stroking your system HDD shouldn't be necessary.

#2: Read speed
HDDs transfer data approximately 2x faster at the beginning of the disk than at the end. This is because, at constant rotational speed, the outer edge of the platters travels a greater distance per revolution (and thus moves faster past the heads) than the inner edge. SSDs and other flash media are not affected by this either. For a defragmenter, this means that the best performance on a HDD is achieved by moving frequently accessed files to the beginning of the disk, and sorting out rarely used data caches, moving them preferably to the end of the disk, where they are out of the way and allow the other files to sit closer to the beginning. Quite a few defragmenters I tested at least consolidated all the files at the beginning of the disk, but very few placed the frequently used system files first (again, those that did got the best ratings). Even fewer did a good job of detecting which files belonged at the end of the group. Only one (my experimental defragmenter) actually moved those files to the end of the disk (although Norton SpeedDisk used to).
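The "approximately 2x" figure is just geometry: at constant RPM, the sequential transfer rate scales with track radius. A quick sketch with illustrative (not measured) 3.5-inch platter radii:

```python
# At constant RPM, linear velocity past the heads is proportional to the
# track radius, so sequential transfer rate scales the same way.
# These radii are illustrative figures for a 3.5" platter, not measurements.
OUTER_RADIUS_MM = 46.0  # outermost data track
INNER_RADIUS_MM = 22.0  # innermost data track

def transfer_rate_ratio(outer_mm: float, inner_mm: float) -> float:
    """Ratio of outer-track to inner-track sequential transfer rate."""
    return outer_mm / inner_mm

print(f"outer/inner rate ratio: {transfer_rate_ratio(OUTER_RADIUS_MM, INNER_RADIUS_MM):.2f}x")
```

With those example radii the ratio comes out a little over 2x, which matches what sequential-read benchmarks typically show across a HDD.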

#3: Access time
This one affects all storage media. It is the combined amount of time it takes for the smallest I/O request to pass through the filesystem driver (usually NTFS), the partition driver, the disk driver and the motherboard's north bridge, be interpreted and responded to by the storage device, and then for the response (data) to travel back up through that same chain. This is why one large read transfers data faster than many short reads of the same total size. To read a file, at least one read must be made for each fragment. If a file has many (hundreds or thousands of) small fragments, performance suffers even on a SSD, because of all the short reads involved. All the defragmenters were able to remedy this (although UltraDefrag did a really bad job). However, several (including Auslogics, SmartDefrag and UltraDefrag) created many free-space fragments in the process. Unfortunately, this means that file fragmentation comes back quickly and with a vengeance, making the system dependent on regular defragmentation to keep performance from lapsing below that of the original system.

The test system
So, let's see how the defragmenters performed in a real system. The test system was a Pentium 4 at 2.4 GHz with an 80 GB HDD and 1 GB of RAM running Windows XP.
I used a Pentium 4 computer mainly because I had several of them lying around. If I had used a faster CPU, the "defrag improvement" would have been a little higher, because at times several of the programs were CPU-bound when starting.
I used an 80 GB HDD because I had two of them (one for the backup image) and because a larger HDD would take longer to image. Many newer HDDs have faster sequential read speeds and somewhat faster random read speeds, so a newer HDD would have probably raised the "defrag improvement" a little bit as well if the CPU could keep up.
I used XP mainly because that's what was on the system. Also, I didn't want to contend with Superfetch (introduced in Vista). Superfetch tracks your program usage patterns over time and prefetches files to RAM based on the current clock time, so that the programs you use are hopefully already in RAM before you go to start them. My intention was to benchmark how well the defragmenters do their job, not how great Superfetch is. Working with the XP prefetcher was bad enough; I had to restart the test a couple of times before I figured out how to get consistent results with it! Basically, I found that I had to wait for the boot timer to expire before starting anything, so that various programs wouldn't get randomly tacked onto the boot sequence.
Keep in mind that different fragmentation patterns would result in different defragmenter performance, particularly in the "Defragmentation time" column. For instance, if most of the files on the disk had a few fragments and the largest freespace area was tiny, programs like Auslogics or SmartDefrag that defragmented very fast in this test would have taken much longer than they did.

Preparing the test system
First of all, I disabled the built-in Windows defragmenter's automatic defragmentation feature, because I didn't want it skewing the results by changing the disk layout during the test, or slowing things down by randomly starting during a run. Interestingly, that feature seems not to be working (or is disabled somehow) on many of the crapped-out computers I have come across.
Next, after installing the programs that I wanted to include in my "launch" test, I ran each of them several times to ensure that the prefetch data was up to date, so that any defragmenters "so inclined" would have good file-usage information. I also ran "rundll32.exe advapi32.dll,ProcessIdleTasks" to force an update of the Layout.ini file. Perhaps I should have left that for an intelligent defragmenter to do, but I was afraid that my experimental defragmenter would be the only one that knew about this trick. Then I shut down the system and imaged the system disk. After imaging the disk, I booted the system back up and ran the "original system" test.

Test methodology
After restoring the system image, I would install a defragmenter, launch it, select just the system disk (the backup-image disk was also present), and click the Defragment button, starting a stopwatch at the same time. I did not change any settings in any of the programs; I just used the "defaults" and "automatics." My assumption is that if the programmers know how to write a good "automatic" mode and choose good defaults, they probably also know how to write good defragmentation software. After defragmentation was complete, I would launch my defragmenter (a standalone application) and scan the disk, noting the fragmentation statistics and taking a screenshot of the disk layout. My defragmenter shows a detailed disk map like JkDefrag's, mainly because it has nothing to hide.
Then I would reboot the computer. I would start the stopwatch as soon as Windows actually started loading, that being the instant all the HDD activity started after the "Select OS" countdown reached zero (the computer had a HDD LED, another reason I selected it). Login was set to automatic. I would stop the stopwatch as soon as the desktop icons appeared. In the course of each test, the computer was rebooted 7 times; the average of these times makes up the "Desktop" column in my benchmark chart.
Also, I wrote a little program called "Prefetch Watcher" that would watch the prefetch folder and let me know when the boot timer expired so that I could safely start launching programs. When launched, that program also would watch CPU usage and RAM fluctuation and report the exact time that the activity stopped (it would wait for up to 5 seconds for any more activity). That number makes the "Idle" column. Thus the "Idle" time is the boot time plus the amount of time it took to finish launching services and startup programs. I kept them separate, because with several defragmenters, the desktop was quite responsive even while the startup programs were still loading (I only tried this during some of my aborted tests).
I divided my list of programs into groups of 2 or 3 to try per boot, because of file caching. Many programs share certain system files. If I ran them all one after the other, the later programs would launch much quicker than if they had been launched first, because many of these shared files would already have been read and cached by an earlier program. So I selected programs that I thought would have minimal impact on each other's launch speeds, ran 2 or 3 of them per boot, then rebooted the computer to clear the cache and tried some of the others. During normal operation the cache seems to forget some of these files over time anyway, so running a bunch of programs back to back would be an unnatural test (and I wanted to test the HDD's performance, not the superb abilities of the Windows file cache).
Upon launching a program from the desktop, I would start the stopwatch; when the program finished loading (document appeared, music or video started playing, or whatever was applicable for that program), I would stop it. I always had the document- or media-oriented programs open some file. Each of those times was compared to the original system's time and converted (by computer) to "times original" (i.e. this program started 1.67x as fast on the defragmented system). These numbers were averaged across the 12 programs, and that's what you'll see in the "Launch" column.
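The "times original" conversion can be sketched like this; the program names and times below are hypothetical, purely to show the arithmetic:

```python
# Hypothetical launch times in seconds: original system vs. after defragmenting.
original_times = {"WordProc": 5.0, "MediaPlayer": 3.0, "Browser": 6.0}
defragged_times = {"WordProc": 3.0, "MediaPlayer": 2.0, "Browser": 4.0}

def launch_score(original: dict, defragged: dict) -> float:
    """Average speed-up: each program's time converted to 'times original'.

    Speeds (not times) are averaged, so a program that went from 5 s to 3 s
    contributes 5/3 = 1.67x to the average.
    """
    ratios = [original[name] / defragged[name] for name in original]
    return sum(ratios) / len(ratios)

print(f"Launch: {launch_score(original_times, defragged_times):.2f}x original")
```

Averaging the speed ratios rather than the raw times keeps a single slow-loading program from dominating the column.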
Now, after running through the whole launch-speed test regimen, I would reboot the computer and start the entire test over again, replacing the numbers I got the first time. This is because the prefetch files seem to need updating after defragmentation. During the launch test they got updated as I started each program, but the programs' launch speeds would only reflect final performance when they were launched a second time, after the new prefetch files had been written. Many of the programs launched noticeably faster during the second round of testing than they did the first time around.

Defragmenter finished disk layouts

Defragmenter feature chart
Frequently, when we read about the "features" a defragmenter has, we don't see performance features listed. Rather, we see things like "screensaver," "scheduling," "automatic defrag" and "frag guard." Personally, I don't care how many features (bells and whistles) a defragmenter has if it doesn't do its job: improving the system's performance. So let's see how the popular defragmenters compare on real performance-improving features. I created the following chart by studying the disk maps (and a couple of the statistics) generated as mentioned earlier. Since this chart was human-generated, it probably isn't perfect, but it should give you a pretty good idea:



Defragmenter benchmark results
Surprise, surprise: generally speaking, the defragmenters that had the more meaningful features also delivered the best performance in an actual system! See the benchmark results below:



Score: An arbitrary number generated for list sorting purposes. You should be able to generate a similar number by calculating Desktop * Idle * Shutdown * Launch * Launch * Search * 10.
Desktop: The amount of time it took for Windows to boot and display the desktop icons. Sampled and averaged 7 times for each defragmenter.
Idle: The amount of time it took for Windows to boot and fully load all the startup programs and services. Also a 7 sample average.
Shutdown: The amount of time it took for Windows to shutdown, measuring from when I hit the [R] key (restart) on the shutdown dialog to when the screen went black. Another 7 sample average.
Launch: The average speed at which each test program launched, compared to the original system. This average is made from 12 programs' launch speeds (not times).
Search: The amount of time it took Disk Explorer (another program I wrote; Windows Explorer's search is far too slow) to do an offline file search of the HDD (booted from another system) for files with "readme" in the name. This test measures the efficiency of the rapid seeking between the MFT and B-tree structures, as well as the recursive performance of the B-tree structures. Any time a file is opened, the NTFS driver has to find it, and the prefetcher only helps for the first 10 seconds of a program's launch cycle. The better this number, the snappier the computer will be: searches will go faster, folders will open faster, and certain programs (especially games with lots of little resource files) will load faster. Incidentally, after performing the search once, repeating it would take about 2 seconds regardless of the disk layout: the marvelous Windows file cache in operation! Of course, that would degrade after the cache got purged, which working on something else for a while would normally do.
Dfrg time: The amount of time the defragmenter being tested took to defragment C:\, starting when I clicked the [Defrag] button and ending when the program finished. I won't say that faster is better, because the faster defragmenters generally didn't do a very good job.
File frags: The total number of fragments in all files after defragmentation, as measured by my defragmenter.
Free frags: The number of areas with free space. The more of these there are, the higher the chance of fragmentation reoccurring quickly. Notice how Auslogics, Smart Defrag, and UltraDefrag greatly increased this number. Those programs will be addictive, requiring frequent defragmentation just to maintain performance; otherwise, massive fragmentation will occur over time, reducing performance and causing the phenomenon mentioned next.
Deep frags: The number of extra MFT records needed to store all the file "extent" information for extremely fragmented files. Unfortunately, once they're created, they will never be removed until those file(s) are deleted. Notice how several of the better defragmenters fragmented certain files so badly during defragmentation that the number has noticeably increased over the original system. These additional records will impose one more disk read per record, so they are not a good thing.
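For anyone who wants to reproduce the sorting key, the Score formula quoted above works out like this (the sample row is hypothetical, not taken from the chart):

```python
def score(desktop: float, idle: float, shutdown: float,
          launch: float, search: float) -> float:
    """Arbitrary sort key from the chart:
    Desktop * Idle * Shutdown * Launch * Launch * Search * 10
    (note that Launch appears twice, i.e. it is squared)."""
    return desktop * idle * shutdown * launch * launch * search * 10

# Hypothetical row: 22 s desktop, 35 s idle, 12 s shutdown,
# 1.5x launch speed-up, 40 s search.
print(score(22, 35, 12, 1.5, 40))
```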

Of course, I had to include the silly (but strangely popular on the Internet) "clear the prefetch folder" tweak, just to honestly show how much performance you can gain by clearing out the prefetch folder. Ditto for disabling the prefetcher, which makes sense only on SSD-based systems (although that would make a good test, if somebody else wants to run it!). While we're talking about the prefetcher, this is worth mentioning: the prefetcher appears to totally change how files are accessed by Windows. Any good defragmenter will have to work with the prefetcher as far as disk layout is concerned if it is to maximize performance.

FYI, I did not include my experimental defragmenter in these charts because it's not really ready for mass use and I didn't want to be clobbered by a bunch of requests for it. It did make #1 position on the charts, though.

I hope that you all find these charts informative!
Remember that these charts and tests are for the system disk (C:\). Disk layout isn't quite so important for data disks: as long as the MFT and B-tree structures are together at the beginning of the partition and the file and free-space fragments are removed, most of the performance concerns for data disks have been addressed.
 

Speedster159 (Questionnaire galore!)
Conclusion? Use MyDefrag? Or W7 default?
 

Techie007 (Windows Wrangler) · Discussion Starter · #5
Quote:
Originally Posted by Ferrari8608 View Post

You should make this an article.
I could make a summary article that links to this one if you think that's a good idea.


Quote:
Originally Posted by Speedster159 View Post

Conclusion? Use MyDefrag? Or W7 default?
I should have anticipated this and tested the Windows 7 defragmenter as well. I guess I'll upgrade the XP test system to Windows 7 trial, re-image, and test Windows 7's defragmenter, MyDefrag and Auslogics defrag on it just for reference.

Quote:
Originally Posted by Otterclock View Post

I've always wondered if there was a difference between defrag tools. The education was super appreciated. Great work. Defragging with MyDefrag as I type this.
I hope that Windows' automatic defrag feature doesn't undo MyDefrag's work! I would disable the Disk Defragmenter service (if you have it) and set "HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Dfrg\BootOptimizeFunction\Enable" to "N" just to make sure. Create a new string value named "Enable" if one doesn't exist already.
 

Speedster159 (Questionnaire galore!)
Quote:
Originally Posted by Techie007 View Post

I hope that Windows' automatic defrag feature doesn't undo MyDefrag's work! I would disable the Disk Defragmenter service (if you have it) and set "HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Dfrg\BootOptimizeFunction\Enable" to "N" just to make sure.
I can't find it? o.0 Nothing named "Enable" on mine.



EDIT: Any estimate on when we could get the Windows 7 results?

Quote:
Originally Posted by ixsis View Post

Quote:
Originally Posted by Speedster159 View Post

Conclusion? Use MyDefrag? Or W7 default?
No way to answer that, since he based his test on a default OS defragmenter that is 12 years old.

I posted this originally while the OP was updating his post to say that he is going to test with W7. Looking forward to those results.
I do look forward to that. But for now I will stay with my last Auslogics defragmentation and keep playing The Sims 3, which accesses a boatload of textures and other assets as you play it.
 

Techie007 (Windows Wrangler) · Discussion Starter · #8
@Speedster159:

I have updated the registry instructions in my earlier post.

Windows 7 wouldn't do an upgrade install from XP, so I just finished a fresh install, which should be better for the test results anyway. That means that I'll have to reinstall all the test programs, and that the programs' fragmentation will be different, which will totally change the test results. Give me a week or two.

Auslogics defrag (or SmartDefrag, or Defraggler) won't become better or smarter defragmenters overnight just because I switched over to Windows 7. They scored poorly due to their programmers' lack of knowledge of, and testing on, disk performance. I mean, the mess is very obvious if you look at the resulting disk map, which is why most defragmenters show such blocky disk maps: they are hiding their bad job.
 

Speedster159 (Questionnaire galore!)
Well then, I'm going to defragment using Win7 for now, since you said Auslogics was horrible.

Imma leave it now while I sleep... :|
 

Speedster159 (Questionnaire galore!)
Bump.
 

Techie007 (Windows Wrangler) · Discussion Starter · #11
Ah, you're perfectly on time!

I could not upgrade XP to 7, so I had to do a fresh install (which is better for the test anyway, but takes me longer). I had just about finished setting up the Windows 7 test system when ReadyBoot quit working, leaving Windows 7 with no boot-time prefetching. It now takes over 2 minutes to boot (previously, it was just over a minute). Either way, both times are slower than XP, which is what I expected. As you can see from the chart in my first post, properly defragmented Windows XP took about 22 seconds to boot the same computer.
Then, several days ago, a very close out-of-the-blue lightning bolt knocked out two surge-protected computers here (motherboards fried; PSUs, RAM, HDDs and CPUs perfectly fine on both), and I've been busy making repairs and replacements and setting things back up. I hope to figure out what's wrong with ReadyBoot (not to be confused with ReadyBoost) and get going with the test soon!
 

Thready (I took this profile pic)
I know you posted this a while ago, but it just became relevant to my situation. I was going to download MyDefrag, but the problem is that its website has not been updated since 2009. I am running a Windows 8 laptop, and because MyDefrag hasn't been updated since 2009, there is no Win8 support (I know Win8 is basically Win7). Also, on my other computer I have an SSD for my OS and a HDD for everything else. How will this program function in that environment, since my OS files will not be subject to defragging?
 

Techie007 (Windows Wrangler) · Discussion Starter · #14
As far as I can tell in my virtual machine, MyDefrag works just as well in Windows 8.1 as it does in Windows 7, so I wouldn't worry about its age. The biggest change over time really has been the increasing popularity of SSDs. The NTFS filesystem is basically the same, and so are HDD access performance patterns. I think it is sad that of all the defragmenters on the market, a four year old free program is still by far the best, and that the very popular Auslogics defragmenter is one of the worst available (being very addictive, arranging the files in such a way that the disk quickly refragments).
In most cases, I honestly don't think defragmenting is going to be worth the effort with any defragmenter other than MyDefrag. If you don't want to use that program, I recommend the built-in Windows defragmenter, since all the runners-up to MyDefrag are payware.

As far as your computer with the SSD and HDD, you can use MyDefrag on the HDD no problem; just tell it to use the Data Disk Monthly script instead of the System Disk Monthly script. And I don't think you need to defragment your data disks frequently (as in even once a month), since performance doesn't start degrading much until the files are in a big mess (the nice, detailed disk map that MyDefrag shows can give you a good idea of how tidy the drive is). Now, if you have programs or games installed on the HDD, that makes it a "kind of" system drive, and you might want to use the System Disk Monthly script on it after all, so that MyDefrag will place your program and game files at the beginning of the drive for faster access.
 

Thready (I took this profile pic)
Quote:
Originally Posted by Techie007 View Post

As far as I can tell in my virtual machine, MyDefrag works just as well in Windows 8.1 as it does in Windows 7, so I wouldn't worry about its age.
+1 rep

Thanks for the advice. I think I will stick with the built in defrag on both systems because the SSD has my OS on it and the HDD only gets a bit fragmented from time to time.

By the way, is it true that having my OS run from my SSD also makes my HDD work better, or is that just the placebo effect? When I upgraded to the SSD, I noticed that leaving the HDD to only the task at hand (mainly game load times) made it run faster and made it less likely to become fragmented. Maybe it's just me, but there might be something to that, you know?
 

Techie007 (Windows Wrangler) · Discussion Starter · #16
Quote:
Originally Posted by Thready View Post

By the way, is it true that having my OS work from my SSD also makes my HDD work better or is that just the placebo effect? When I upgraded to the SSD, I noticed that using the HDD only for the task at hand (mainly with game load times) made it run faster and made it less likely to become fragmented. Maybe it's just me but there might be something to that you know?
Yes, there is definitely something to that, especially if you have the pagefile on the SSD too. For one, the OS consists of many small files, some of which get replaced from time to time by Windows Update. Both of those factors (small files and replaced files) greatly increase the likelihood of fragmentation. Of course, that fragmentation still happens on the SSD, although it affects performance much less because of the extremely fast random read speeds of SSDs. Also, since HDDs have very slow random read speeds, the HDD appears faster because it is no longer doing three things at once (reading various OS files, paging memory to disk, and loading your games), and can now do more sequential reads, where its data rate can approach 1/4 to 1/2 of a SSD's (as opposed to 1/100 to 1/50 when reading randomly).
 

Shadow11377 (Registered)
I like what I call Hardware Accelerated defragmenting the best. (Simply an extra HDD)
Move files from Drive A to Drive B, quick-format Drive A, Move files back from Drive B to Drive A.
Result: 0 Fragmented files, 0 Fragmented free space.

I do it all manually, and it gives me better (and noticeably quicker) results than Defraggler does. I actually like Defraggler for drives that cannot be reformatted, like system drives, but perhaps it's a piece of garbage that's unnecessarily slow; who knows. If Defraggler isn't a piece of garbage (which I doubt), then my method will probably beat any single-disk software solution out there, if you have access to multiple drives.

I don't know any programming language, but I bet it'd be easy to program something to do this. For example, in pseudocode that humans can understand but a computer can't run:
-Copy all of Drive A to Drive B
-When transfer is complete, reformat (Quick-Format Option / other Settings) Drive A
-When Reformat of Drive A is complete -> Move contents of Drive B back to Drive A

You also have the option of skipping step 2, leaving you with a defragmented copy of your files on Drive B. If you have two drives of the same size and performance, you can then swap the drive letters around so the computer treats Drive B as an unfragmented Drive A, and the old Drive A becomes a (fragmented) backup. Cycling the drives around like this, with a reformat each round, condenses your backups and your defragmentation into one step.
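The three steps above are indeed easy to script. Here's a minimal Python sketch in which plain directories stand in for the two drives and clearing a directory stands in for the quick format; it's an illustration of the logic, not a safe disk tool, and a real version would also have to preserve NTFS permissions and hard links:

```python
import shutil
import tempfile
from pathlib import Path

def roundtrip_defrag(drive_a: Path, drive_b: Path) -> None:
    """Copy-out / wipe / copy-back, with directories standing in for drives."""
    if drive_b.exists():
        shutil.rmtree(drive_b)
    shutil.copytree(drive_a, drive_b)   # step 1: copy everything A -> B
    shutil.rmtree(drive_a)              # step 2: "quick-format" A
    shutil.copytree(drive_b, drive_a)   # step 3: copy back B -> A sequentially

# Demo on throwaway directories.
root = Path(tempfile.mkdtemp())
a, b = root / "driveA", root / "driveB"
a.mkdir()
(a / "file.txt").write_text("hello")
roundtrip_defrag(a, b)
print((a / "file.txt").read_text())  # the file survives the round trip
```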
 

Techie007 (Windows Wrangler) · Discussion Starter · #18
Quote:
Originally Posted by Shadow11377 View Post

I like what I call Hardware Accelerated defragmenting the best. (Simply an extra HDD)
Move files from Drive A to Drive B, quick-format Drive A, Move files back from Drive B to Drive A.
Result: 0 Fragmented files, 0 Fragmented free space.

I do it all manually, and it gives me better (and noticeably quicker) results than Defraggler does. I actually like defraggler for drives that cannot be reformatted, like system drives, but perhaps it's a piece of garbage that's unnecessarily slow. Who knows. If Defraggler isn't a piece of garbage (Which I doubt), then my method will probably beat any software single-disk solution out there if you have access to multiple drives.

I don't know any programming language, but I bet it'd be easy to program something to do this, for example in something humans can understand but a computer would not.
-Copy all of Drive A to Drive B
-When transfer is complete, reformat (Quick-Format Option / other Settings) Drive A
-When Reformat of Drive A is complete -> Move contents of Drive B back to Drive A

You also have the option of skipping step 2, which leaves you with a defragmented copy of your files on Drive B. If the two drives have the same size and performance, you can then swap the drive letters so the computer treats Drive B as an unfragmented Drive A, and the old Drive A becomes a (fragmented) backup. Repeating that cycle, with a reformat each time around, condenses your backups and defragments into one step.

Interesting idea. I like it when people think outside of the box and come up with creative ideas like this!


That said, while the idea has real advantages, there are also some very good reasons why it would not give the best performance:

Pros:
  1. Removes "deep" fragments (no defragmenter currently on the market does this).
  2. Creates a sorted MFT (no defragmenter currently on the market can do this).
  3. An automatic backup occurs during such defragmentation.
  4. Absolutely no risk of file loss (or corruption) on the original disk if power is lost during defragmentation.

Cons:
  1. Requires an extra HDD.
  2. Unless special software is used, the files will not be sorted by category or usage pattern.
  3. Unless special software is used, it would destroy NTFS file permissions and hard links (think about the User profile folders).
  4. Cannot result in zero file fragments, since many folder B-tree entry files will have to be expanded beyond their original starting point after more files get written to those folders, resulting in their fragmentation.
  5. Cannot result in zero freespace fragments, since the NTFS filesystem does not operate this way on its own.
  6. The B-tree files will be scattered throughout the file area instead of next to the MFT and each other, resulting in slow searches.
  7. Regardless of the fragmentation level, it will always take the "full" time to defragment. However, its full time will be shorter than the full time for "on disk" defragmentation, because of the reduced seeking required.
  8. Would not be bootable unless other software was used to copy the boot sector or write a new one.
  9. Would discard the NTFS bad-sector mapping that chkdsk or the original format created on the source HDD. Of course, such HDDs should be replaced anyway.

I recently "cloned" a HDD in order to upgrade it, and the software I used (Farstone DriveClone) actually did this (copy files and folders) instead of copying disk sectors. It even copied all the permissions and set up the hard links just as they were on the original disk. Needless to say, I was impressed. When it was done, there were thousands of fragments on the destination HDD, and the system ran slower in spite of the fact that the new HDD was faster than the old one. Defragmenting it with my experimental defragmenter restored the system's usual snappiness.

I still have the image of the test system I used earlier lying around, and I may test this sometime, just for fun, to see how it rates. If/when I do, I will add it to the charts in my first post.
 

·
Windows Wrangler
Joined
·
2,288 Posts
Discussion Starter · #20 ·
Yes, MyDefrag actually comes with a script that will move all files to the end of the drive. The script is included in its "Example Scripts" folder. To make that script appear and be selectable in the MyDefrag program, open Windows Explorer and navigate to "C:\Program Files (x86)\MyDefrag v4.3.1\Example Scripts" and copy the "MoveToEndOfDisk.MyD" file to the "Scripts" folder. When you start MyDefrag, that script will be in its list of scripts, named "Move to end of disk".
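The same manual copy could be scripted. A small, hedged Python helper follows; the folder names match the layout described above, but the install path and script name are parameters you would adjust for your own version:

```python
import shutil
from pathlib import Path

def install_mydefrag_script(install_dir: str,
                            script_name: str = "MoveToEndOfDisk.MyD") -> Path:
    """Copy an example script into MyDefrag's Scripts folder so it
    appears in the program's script list (the same copy described
    above, just automated)."""
    src = Path(install_dir) / "Example Scripts" / script_name
    dst = Path(install_dir) / "Scripts" / script_name
    dst.parent.mkdir(exist_ok=True)  # create Scripts if it's missing
    shutil.copy2(src, dst)           # copy2 preserves file timestamps
    return dst

# Example (run as administrator, since Program Files is protected):
# install_mydefrag_script(r"C:\Program Files (x86)\MyDefrag v4.3.1")
```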

I'm curious, why would you want to do this?
 