Introduction
Several years ago, I first noticed that defragmenters vary in the improvement they bring to a system. For instance, back in the XP days, BootVis would always produce a significantly faster boot time than Norton SpeedDisk on my computer, and re-defragmenting with SpeedDisk would slow things down again. I read up on what BootVis does and concluded that SpeedDisk's lack of performance was due to an inferior file layout. At that point, I tried several defragmenters on my system, including Diskeeper, PerfectDisk, UltimateDefrag, JkDefrag and Auslogics Defrag; while PerfectDisk seemed to do the best, it still did not get boot times as fast as BootVis did. So I changed some settings in SpeedDisk and gave it a list of 250 system files (in order) to place first. It promptly made a mess of my system, creating some 300,000 fragments and then locking up, and starting it again only made the problem worse. That's when I started looking into the defragmentation APIs and ended up writing my own defragmenter. I've been using it exclusively for a while now, but when I'm asked to recommend a defragmenter, I've always been at a loss: I really didn't know how the other defragmenters compared as far as actual computer performance was concerned, and mine isn't really mature yet.
So I finally ran this test (it took me about a month, working on it off and on), and I was surprised at how poorly the majority of the defragmenters did, and I only tested the popular ones! Then there's a list of not-so-popular defragmenters that's just about as long (I did run a quick virtual machine test on many of those, but none were really worth mentioning). I was also surprised at how high the built-in Windows XP defragmenter scored (however, Microsoft redid their defragmenter in Windows Vista, and I don't know whether they made it better or worse).
Summary
If your system boots from a HDD, you should definitely keep that HDD defragmented. As of this writing (June 2013), MyDefrag appears to be the best available tool for the task.
Considerations
What really is the main objective of defragmentation? Improving your computer's responsiveness, boot speed and application launch speeds. A defragmented disk (SSDs included) also greatly increases the chances of successful file recovery in the event of filesystem corruption or accidental file deletion; however, this should not replace a proper backup.
That said, let's explore in theory the three main things that affect the performance of your disks (and thus the responsiveness of your computer):
#1: Seek time
This is the amount of time it takes for the heads on the HDD to move to another spot in order to read data that is physically stored in a different place than the previous read/write. SSDs and other flash media (e.g. SD cards) are not affected by this and are only affected by access time (see #3 below). On a HDD, the seek time is generally proportional to the seek distance. For a defragmenter, this means that in addition to removing fragments (so that the HDD doesn't need to seek multiple times to read a single file), files that are frequently accessed in a sequence should be placed in access order so that the whole group of files can be read sequentially. Very few defragmenters do this; all the defragmenters that made the top of my benchmark test did it to varying degrees.
Also, there is a technique called "short-stroking" that takes advantage of this phenomenon: if you create a small partition at the beginning of the disk and install Windows (and your programs) there, all the seeks for Windows and your programs will be confined to that area, reducing the maximum seek distance. However, a good defragmenter will get better system performance than short-stroking alone, so short-stroking your system HDD shouldn't be necessary.
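To get a feel for why confining seeks helps, here's a quick back-of-the-envelope simulation (illustrative only; positions and units are arbitrary, not measurements from the test drive). The average distance between two random positions shrinks in proportion to the span they're confined to:

```python
import random

def mean_seek_distance(span, trials=100_000, rng=None):
    """Average absolute distance between two uniformly random
    positions within `span` (a stand-in for average seek distance)."""
    rng = rng or random.Random(0)  # fixed seed for repeatability
    return sum(abs(rng.uniform(0, span) - rng.uniform(0, span))
               for _ in range(trials)) / trials

full_disk = mean_seek_distance(1.0)      # whole disk, normalized to 1.0
short_stroked = mean_seek_distance(0.1)  # a 10% partition at the start

print(f"full disk: {full_disk:.3f}, short-stroked: {short_stroked:.3f}")
# roughly 0.333 vs 0.033: about a 10x shorter average seek
```

The same logic explains why clustering the frequently used files together at the front of the disk works: seeks between them become short even without repartitioning.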
#2: Read speed
HDDs transfer data approximately twice as fast at the beginning of the disk as at the end. This is due to the greater distance (and thus higher linear speed) that the outer edge of the HDD's platters travels compared to the inner edge. SSDs and other flash media are not affected by this either. For a defragmenter, this means that the best performance on a HDD will be achieved by moving frequently accessed files to the beginning of the disk, and sorting out rarely used data caches and moving them preferably to the end of the disk, where they will be out of the way, allowing the other files to sit closer to the beginning. Quite a few defragmenters I tested at least consolidated all the files toward the beginning of the disk, but very few placed the frequently used system files first (again, those that did got the best ratings). Even fewer did a good job of detecting which files belonged at the end of the group. Only one (my experimental defragmenter) actually moved those files to the end of the disk (although Norton SpeedDisk used to).
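The roughly 2x figure follows from simple geometry: at a constant rotation speed, more media passes under the head per revolution on the longer outer tracks. A sketch with assumed (not measured) platter radii:

```python
# Illustrative only: these radii and the bit density are rough, assumed
# numbers for a 3.5" platter, not measurements from the test drive.
RPM = 7200
OUTER_MM, INNER_MM = 46.0, 22.0  # assumed usable radius range
BITS_PER_MM = 1000.0             # assumed linear bit density (arbitrary)

def transfer_rate(radius_mm):
    """Sustained rate ~ track circumference x rotations per second,
    since bits pass under the head faster on longer (outer) tracks."""
    track_bits = 2 * 3.14159 * radius_mm * BITS_PER_MM
    return track_bits * (RPM / 60)  # bits per second

ratio = transfer_rate(OUTER_MM) / transfer_rate(INNER_MM)
print(f"outer/inner transfer ratio: {ratio:.2f}")  # ~2.1 with these radii
```

Real drives vary the bit density by zone, so the exact ratio differs per model, but the outer-tracks-are-faster effect is universal.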
#3: Access time
This one affects all storage media. It is the combined amount of time it takes for the smallest I/O request to pass through the filesystem driver (usually NTFS), the partition driver, the disk driver, and the motherboard's north bridge, be interpreted and responded to by the storage device, and for the response (the data) to pass back up through that same chain. This is why one large read transfers data faster than a lot of short reads totaling the same amount of data. To read a file, at least one read has to be made for each fragment. If there are many (hundreds or thousands of) small fragments in a file, this will impact performance even on a SSD, because of all the short reads involved. All the defragmenters were able to remedy this (although UltraDefrag did a really bad job). However, several (including Auslogics, SmartDefrag and UltraDefrag) created many free-space fragments in the process. Unfortunately, this means that file fragmentation would come back quickly and with a vengeance, making the system dependent on regular defragmentation to keep from lapsing to speeds slower than the original system's.
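A crude model makes the point: every fragment costs one extra access on top of the raw transfer time. The access times and transfer rates below are ballpark assumptions for illustration, not measurements from the test system:

```python
def read_time_ms(size_mb, fragments, access_ms, mb_per_s):
    """One access penalty per fragment, plus the raw transfer time."""
    return fragments * access_ms + size_mb / mb_per_s * 1000

# Assumed ballpark figures: HDD ~12 ms access / 60 MB/s,
# SSD ~0.1 ms access / 250 MB/s, reading a 50 MB file.
for frags in (1, 100, 5000):
    hdd = read_time_ms(50, frags, 12.0, 60)
    ssd = read_time_ms(50, frags, 0.1, 250)
    print(f"{frags:>5} fragments: HDD {hdd:7.0f} ms, SSD {ssd:6.0f} ms")
```

With one fragment the SSD's read time is almost pure transfer; at 5,000 fragments the per-fragment accesses dominate on the HDD and become quite noticeable even on the SSD, which matches the observation above.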
Defragmenter finished disk layouts
Defragmenter feature chart
Frequently, when we read about the "features" a defragmenter has, we don't see performance features listed. Rather, we see things like "screensaver", "scheduling", "automatic defrag" and "frag guard." Personally, I don't care how many "features" (bells and whistles) a defragmenter has if it doesn't do its job: improving the system's performance. So let's see how the popular defragmenters compare when testing for real performance-improving features. I created the following chart by studying the disk maps (and a couple of the statistics) generated during testing, as described in the test methodology section. Since this chart was human-generated, it probably isn't perfect, but it should give you a pretty good idea:
Defragmenter benchmark results
Surprise, surprise: generally speaking, the defragmenters that actually had the more meaningful features also got the best performance in an actual system! See the benchmark results below:
• Score: An arbitrary number generated for list sorting purposes. You should be able to generate a similar number by calculating Desktop * Idle * Shutdown * Launch * Launch * Search * 10.
• Desktop: The amount of time it took for Windows to boot and display the desktop icons. Sampled and averaged 7 times for each defragmenter.
• Idle: The amount of time it took for Windows to boot and fully load all the startup programs and services. Also a 7-sample average.
• Shutdown: The amount of time it took for Windows to shut down, measured from when I hit the [R] key (restart) on the shutdown dialog to when the screen went black. Another 7-sample average.
• Launch: The average speed at which each test program launched compared to the original system. This average is made from the launch speeds (not times) of 12 programs.
• Search: The amount of time it took Disk Explorer (another program I wrote; Windows Explorer's searches are just too slow) to do an offline file search (booted from another system) of the HDD for files with "readme" in the name. This test measures the efficiency of the rapid seeking between the MFT and the B-tree structures, as well as the recursive performance of the B-tree structures. Any time a file is opened, the NTFS driver has to find it, and the prefetcher only helps out for the first 10 seconds of a program's launch cycle. The better this number, the snappier the computer will be: the faster your searches will go, the faster folders will open, and the faster certain programs (especially games with lots of little resource files) will load. Incidentally, after performing the search once, repeating it would take about 2 seconds, regardless of the disk layout. The marvelous Windows file cache in operation! Of course, that effect would degrade after the cache got purged, which working on something else for a while would normally do.
• Dfrg time: The amount of time the defragmenter being tested took to defragment C:\, starting when I clicked the [Defrag] button and ending when the program finished. I won't say that faster is better, because the faster defragmenters generally didn't do a very good job.
• File frags: The total number of fragments in all files after defragmentation, as measured by my defragmenter.
• Free frags: The number of areas with free space. The more of these there are, the higher the chance of fragmentation recurring quickly. Notice how Auslogics, Smart Defrag, and UltraDefrag greatly increased this number. Those programs will be addictive, requiring frequent defragmentation just to maintain performance; otherwise, massive fragmentation will build up over time, reducing performance and causing the phenomenon mentioned next.
• Deep frags: The number of extra MFT records needed to store all the file "extent" information for extremely fragmented files. Unfortunately, once these records are created, they will never be removed until the files themselves are deleted. Notice how several of the better defragmenters fragmented certain files so badly during defragmentation that this number noticeably increased over the original system. Each additional record imposes one more disk read, so they are not a good thing.
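For the curious, the score formula above can be transcribed literally into code. I'm assuming each input is a "times original" improvement ratio (so the untouched original system would score 10); that normalization is my guess, but the shape of the formula, including Launch being counted twice, is as stated:

```python
def score(desktop, idle, shutdown, launch, search):
    """Literal transcription of the stated formula. Each input is assumed
    to be a 'times original' improvement ratio (original system = 1.0).
    Note that Launch is weighted twice, exactly as written."""
    return desktop * idle * shutdown * launch * launch * search * 10

baseline = score(1.0, 1.0, 1.0, 1.0, 1.0)    # original system -> 10.0
better = score(1.2, 1.1, 1.05, 1.3, 1.25)    # hypothetical defragmenter
print(baseline, round(better, 1))
```

The multiplicative form means a defragmenter can't hide one terrible metric behind several good ones, and the double Launch factor tilts the ranking toward application launch speed.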
Of course, I had to include the silly (but strangely popular on the Internet) "clear the prefetch folder" tweak just to honestly show how much performance you can gain by clearing out the prefetch folder.
Ditto for disabling the prefetcher, except on SSD-based systems, although that would make a good test if somebody else wants to run it! While talking about the prefetcher, this is worth mentioning: the prefetcher appears to totally change how files are accessed by Windows. Any good defragmenter will have to work with the prefetcher as far as disk layout is concerned if it is to maximize performance.
FYI, I did not include my experimental defragmenter in these charts because it's not really ready for mass use and I didn't want to be clobbered by a bunch of requests for it. It did make #1 position on the charts, though.
I hope that you all find these charts informative!
Remember that these charts and tests are for the system disk (C:\). Disk layout isn't quite so important for data disks. As long as the MFT and B-tree structures are together at the beginning of the partition, and the file and free-space fragments are removed, most of the performance concerns for data disks will have been addressed.
The test system
So, let's see how the defragmenters performed in a real system. The test system was a Pentium 4 at 2.4 GHz with an 80 GB HDD and 1 GB of RAM running Windows XP.
I used a Pentium 4 computer mainly because I had several of them lying around. If I had used a faster CPU, the "defrag improvement" would have been a little higher, because at times, several of the programs were CPU-bound when starting.
I used an 80 GB HDD because I had two of them (one for the backup image) and because a larger HDD would take longer to image. Many newer HDDs have faster sequential read speeds and somewhat faster random read speeds, so a newer HDD would have probably raised the "defrag improvement" a little bit as well if the CPU could keep up.
I used XP mainly because that's what was on the system. Also, I didn't want to contend with Superfetch (introduced in Vista). Superfetch keeps track of your program usage patterns over time and prefetches files to RAM based on the current clock time, so that the programs you use will hopefully already be in RAM before you actually go to start them. My intention was to benchmark how well the defragmenters do their job, not how great Superfetch is. Working with the prefetcher was bad enough: I had to restart the test a couple of times until I found out how to get consistent results with it! Basically, I found that I had to wait for the boot timer to expire before starting anything, so that various programs wouldn't get randomly tacked onto the boot sequence.
Keep in mind that different fragmentation patterns would result in different defragmenter performance, particularly in the "Defragmentation time" column. For instance, if most of the files on the disk had a few fragments and the largest freespace area was tiny, programs like Auslogics or SmartDefrag that defragmented very fast in this test would have taken much longer than they did.
Preparing the test system
First of all, I disabled the built-in Windows defragmenter's automatic defragmentation feature, because I didn't want it skewing the results by changing the disk layout during the test, or slowing things down by starting at random during a test. Interestingly, that feature seems to be broken (or somehow disabled) on many of the crapped-out computers I have come across.
Next, after installing the programs that I wanted to include in my "launch" test, I ran all of them one by one, several times, to ensure that the prefetch data was up to date, so that any defragmenters "so inclined" would have good file usage information. I also ran "rundll32.exe advapi32.dll,ProcessIdleTasks" to force an update of the Layout.ini file. Perhaps I should have left that for an intelligent defragmenter to do, but I was afraid that my experimental defragmenter would be the only one that knew about this trick. Then I shut down the system and imaged the system disk. After imaging the disk, I booted the system back up and ran the "original system" test.
Test methodology
After restoring the system image, I would install a defragmenter. Then I would launch it, select just the system disk (the backup image disk was also present) and click the Defragment button, starting a stopwatch at the same time. I did not change any settings in any of the programs; I just used the defaults and "automatic" modes. My assumption is that if the programmers know how to write a good "automatic" function and set good defaults, they probably also know how to write good defragmentation software. After defragmentation was complete, I would launch my defragmenter (a standalone application) and scan the disk, noting the fragmentation statistics and taking a screenshot of the disk layout. My defragmenter shows a detailed disk map like JkDefrag's, really because it has nothing to hide.
Then I would reboot the computer. I would start the stopwatch as soon as Windows actually started loading, that is, the instant HDD activity started after the "Select OS" countdown reached zero (the computer had a HDD LED, another reason I selected it). Login was set to automatic. I would stop the stopwatch as soon as the desktop icons appeared. In the course of each test, the computer was rebooted 7 times; the average of these times makes up the "Desktop" column in my benchmark chart.
Also, I wrote a little program called "Prefetch Watcher" that would watch the prefetch folder and let me know when the boot timer expired, so that I could safely start launching programs. When launched, that program would also watch CPU usage and RAM fluctuation and report the exact time that the activity stopped (waiting up to 5 seconds for any further activity). That number makes up the "Idle" column; thus the "Idle" time is the boot time plus the amount of time it took to finish launching services and startup programs. I kept them separate because, with several defragmenters, the desktop was quite responsive even while the startup programs were still loading (I only tried this during some of my aborted tests).
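I haven't published Prefetch Watcher, but the folder-watching half of it can be sketched with plain directory polling (the CPU/RAM monitoring is not shown, and the .pf file name below is just an illustrative example, not taken from the test system):

```python
import os

def snapshot(folder):
    """Map each file name in `folder` to its last-modified time."""
    return {name: os.stat(os.path.join(folder, name)).st_mtime
            for name in os.listdir(folder)}

def changed_files(before, after):
    """Names that are new or were rewritten between two snapshots,
    e.g. .pf files updated when the XP boot prefetch timer expires."""
    return [name for name, mtime in after.items()
            if before.get(name) != mtime]

# Usage sketch: snapshot C:\Windows\Prefetch, wait, snapshot again,
# and report anything new or rewritten (illustrative dicts here).
before = {"NTOSBOOT-B00DFAAD.pf": 100.0}
after = {"NTOSBOOT-B00DFAAD.pf": 200.0, "NOTEPAD.EXE-336351A9.pf": 150.0}
print(sorted(changed_files(before, after)))
```

Polling a snapshot like this every second or so is enough to notice when the boot-time prefetch records stop being rewritten, which is the cue that it's safe to start launching test programs.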
I divided my list of programs into groups of 2 or 3 to try per boot. This is because of file caching: many programs share certain system files, and if I ran them all one after another, the later programs would launch much quicker than if they were launched first, because many of those shared files would already have been read and cached by an earlier program. So I selected programs that I thought would have minimal impact on each other's launch speeds, ran them 2 or 3 per boot, and then rebooted the computer to clear the cache before trying some of the other programs. During normal operation, the cache seems to forget some of these files over time anyway, so running a bunch of programs one after another would be an unnatural test (and besides, I wanted to test the HDD's performance, not the superb abilities of the Windows file cache).
Upon launching a program from the desktop, I would start the stopwatch. When the program finished loading (the document appeared, the music or video started playing, or whatever was applicable for that program), I would stop the stopwatch. I always had the document- or media-oriented programs open some file. Each of those times was compared to the original system's time and converted (by computer) to a "times original" figure (i.e. this program started 1.67x as fast on the defragmented system). These numbers were averaged together across the 12 programs, and that's what you'll see in the "Launch" column.
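To be clear about "speeds (not times)": each program's figure is its original launch time divided by its new launch time, and those ratios are what get averaged. A toy example with hypothetical numbers for three of the twelve programs:

```python
def launch_ratios(original_ms, defragged_ms):
    """'Times original' speed for each program: original time / new time,
    so 1.67 means the program started 1.67x as fast as before."""
    return [orig / new for orig, new in zip(original_ms, defragged_ms)]

# Hypothetical stopwatch times (milliseconds) for three programs:
orig = [5000, 1200, 800]
defr = [3000, 1000, 800]

ratios = launch_ratios(orig, defr)
avg = sum(ratios) / len(ratios)
print([round(r, 2) for r in ratios], round(avg, 2))
# -> [1.67, 1.2, 1.0] 1.29
```

Averaging speed ratios rather than raw times keeps one slow-starting program from dominating the column: each program contributes its relative improvement equally.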
Now, after running through the whole launch-speed test regimen, I would reboot the computer and run the entire test over again, replacing the numbers I got the first time. This is because the prefetch files seem to need updating after defragmentation: during the first pass they got updated as I started each program, but a program's launch speed only reflects final performance the second time it is launched, after the new prefetch files have been written. Many of the programs launched noticeably faster during the second round of testing than they did the first time around.