Realities of Hard Drive Fragmentation

We have all heard that defragmenting a hard drive (HDD) can give you a boost in performance. Is this just an old wives' tale, or is it grounded in reality?

Most of us, whether we are top-notch IT talent, regular Joes, or somewhere in between, know that taking care of a computer and doing some basic upkeep can keep it running at a decent clip.  Defragmenting an HDD is one of those basics we should all do, and it has been ever since mechanical HDDs came into existence.  But is this good housekeeping practice based on actual results, or just one of those myths about what to do with your computer to keep it running well?

The Test Setup

To set this test up and keep it fair, I used VirtualBox. The image I built is based on Windows XP SP3 in a 14.9 GB virtual disk.  To fragment the drive realistically, I did the following, in order:

1. Downloaded and installed all available Windows updates (critical or not) from Microsoft as of 02/08/2010.
2. Installed MS Office 2007 and downloaded all of its updates, then uninstalled Office and its updates.
3. Copied many photos to and from the drive, then removed every other photo (over 2,000 photos).
4. Reinstalled Microsoft Office 2007 and downloaded its updates again, then deleted all of the remaining photos.
5. Used a program called Fragger to purposely fragment the rest of the drive (Fragger wrote 5,706 MB of pure fragmentation to the image).
6. Installed all of the software to be used in this test (the defragmenters) so it would be preinstalled on the backup image.

Finally, I used the function built into VirtualBox to make a backup image of the virtual machine, so that I could restore from that backup and give each defragmenter a fair chance on the exact same fragmented dataset.

To start things off, I needed some sort of baseline for performance.  After much searching, I settled on a free tool called HDTune.  It isn't as full-featured as the paid version of the program, but for the purposes of these tests it was more than enough.  It gave me four readings that I felt were the most relevant to the tests I was doing: Minimum Read Speed, Maximum Read Speed, Average Read Speed, and Access Time.

I did not measure defragmentation time, as I started each defragmenter before I went to bed each night, so time didn't matter to me.  I feel it is safe to assume most others do something similar.  Since defragmentation is something that should be done at least once a month (in my opinion), it tends to be scheduled for when the PC is not in use, such as overnight while you are sleeping, which is why I tested in this manner.

I then found myself ready to run the tests.  Then a thought occurred to me: maybe VirtualBox would have a massive impact on the results, so I needed a comparison of my HDD's read speed outside of VirtualBox and inside of it.  So I ran those tests too.  I also decided that once all the tests were done, I would take the winner and defragment my physical hard drive (previously defragmented with MyDefrag before building the VirtualBox image) to see if it improved anything, especially after deleting and restoring multiple 15 GB images over the course of the tests.

The version of VirtualBox used in this test was 3.0.12r54655.  The HDD I used, and from which I got the original base results to compare against, is an ST3250410AS with Windows XP installed on it.  Also, where needed, I will note which options I used with each defragmenter in an attempt to get the most out of the program.

The Defraggers

The defragmenters tested in this roundup were:

* Windows XP's built-in Disk Defragmenter
* MyDefrag 4.2.7
* Defraggler 1.16.165
* Puran Defrag Free 7.0
* Auslogics Defrag 3.1.2.90
* Vopt 9.21

The Results Inside of Virtual Box

The first thing I needed to do inside of Virtual Box was to get a baseline reading that I could compare against.  The first reading I took was just after installing Windows XP SP3 and all of the updates – critical or not – available at the time.  This would represent the fresh system install speed that we all know and love so well.  The results are as follows:

Minimum Speed (MB/s)  Maximum Speed (MB/s)  Average Speed (MB/s)  Average Access Time (ms)
0.7                   58.6                  39.2                  25.6

I then needed a baseline reading after all the work I put into purposely fragmenting the image; the results are below. What surprised me was how far the average read speed fell.

Minimum Speed (MB/s)  Maximum Speed (MB/s)  Average Speed (MB/s)  Average Access Time (ms)
4.9                   57.7                  31                    13
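To put the fragmentation penalty in perspective, the change in average read speed between the fresh-install baseline and the fragmented baseline can be computed directly from the two tables above; a quick sketch in Python:

```python
# Percent change in average read speed, using the HDTune figures from
# the two baseline tables above (fresh XP install vs. fragmented image).
fresh_avg = 39.2    # MB/s, fresh install
fragged_avg = 31.0  # MB/s, after deliberate fragmentation

drop = (fragged_avg - fresh_avg) / fresh_avg * 100
print(f"Average read speed changed by {drop:.1f}%")  # prints about -20.9%
```

That is roughly a 21% drop in average read speed from fragmentation alone.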

Now, for a baseline defragmentation, and the standard against which all the other defragmenters in this roundup will be measured, I ran the built-in defragmenter that comes standard with Windows XP.  The results were very surprising, and they gave me great hope for what the others might accomplish, since they are all touted as 'better' than the Windows default.

Minimum Speed (MB/s)  Maximum Speed (MB/s)  Average Speed (MB/s)  Average Access Time (ms)
5.7                   59                    50.6                  7.5
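The built-in defragmenter's gains relative to the fragmented baseline can be quantified the same way; a small calculation from the two tables above:

```python
# Improvement from the built-in Windows XP defragmenter, relative to
# the fragmented baseline table above.
fragged = {"avg_mbps": 31.0, "access_ms": 13.0}
xp = {"avg_mbps": 50.6, "access_ms": 7.5}

speed_gain = (xp["avg_mbps"] - fragged["avg_mbps"]) / fragged["avg_mbps"] * 100
access_gain = (fragged["access_ms"] - xp["access_ms"]) / fragged["access_ms"] * 100
print(f"Average read speed: +{speed_gain:.0f}%")       # prints +63%
print(f"Access time improvement: {access_gain:.0f}%")  # prints 42%
```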

3rd Party Defraggers

Now, with that as a nice set of baselines to compare against, I went on with the rest of the tests.  Each time I restored the image, I would defragment it, reboot, and then run the tests.  The next results are from my defragger of choice before I ran these tests: MyDefrag.  After the tests, would it remain my favorite?  We shall see.  For options, I chose Defrag Monthly for the most thorough defrag.

Minimum Speed (MB/s)  Maximum Speed (MB/s)  Average Speed (MB/s)  Average Access Time (ms)
22.4                  204.8                 67.9                  8.4

These results had me excited: that big of a boost over the default defragger in everything except access time (which, to slightly spoil the results, would prove to be the only measure where the XP defragger came out ahead in the end).  With these results in, it was on to the next defragger: Defraggler by Piriform.  For options, I chose Move Large Files to End of Drive.

Minimum Speed (MB/s)  Maximum Speed (MB/s)  Average Speed (MB/s)  Average Access Time (ms)
23.5                  206.6                 68                    8.2

The next candidate is one I found quite by accident while browsing slickdeals.net for deals on Steam.  Puran Defrag is a program that used to be paid software but is now freeware.  I decided to give it a chance in this roundup to see whether a formerly paid program was worth the money it used to cost.  The option I chose was: Boost overall system speed by Puran Intelligent Optimize.

Minimum Speed (MB/s)  Maximum Speed (MB/s)  Average Speed (MB/s)  Average Access Time (ms)
1.3                   207                   50.9                  8.6

Next up was a program I had used in the past, Auslogics Defrag, which I stopped using once I found MyDefrag (formerly JKDefrag) because my system didn't feel as fast after its defrags.  Would this test bear that out?  The options I chose were: remove temp files before defragmenting, move system files to beginning of disk, defrag and optimize.

Minimum Speed (MB/s)  Maximum Speed (MB/s)  Average Speed (MB/s)  Average Access Time (ms)
5.4                   206.8                 58.9                  9.0

The final program in our roundup was suggested to me by fellow forum member C627627.  Vopt was a program I had never heard of, but since it was a free download, I added it to the test to see how effective it was.  The options I chose were: tighter packing, move system restore to end of drive.

Minimum Speed (MB/s)  Maximum Speed (MB/s)  Average Speed (MB/s)  Average Access Time (ms)
19.7                  202.7                 57.9                  9.3

The Results Outside of Virtual Box

Before running this battery of tests, and knowing what would happen to my HDD after deleting and restoring a 15 GB image file over and over, I ran a preliminary speed test on my HDD outside of VirtualBox, so I could judge what effect, if any, running the tests inside of VirtualBox had.  Those results are below.

Minimum Speed (MB/s)  Maximum Speed (MB/s)  Average Speed (MB/s)  Average Access Time (ms)
4.7                   81.8                  73                    15.3

As a final test of my physical drive, to see whether the overall results winner actually improved anything, I used Defraggler with the same options as in the VirtualBox test, then ran HDTune one last time.

Minimum Speed (MB/s)  Maximum Speed (MB/s)  Average Speed (MB/s)  Average Access Time (ms)
52.1                  81.8                  74                    15.2
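Putting the two outside-VirtualBox tables above side by side (figures transcribed from the article) makes the before/after deltas easy to see; a small sketch:

```python
# Outside-VirtualBox HDTune results before and after running Defraggler,
# transcribed from the two tables above.
before = {"min": 4.7, "max": 81.8, "avg": 73.0, "access_ms": 15.3}
after  = {"min": 52.1, "max": 81.8, "avg": 74.0, "access_ms": 15.2}

for key in before:
    delta = after[key] - before[key]
    print(f"{key:10s} {before[key]:6.1f} -> {after[key]:6.1f} ({delta:+.1f})")
```

Minimum read speed jumps by 47.4 MB/s while the other figures barely move.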

So, even though the improvements were minimal aside from minimum speed, the drive still came out ahead of even the MyDefrag results from before all the other tests.

Results Summary

Minimum Read Speed (Higher is better)

Maximum Read Speed (Higher is better)

Average Read Speed (Higher is better)

Access Time (Lower is better)
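To make the summary concrete, the in-VirtualBox results from the tables above can be collected and ranked by average read speed; a quick script (values transcribed from the article):

```python
# In-VM HDTune results from the tables above:
# (min MB/s, max MB/s, avg MB/s, access time ms)
results = {
    "Windows XP built-in": (5.7, 59.0, 50.6, 7.5),
    "MyDefrag 4.2.7":      (22.4, 204.8, 67.9, 8.4),
    "Defraggler 1.16.165": (23.5, 206.6, 68.0, 8.2),
    "Puran Defrag":        (1.3, 207.0, 50.9, 8.6),
    "Auslogics Defrag":    (5.4, 206.8, 58.9, 9.0),
    "Vopt 9.21":           (19.7, 202.7, 57.9, 9.3),
}

# Rank by average read speed, highest first.
for name in sorted(results, key=lambda n: -results[n][2]):
    mn, mx, avg, access = results[name]
    print(f"{name:20s} avg {avg:5.1f} MB/s, access {access:.1f} ms")
```

Defraggler and MyDefrag finish within 0.1 MB/s of each other on average read speed.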

Conclusions

With the results of the tests in, it seems the 'old wives' tale' of defragmenting your HDD to maintain performance is not only true, the results were astounding.  Even if you don't use a 3rd-party defragger, the tool built into Windows itself gives a massive boost in performance.

I do realize these results are specific to my machine, and to a virtual machine inside of VirtualBox at that.  Even so, there are two 3rd-party defraggers I can confidently recommend to anyone, as their results are so similar: their overall average read speeds differ by only 0.1 MB/s.

So, congratulations are in order for MyDefrag 4.2.7 and Piriform Defraggler 1.16.165!

- TollhouseFrank


69 Comments:

petteyg359's Avatar
Perhaps the graphs should be updated. 'b' generally refers to bit, rather than 'B' for byte, and a 39-megabit read speed is pretty slow. Also, any chance you could run the test with this defragger?
TollhouseFrank's Avatar
LOL. Probably is wrong. I always forget which is which. Anyway, as soon as I get home from work, I'll check and see if I still have the fully fragged image (I should), and if so, I'll give your defragger a shot and report the findings for ya.

*edit*

Just as an update: the speeds are in megabytes, not megabits. If it were megabits, then I would still be defragging the image...
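For anyone following the Mb/MB distinction in this exchange: a byte is eight bits, so converting the article's figures is a one-line sketch:

```python
# MB/s (megabytes) to Mb/s (megabits): one byte is eight bits.
def mb_per_s_to_mbit_per_s(mb):
    return mb * 8

# The fresh-install average from the article, as an example.
print(f"39.2 MB/s = {mb_per_s_to_mbit_per_s(39.2):.1f} Mb/s")  # prints 313.6 Mb/s
```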
Neuromancer's Avatar
Proof is in the pudding as they say

I like the O&O defrag utility. It is not free, but it is REALLY fast, with lots of options for defragging, ranging from stealth (not fast, but you can use the PC while it runs) to access (which puts all the most commonly accessed files at the beginning of a drive).
TollhouseFrank's Avatar
Everything I used was free, and Puran Defrag USED to be a pay-for program... but is free now.

And you are right, the proof is in the pudding. I personally didn't think there would be such a gap between the built-in default defragger in Windows and the others. I figured they would be a little faster, not a lot faster as they all turned out to be, except for Puran, which compared to the default defragger is a wash.
hokiealumnus's Avatar
Great job on the article Frank. A lot of work went into this folks, well worth a read if you haven't already!
TollhouseFrank's Avatar
thanks hokie!

I'm hoping this article will help out a lot of people, not only here, but all across the spectrum of the IT world, from regular Joes to SysAdmins and everyone in between.
petteyg359's Avatar
I know they're in megabytes, which is why I suggested you change the labels on the tables. There's a difference between Mb and MB.
Aynjell's Avatar
I am curious how good the Windows 7 defragger is. It seems to take ages, so I'm really curious how well it optimizes the disk.
Chixofnix's Avatar
Thanks Frank - very enlightening topic and well-presented!
glorp's Avatar
Excellent information. If you have the total run times of all the defrag runs of the different utilities it would be useful to include those too.

I've used MyDefrag for some time now and I can say for certain that the Monthly defrag strategy is very complete, but it can also take forever to run. I use a modified daily/weekly strategy myself.
Jolly-Swagman's Avatar
Nice Article and a great Read - thanks frank.
TollhouseFrank's Avatar
These results are for 1-and-done defrags. Some of them, once the initial defrag is run, run much faster the next time.

For example, Defraggler and MyDefrag both took close to 5 1/2 hours to finish... and from my prior experience with both, the next run is 'near instant'... like, 5-10 minutes.

However, this test wasn't designed to test how fast the job got done, just how fast the results were once it got done.

Thanks for the comments guys!
glorp's Avatar
Not if you wait another month to do it again.
TollhouseFrank's Avatar
Just checked: I do have the fully fragged backup image still. If there are additional defraggers you want tested, and as long as I can download/use them for free (even if it is a reduced demo mode), get me the link and I'll test them and post the results for ya.
[AK]Zip's Avatar
Diskkeeper!

You can test it using a 30-45 day free trial.

Edit: http://www.diskeeper.com/
corruption's Avatar
Perfect Disk is what I've used for years, always had good results with it. I'd be interested in seeing how it stacks up to the others in your test.

You can get a trial from their site here.
TollhouseFrank's Avatar
Defrag Express Settings:
Background File Placement Optimization (Boot Optimize) disabled - The program suggests doing this, so I'm doing it.
Thorough - Fast Consolidate, Move Directories close to MFT, Move frequently used files to high performance area.

Results:
Minimum Speed - 1.8MB/sec
Maximum Speed - 200.1MB/sec
Average Speed - 64.3MB/sec
Access Time - 18ms
TollhouseFrank's Avatar
Diskeeper 2010 settings:
None. Left at defaults from install.

Results:
Minimum Speed - 0.8MB/sec
Maximum Speed - 202.3MB/sec
Average Speed - 58.2MB/sec
Access Time - 18.6ms
jonspd's Avatar
glad to see it up
DonnEdwards's Avatar
I hate to rain on your parade, but using HDTune to measure performance is pointless, since it completely ignores the layout of the files, fragmented or not, and just does a direct read from the drive. This test can be run on a drive that has not even been partitioned or formatted.

So, unfortunately, your test results prove nothing other than the margin of error involved in measuring a hard drive performance when there are other system processes running.

I have not used VirtualBox, and you have not explained how Virtualbox maps its image sectors to the drive image it created; or whether the 15GB image, once restored in each case, was fragmented or not.

HDTune was not designed to be run inside a virtual environment, so I don't see how you could rely on it for accurate measurements.

Your tests need to read the speed with which a FILE can be read, not a hard drive sector. HDTune reads sectors.
DonnEdwards's Avatar
One other important omission: PerfectDisk has for some time been touting its ability to do a defrag INSIDE a virtual environment. I'm not sure if VirtualBox is supported, but you should at least have tried it out.
Aynjell's Avatar
I agree that this is an important part of what should have been considered here. It's a low-level benchmark that ignores the filesystem, as near as I can tell.

Any test that actually reads files, writes files etc using a batch script or some other type of benchmark would have been better. PCMark tests the disc, doesn't it?
visbits's Avatar
Ugg... This makes me hate hard drives and platters more.


SSD > Life
I.M.O.G.'s Avatar
Donn, thanks for your input, it's clearly helpful to draw these concerns out.

I haven't personally investigated this, but trusting your word as I don't have time to verify right now, let's assume we should use something else. Do you suggest a more appropriate tool?

Also, everything considered here is free; does PerfectDisk offer a free version? (EDIT: It appears a 30-day trial is available. I'm not sure whether any of the items considered in this article are trialware.)

The great part about having community input is that we get good review of our articles. Hashing it out and making the corrections is not a problem - our goal is to relay good information. Overclockers has always been about free speech and publishing reader and member generated content, so anyone can respond in comments or in article form to keep everyone on their toes.
S3CUR3's Avatar
Excellent post! I learned a great deal.
TollhouseFrank's Avatar
Ah. Seems there is something I missed by using HDTune. I am more than open to anything that improves my testing methods. I still have the base testing image ready to go. Of course, it will take a full night per defragger to re-run the tests... but anything worth doing is worth taking the time to do it right.
petteyg359's Avatar
Open-source filesystem benchmark: IOzone

Might provide better results than HDtune.
Mr Alpha's Avatar
It might not be as bad as that. If I understood your methodology correctly then you ran HDTune inside the VM. If this is the case then it doesn't have access to the underlying hard drive, only to the virtual hard drive. And since the virtual drive is a file on the real drive, then the fragmentation of the virtual-drive-file would have an impact on the results of HDTune.

What I am not sure of is how the fragmentation inside the virtual drive relates to the fragmentation of the virtual-drive-file, and what exactly the defragmenters do when they defragment inside a VM.
TollhouseFrank's Avatar
Right now, all I can say is that there is work going on behind the scenes to see if there is a way to improve the tests. I am more than willing to scrap these results if there is a better method to get what I want: to prove whether defragmenting/optimizing has an effect on raw read speed.

Just so any future readers know, I know that access time is a crucial part of defragmentation. However, I felt that a few milliseconds would go unnoticed compared to how fast the file is read once it is found on the drive.

But again, if my methods are flawed, or if they are good but the tests are flawed, I am willing to accept that I made a big boo-boo and redo the tests once a method can be agreed upon by me and a few others behind the scenes.

I don't feel it would ruin anything to say that Puran already contacted me with suggestions to improve my methods, to which I replied with some questions of my own. Even if this takes a while before I can retest, I will do the tests again if it is decided there is a better method to do this.
TollhouseFrank's Avatar

This may be true. I am not quite sure how it does so, but I know the image is 3 parts: one part is a config file (just a few KB in size), another I'm not sure what it is, but it's just a few KB in size, and then the image file, which is 14.9 GB, as its own separate file on the drive.
HunterZ's Avatar
Thanks for the article.

I am curious to see how UltraDefrag does in this test, since I keep hearing people mention it lately due to it being open source: http://ultradefrag.sourceforge.net/

Issues with the testing:

- The test setup seems a bit artificial, but was definitely a good and well-documented attempt to at least set up *something* useful.

- I don't see how deleting and restoring one large file over and over is going to have any effect on fragmentation of other files.

I've been using MyDefrag, but it's slow and doesn't do boot-time defragging. Maybe Defraggler or UltraDefrag would be a better choice.
I.M.O.G.'s Avatar
Fragmentation was achieved effectively through a tool designed for this very purpose:
http://software.bootblock.co.uk/?id=fragger
HunterZ's Avatar
My second point was referring to the last section of the article (testing Defraggler outside of VirtualBox), which didn't mention use of that tool.
TollhouseFrank's Avatar
What I noticed happening to my HD as I kept deleting the image and restoring it was that the system reserve files ended up getting thrashed. I believe that is where most of the fragmentation on my 250 GB main drive happened.

These tests were in an artificial environment. This was designed as semi-theorycrafting, mostly to see which defragger 'put Humpty Dumpty back together again'.

I am currently researching other tools to test HD performance, to see if I can get more useful data and update the article.
EarthDog's Avatar
Definitely a great article!!! Good work!

One thing I would personally like to see is: what are the real-life benefits of this? The easiest to test would be boot times. Another idea that came to mind is file transfers before and after.
HunterZ's Avatar
Definitely appreciate the effort, it's an interesting read.
TollhouseFrank's Avatar
You see? This is the sort of thing that should have occurred to me to test, yet did not! Thanks for the question Earthdog! In the next round of tests, these will definitely be added to the pool of tests. I guess the big thing on file transfer though, is how to properly do it. Do you mean copying a file from the drive back to the drive? Do you mean through the virtual network link from my machine to the VM? Or do you mean file transfer across the network?
EarthDog's Avatar
The file transfer is going to be a tough one to measure consistently... I'm not sure how to do it, honestly. I would say have a huge movie or something, but presumably that would be contiguous and may not get defragmented... I don't know. I would say drive-to-drive would give you the best results, though. With the network and VMs involved, I would imagine their overhead and how they function in general would play a role in anomalous results.

If I were in your shoes, I would look to see, somehow, how those things are tested and apply that.
petteyg359's Avatar
IOzone will do that type of testing for you, although reading more, it may be overkill. I'm not seeing many similar tools that are free, though.
TollhouseFrank's Avatar
I'm getting ready to relax for the night.

Thanks, Pettey. I will definitely look into that this weekend after my daughter goes back to her mother (it's my weekend with my daughter and I plan to enjoy every second of it with her).

I believe that in order to do this again, I need to rebuild the fragmented image to include the other defraggers listed in the responses, plus the IOzone software. I'll also leave in HDTune, just to see if it really shows the true speed difference. I'm also looking for one more benchmarking program; I'm leaning towards PCMark, but I'm not sure which (free) version would measure what I'm looking for: raw speed before and after.
Mr Alpha's Avatar
File-Copy Test might be worth looking at.
zzzzzzzzzz's Avatar
There are two (not one) different disk defragmentation software programs that come with Windows XP:
The Disk Defragmenter snap-in (Dfrg.msc)
defrag.exe
You should clarify which "built in" defragmentation software was used.

It is also important to know what file system defragmentation was attempted on.

The above notwithstanding, the familiar latency (access time) / throughput (average read speed) trade-off seems to hold: latency can be sacrificed (increased) to acquire greater throughput, and throughput can be sacrificed (decreased) to acquire lower latency.
AlucardCasull's Avatar
MyDefrag is now 4.2.8; any chance of running it and seeing if they improved?

Thanks for the article, it gave me the motivation to defrag!
TollhouseFrank's Avatar
Two different commands that lead to the same exact program, as far as I can tell, which honestly looks like a ripoff of Diskeeper from about the time XP came out.

However, just for the sake of making sure I don't miss anything, I'll use both of those in the next round of tests.
I.M.O.G.'s Avatar
Those utilities are identical, just 2 different ways of getting to the same thing.

I haven't read it yet since it's blocked by websense on my current network, but Donn who posted earlier has some material that may be helpful, though his goals with his article were different:
http://donnedwards.openaccess.co.za/...t-winners.html
TollhouseFrank's Avatar
I'm gonna let this sit for the weekend. I need to unwind and de-stress from the week at work. Starting Monday, I'll use some spare hardware here at the shop to build a machine specifically for these tests, and build an image of said machine once I'm done, so that I can get results outside of VirtualBox. That ought to get rid of about 50% of what most readers have led me to believe affects my tests the most.

The other 50% is finding software that will measure what I'm trying to measure and testing it to see if it works the way I want it to work.
I.M.O.G.'s Avatar
I like that plan Frank, and I'm looking forward to your results.
TollhouseFrank's Avatar
Thanks, IMOG. I've got a spare 80 GB drive here at the shop that I can throw a shop copy of XP onto, and I can use the bench machine to get it set up to test this again. Once I have it ready to go, I just need to find something I can use to make backup images of the drive so that I can restore it over and over.

It's one thing to make a backup of a VM. It's another to do a backup of a whole OS disk. I need to make sure that whatever I use to do the backups is free and doesn't mess with the dataset.
petteyg359's Avatar
Using a Linux LiveCD, you could dd if=/dev/sda of=/dev/sdb bs=4096 (assuming the source drive is at /dev/sda, and you're copying to another drive, rather than a file). You could also use /dev/sda1 and /dev/sdb1 to copy just the partition (again, assuming the Windows partition is /dev/sda1, and you've partitioned /dev/sdb with the same partitions), rather than the whole disk, or use a file in of= to store it to a file rather than another disk. To write it back to disk, just reverse the if and of parameters. bs=4096 tells it to copy 4K at a time.
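The dd approach above can be demonstrated without real disks by using ordinary files as stand-ins for /dev/sda and /dev/sdb; a sketch (paths and sizes here are arbitrary, and on real hardware if= would point at the disk device from a LiveCD):

```shell
#!/bin/sh
# Sector-level copy demo with dd, using files as stand-ins for disks.
set -e
mkdir -p /tmp/dd-demo && cd /tmp/dd-demo

# Stand-in for the source disk: 8 MiB of pseudo-random data.
dd if=/dev/urandom of=source.disk bs=4096 count=2048 2>/dev/null

# "Back up" the disk to an image file, 4 KiB at a time.
dd if=source.disk of=backup.img bs=4096 2>/dev/null

# "Restore" the image to a new target and verify bit-identity.
dd if=backup.img of=restored.disk bs=4096 2>/dev/null
cmp source.disk restored.disk && echo "images match"
```

Because dd copies raw blocks rather than files, a restored copy preserves the exact fragmentation state of the source, which is precisely what the test methodology needs.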
TollhouseFrank's Avatar
Before I do that, Pettey, I need to get the image built again, and have a way that I am 100% sure of to back it up and restore it, so that each defragger gets the same chance with the same dataset.
petteyg359's Avatar
That's the point of dd. Copy the disk, not the files.
bcsizemo's Avatar
I hope I don't point out something someone else has already brought up, but here goes...

If you are using a VM to test with, you are essentially using a single file as the virtual file system. This is in effect similar to short-stroking a drive. So if that VM file is 15 gigs on an 80 gig drive, you are going to be limited to a theoretical max transfer speed depending on where that 15 gig file resides on the 80 gig drive... (and if you are deleting and copying it back every time you start a new test, that might cause issues with fragmentation of the file itself...)

I would think in order for the test to be valid you could not use it inside a VM. Or at best if you did, you would be testing the impact of fragmentation on the VM itself. (And either way this is what needs to happen in order to form the baseline image: )

Like several others have said, you have to copy the bits, or more exactly, you need to sector-copy the initial image. This would give you a consistent starting point to measure from each time. If you simply copy the VM file back to the drive, you may end up with file fragmentation. Or, if you made an image of the system, it would probably defrag itself during the creation/restore process. (I know Ghost used to work like that for a standard disc clone.) I would also install all of the test programs first, to minimize changes to the initial image as much as possible.

If we're going more for a copy/move/creation type of test setup, having a second drive as the frag drive might be a good idea. Instead of using/copying a whole OS, just have a secondary data drive be the fragmented one. While it wouldn't give you results as tangible as restoring the OS each time, it would be somewhat easier.

Don't want to rain on any parades, but just wanted to point out that you could have fragmentation inside fragmentation....
zzzzzzzzzz's Avatar
This should not matter when the results are used only for relative comparisons. However, due to rounding or truncating decimal values, the accuracy of the relative comparisons might be decreased.
This is something I quite dislike when working with virtualization software (VMWare Workstation).

When using a file to represent a storage device, much time is spent dealing with fragmentation. When the represented information becomes fragmented, it is often necessary to defragment three times to defragment a virtual machine and its represented information in a timely manner: first the container file is defragmented, then the represented information (operating system, virtual files, etc.) is defragmented, then the container file is defragmented again (after becoming fragmented by the defragmentation of the virtual content).

A VMWare virtual machine may be set to use a physical disc instead of a file to store the virtual content.

I have never used nor have examined VirtualBox.
I.M.O.G.'s Avatar
If you are using a virtual disk for your VM, you should allocate all required space on the host for the virtual disk from the start. The situation you propose can only occur if disk space is not preallocated on the host. Defragmenting a virtual disk within VMware is actually only possible if you're using a growable disk.

Assuming a properly configured virtual disk on the host, there are two possible fragmentation scenarios when using a VM with a virtualized disk. The virtual disk file can be fragmented on the host, which is resolved by defragmenting the host system. The guest can experience fragmentation within this file, which is resolved by defragmenting from within the guest.
zzzzzzzzzz's Avatar
I use fixed-size disks.

There are three levels of fragmentation that may affect a virtual guest file system for VMWare virtual disks:

Fragmentation of the host system file system (including files to store virtual machine information). This fragmentation may be defragmented on the host system by using a defragmentation tool.

Fragmentation of the content of host file storing a guest (virtual) system. Defragmenting the contents of the host file may be performed using a utility provided by the virtualization software vendor/owner (may be incorporated into main virtualization software).

Fragmentation of the guest (virtual) system file system. This fragmentation may be addressed by defragmenting the guest file system from the running guest system. Defragmenting a guest file system fragments the host file storing the guest (virtual) system.

It is possible for the virtual (guest) system host file to be unfragmented on the host system, but fragmented within.
I.M.O.G.'s Avatar
That is true for sparse/growable disks, but not for preallocated disks.

You cannot defragment a virtualdisk file on the host, unless you are using growable/sparse disks on vmware.

This document reflects the idea you have:
http://www.vmware.com/support/ws55/d...sk_defrag.html

What it does not state, is that you cannot defragment a preallocated virtual disk file:
Source: http://www.vmware.com/support/ws55/d..._examples.html

Additionally, notice how this defragmentation tip for vmware focuses on growable disks (or sparse virtual disks, same thing, different terminology):
http://blogs.vmware.com/teamfusion/2...fragmenta.html

So if you are using a sparse virtual disk, which is the default in VMware Fusion, then your defragmentation is a 3-part process. The only advantage of using a sparse virtual disk is that your VMs will take up less space on the host system; the tradeoff is non-optimal performance and more complicated defragmentation. For anyone else who is preallocating the disk space for the VM, a 2-step defragmentation on the host and guest is all that is required.
zzzzzzzzzz's Avatar
The attachment to this post shows settings I usually use when creating a VMWare virtual machine in VMWare Workstation 6.5.3.

I have noticed that the virtual machine host file is defragmentable when these options are used.

I note that I do sometimes take virtual machine snapshots. This seems to add large files whose aggregate size is less than the size of the original virtual host file(s). Perhaps this may be considered "growth".
Joeteck's Avatar
You might want to look into Diskeeper 2010 as well; I swear by this form of defragmenting. But for a free solution I use Defraggler too. Very slow, but it works.

I never knew there was a myth about defragging. It has always been taken as proven since Windows 98 that defragging speeds up file access and booting... Just wish there were a way to track it and test each file, for real-world proof...
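On the wish for per-file, real-world proof: one rough way to track this is to time a sequential read of an individual file before and after defragmenting it. The sketch below is an illustrative shell script (the 16 MB sample file and its name are my own assumptions, not anything from the article), and repeat runs will be inflated by the OS page cache:

```shell
#!/bin/sh
# Time a sequential read of one file and report how long it took.
# Usage: ./readbench.sh [file]  (creates a 16 MB sample if none given)
FILE="${1:-testfile.bin}"

# Create a 16 MB sample file if the target does not exist yet.
[ -f "$FILE" ] || dd if=/dev/zero of="$FILE" bs=1M count=16 2>/dev/null

START=$(date +%s%N)                            # nanoseconds (GNU date)
dd if="$FILE" of=/dev/null bs=1M 2>/dev/null   # sequential read
END=$(date +%s%N)

SIZE=$(wc -c < "$FILE")
ELAPSED_MS=$(( (END - START) / 1000000 ))
[ "$ELAPSED_MS" -eq 0 ] && ELAPSED_MS=1        # guard against a zero reading
echo "Read $SIZE bytes in ${ELAPSED_MS} ms"
```

A heavily fragmented file should show a measurably longer read time on a mechanical HDD; the difference is most visible on a cold cache, e.g. the first run after boot.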
rk15000's Avatar
Very good write-up! I was amazed by the numbers that Piriform Defraggler posted, which is also the defrag utility that I use. An average read speed of 68 MB/s is superb. That is almost on par with the numbers you posted outside of VirtualBox. Interesting.

If anyone wants to pick up defraggler, you can get it here
kundalini's Avatar
Hi Frank

Well done - thanks for being so generous with your time & energy to share the results of your defrag testing

I appreciate that the methodology is still being tweaked & the need for that discussion - but thought this summary of the results of your tests might be helpful (please let me know if anything needs correcting here):

Defraggers initially tested (reported 18/2/2010):

* Windows XP Default Defragmenter
* MyDefrag 4.2.7
* Defraggler 1.16.165
* Puran Defrag Free 7.0
* Auslogics Defrag 3.1.2.90
* Vopt 9.21

Best overall results: MyDefrag 4.2.7 & Piriform Defraggler 1.16.165


Also tested:

* Defrag Express
* Diskeeper 2010
(if I understood the results correctly MyDefrag & Defraggler still rate best)


Would there be any significant difference in results between MyDefrag 4.2.7 & MyDefragGUI 2.1 ?

----

Now here is a list of defrag tools that would be worth testing (though some may be trial versions; not sure if all of these are freeware):

* Auslogics Disk Defrag (been getting a good rap on other forums)
* IObit Smart Defrag (quite popular so worth testing)
* Vista & Windows 7 Defrag (goes without saying)
* O&O Defrag (requested by other members in thread)
* Perfect Disk (requested by other members in thread)
* UltraDefrag (requested by other members in thread)

If anyone out there knows of any other worthy freeware defraggers then please advise!
kundalini's Avatar
By the way, I've used IObit Smart Defrag every few months. I'm now doing a full Vista defrag on my notebook (with no other apps running) & it's already taken over 24hrs & is still running. Does anybody know if that's possible without there being some kind of problem?
My specs are: HP Pavilion DV9520TX with Intel C2D T7500 2.2GHz, 4GB RAM, Vista OS & all wares on a 160GB HDD & all music & docs on a 500GB HDD
I.M.O.G.'s Avatar
Might try sending Frank a PM, seems he hasn't followed up on this thread lately and maybe a poke will help - I know some real life things have been keeping him busy, but he is still around regularly.
kundalini's Avatar
Thanks - I just sent him an email - or would a PM have been better?
I.M.O.G.'s Avatar
Not sure, but that should work!
thinkpol's Avatar

Are these results still relevant?

Has the Windows 7 defrag utility been improved over XP?
Have any of these 3rd party defraggers been updated to increase performance?
I.M.O.G.'s Avatar
No, the tests would all need to be rerun to address the issues readers reported with the testing. However, the author has been busy with other things and hasn't gotten back to designing or rerunning a new round of tests.

If anyone is interested in doing the legwork involved with that after submitting a methodology which we've reviewed and approved, we'd be happy to publish updated results.
axhed's Avatar
thanks for this! just dl'd defraggler and it's running in the background

i'm so ashamed... my os drive was at 26% and the data drive was at 38%. BAD NERD!