Defrag Free space?

jimojimo

Distinguished
Dec 25, 2009
13
0
18,520
Sorry, not sure if this is a 'software' or hardware question, but this forum felt more appropriate. I would like to defragment the free space on my hard drive in Windows XP. The Windows defrag tool does not do this. I am trying to defrag the page file with Sysinternals' PageDefrag, but there isn't enough contiguous free space, so I'm stuck. How do I do this? A free utility would be great, but I'm willing to pay a few bucks if there's a solid one that does the job.
Any ideas?
Thanks,
--Jim
 
Solution
Not sure if these defrag programs would work for what you are trying to do, but you can try them and see:
Auslogics: http://www.auslogics.com/en/software/disk-defrag/
JK Defrag (now My Defrag): http://www.mydefrag.com/
Defraggler (the one I use now): http://www.piriform.com/defraggler

You may also try this, but I have not personally used it: http://www.softpedia.com/get/System/Hard-Disk-Utils/PerfectDisk.shtml

I am curious as to why you are trying to defrag the page file. Also, listing full specs would help.

Also try Contig: http://technet.microsoft.com/en-us/sysinternals/bb897428.aspx
What do you mean by not enough free space? How much space do you have free and what is the size of the page file?

Anyway, I have used PageDefrag and it offers no performance increase from what I have seen. Mind you, my page files are less than 2GB (since I have more than 4GB RAM on most of my systems), so depending on your page file size it may help to defrag the pagefile.

If you need more free space, try running CCleaner.
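If you go the Contig route, a quick way to see how badly the pagefile is split is its analyze mode (flags per the Sysinternals Contig docs; the path assumes the default pagefile location):

```shell
:: Analyze fragmentation of the pagefile without touching it
contig -a C:\pagefile.sys

:: Note: Contig cannot defragment the pagefile while Windows has it open,
:: which is why PageDefrag does its work at boot time instead.
```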
 

jimojimo

Distinguished
Dec 25, 2009
13
0
18,520
I have 25 Gig of "free space," but it's not *contiguous* free space. PageDefrag needs a huge chunk (basically the size of the page file) of contiguous free space.

I run Windows defrag and it ignores free space (and doesn't really do much as far as defragging regular files either).

In the old days, Norton Utilities had a great defrag utility, but I think it stopped being usable once hard disks grew past 2 Gig. The replacement defragger Norton shipped for >2 Gig disks was junk, just like Windows' defrag is junk.
--Jim
 

jimojimo

Distinguished
Dec 25, 2009
13
0
18,520
Thanks Shadow, I will look at those.

I haven't done actual perf testing on the page file defrag, but one of the utilities said it was in about 3,500 fragments. My reason for wanting it defragged is that it just seems to me that of all the files one would want to be read and written efficiently, the one that's supposed to mimic RAM would be at the top of my list. And with this thing in 3,500 fragments, I have this vision in my head of the disk head snapping back and forth when it pages, rather than sitting on a track and just sequentially stepping through the next tracks as it reads. Maybe disks these days aren't that physically predictable -- like in the past, when one could physically place the pagefile or other files of choice on the outer tracks -- but that's the vision in my head.
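A rough back-of-envelope sketch of what 3,500 fragments could mean for a full sequential pass over the pagefile (the pagefile size and seek time here are assumptions, not measured values):

```python
# Back-of-envelope: cost of reading a heavily fragmented pagefile sequentially.
# All numbers are illustrative assumptions, not measurements from the OP's disk.

PAGEFILE_BYTES = 2 * 1024**3   # assume a 2 GB pagefile
FRAGMENTS = 3500               # fragment count reported by the defrag utility
SEEK_MS = 12.0                 # assumed average seek + rotational delay (ms)

avg_fragment = PAGEFILE_BYTES / FRAGMENTS           # bytes per fragment
extra_seek_time = (FRAGMENTS - 1) * SEEK_MS / 1000  # seconds of pure head movement

print(f"average fragment: {avg_fragment / 1024:.0f} KB")
print(f"extra seek time for one full sequential pass: {extra_seek_time:.1f} s")
```

Under these assumptions the fragments average roughly 600 KB each, and a full sequential read pays on the order of 40 extra seconds in head movement alone -- which only matters if the pagefile is actually read sequentially rather than a page at a time.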

Anyway, I'm not as concerned if an .exe requires a couple of seeks to get the whole thing read one time, but this particular machine (Dell laptop, Core 2 Duo 1.8GHz with 2 Gig RAM) hits the pagefile with regularity. I would prefer more RAM, but this is a company-issued laptop and that's the spec I'm stuck with. Thankfully they stocked my desktop with 16 Gig, so I don't have any worries on that box :)
--Jim
 
I would prefer more RAM, but this is a company-issued one and that's the spec I'm stuck with for the laptop.
I feel your pain, man. It's painful trying to multitask on a PC/laptop (esp. netbooks) with just 2GB RAM.

As far as performance goes: like I said, I have run PageDefrag and it never made any noticeable performance gain.

Anyway, I do hope one of those utils works for you. Otherwise you are probably SOL.
 

ricno

Distinguished
Apr 5, 2010
582
0
19,010


A possibility is to locate some of the largest files on your drive and move them to some other drive, perhaps your home folder on your company network, and then do a defrag.

Another possibility is to set the page file very small, perhaps 100 MB, reboot, and then defrag. Set it back to your desired size and it will most likely end up in far fewer fragments. This should be the easiest way, I think.
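For reference, the shrink-reboot-restore steps above all manipulate a single registry value on XP (the value format is "path initial-MB max-MB"; the sizes shown are this example's 100 MB, and the path assumes the default pagefile location):

```
HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management
  PagingFiles (REG_MULTI_SZ) = "C:\pagefile.sys 100 100"
```

The same value is what the Virtual Memory dialog in System Properties edits, so using the GUI is equivalent.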

Also, keeping the MIN and MAX sizes the same helps prevent further fragmentation.
 

sidewinderdt

Distinguished
Jan 16, 2009
35
0
18,540
The Windows defragger isn't going to do much to consolidate free space or to defrag system files. If the latter are fragmented, the free utilities won't be able to help either.

Your best bet is to get a free trial of one of the advanced commercial utilities, leave it in auto-defrag mode so you can still use the PC while it defrags files and consolidates free space in the background, and finally run a boot-time defrag. These trial versions function just like the real thing for a few weeks, so that'll be plenty of time to optimize your disk(s).
 

ricno

Distinguished
Apr 5, 2010
582
0
19,010


Are you sure this is the case with modern Windows? Surely Windows NT/2000 would not be able to defrag system files, but I do think the newer versions can do it.

Anyway, I still suggest the OP try the two methods suggested above.

Even if the pagefile ends up somewhat fragmented it will not have any real performance disadvantage.
 

ricno

Distinguished
Apr 5, 2010
582
0
19,010


I believe it is because pagefile access happens a "page" at a time, which is 4 KB -- quite small IO that is likely to land inside a single fragment of the pagefile. I also think the pagefile is normally used much less than many believe.

However, having the pagefile in thousands of fragments is at least unnecessary, so it will not hurt to try to fix it.
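The page-sized-IO point can be quantified with a quick sketch (the pagefile size is an assumption, and equal-sized fragments are assumed, so treat the result as a rough estimate rather than a real measurement):

```python
# Rough estimate: how often does a single random 4 KB page access span
# two fragments? Assumes a 2 GB pagefile split into equal-sized fragments.

PAGE_BYTES = 4 * 1024
PAGEFILE_BYTES = 2 * 1024**3   # assumed pagefile size
FRAGMENTS = 3500               # fragment count mentioned earlier in the thread

avg_fragment = PAGEFILE_BYTES / FRAGMENTS
# A 4 KB read crosses a boundary only if it starts in the last 4 KB of a fragment.
p_cross = min(1.0, PAGE_BYTES / avg_fragment)

print(f"~{p_cross:.1%} of single-page accesses touch two fragments")
```

Under these assumptions, well under 1% of page-sized accesses would span a fragment boundary, which is why fragmentation hurts long sequential reads far more than ordinary paging.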
 
