Windows – Avoiding extreme fragmentation of compressed system images on NTFS

Tags: compression, defragment, images, ntfs, windows

Problem explanation

I'm storing Windows disk images created with wbadmin on an NTFS drive, and I found that compressing them with NTFS compression gives 1.5-2× space savings while keeping the images fully usable for restoring.

But in the process of compression, the file gets insanely fragmented, usually to above 100,000 fragments for a system disk image.

With such fragmentation, defragmenting takes a very long time (multiple hours per image). Some defragmenters can't handle it at all; they simply skip the file or crash.

The source of the problem is, I think, that the file is compressed in chunks which are saved separately.

The question

Is there a good (fast) way to get the image file defragmented while keeping it compressed (or to compress it without causing extreme fragmentation)? It could be a utility that quickly defragments a file into contiguous free space, or a utility (or method) that creates a non-fragmented compressed file from an existing non-compressed one.

Remarks based on comments/answers:

  1. External (to the Windows kernel) compression tools are not an option in my case. They can't decompress the file on the fly (to decompress a 10 GB file I need 10 GB free, which isn't always at hand; it also takes a lot of time), and they're not accessible when the system is booted from a DVD for recovery (which is exactly when I need the image available). Please stop suggesting them unless they create a transparently compressed file on NTFS, like compact.exe does.

  2. NTFS compression is not that bad for system images. It's rather good except for the fragmentation. Decompression does not take much CPU time yet still reduces the I/O bottleneck, which gives a performance boost in appropriate cases (a non-fragmented compressed file with a significant compression ratio).

  3. Defragmentation utilities defragment files with no regard for whether they are compressed. The only problem is the number of fragments, which causes defragmentation to fail whether the fragmented file is compressed or not. If the number of fragments isn't too high (around 10,000 is already fine), the compressed file will be defragmented, and it will stay compressed and intact.

  4. The NTFS compression ratio can be good, depending on the files. System images are usually compressed to at most 70% of their original size.

    A pair of screenshots is attached for those who don't believe it, but of course you can run your own tests.

  5. I have actually done restorations from NTFS-compressed images, both fragmented and non-fragmented; it works, so please either trust me or just check it yourself. Note: as I found out around a year ago, it does not work in Windows 8.1. It still works in Windows 7, 8, and 10.
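For reference, the in-place compression baseline referred to above uses compact.exe, the standard Windows tool for NTFS transparent compression. A minimal sketch (the path is illustrative):

```bat
rem Compress an existing file in place with NTFS compression.
rem This is the operation that produces the heavy fragmentation
rem described above, because each compressed chunk is written back
rem wherever free space happens to be available.
compact /c "D:\Backup\WindowsImageBackup\image.vhd"

rem Query the compression state and on-disk ratio afterwards:
compact "D:\Backup\WindowsImageBackup\image.vhd"
```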

Expected answer:

a working method or a program for Windows to either:

  1. compress a file (with NTFS compression, keeping it accessible to Windows Recovery) without creating a lot of fragments (it may compress to another partition or create a compressed copy; it must be at least 3× faster on an HDD than compact + defrag),

    or

  2. quickly (at least 3× faster than Windows defrag on an HDD) defragment a devastatingly fragmented file, such as one containing 100K+ fragments (it must stay compressed after the defrag).

Best Answer

Avoiding fragmentation

The secret is to not write uncompressed files on the disk to begin with.

Indeed, after you compress an already existing large file, it will become horrendously fragmented due to the nature of the NTFS in-place compression algorithm.

Instead, you can avoid this drawback altogether by making the OS compress a file's content on the fly, before writing it to the disk. This way, compressed files are written to the disk like any normal file - without unintentional gaps. For this purpose you need to create a compressed folder. (The same way you mark files to be compressed, you can mark folders to be compressed.) Afterwards, all files written to that folder will be compressed on the fly (i.e. written as streams of compressed blocks). Files compressed this way can still end up being somewhat fragmented, but it will be a far cry from the mess that in-place NTFS compression creates.
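A sketch of this approach using compact.exe (folder and file names are illustrative): marking an empty folder as compressed makes NTFS compress everything subsequently written into it on the fly.

```bat
rem Create a destination folder and mark it as compressed while it is
rem still empty. Anything written into it afterwards is compressed
rem on the fly as it is written.
mkdir "D:\CompressedBackups"
compact /c "D:\CompressedBackups"

rem Copy the image into the compressed folder. The copy is written
rem as a stream of compressed blocks, so it lands on disk with far
rem fewer fragments than in-place compression would produce.
copy "D:\Backup\image.vhd" "D:\CompressedBackups\image.vhd"
```

Note that the on-the-fly path requires a copy, so it needs enough free space for the compressed destination file; the payoff is avoiding the in-place fragmentation entirely.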

Example

NTFS compressed a 232 MB system image to 125 MB:

  • In-place compression created a whopping 2680 fragments!
  • On-the-fly compression created 19 fragments.

Defragmentation

It's true that NTFS-compressed files can pose a problem for some defragmentation tools. For example, a tool I normally use can't handle them efficiently - it slows down to a crawl. Fret not: the old trusty Contig from Sysinternals does the job of defragmenting NTFS-compressed files quickly and effortlessly!
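Contig's usage is straightforward; a sketch with an illustrative path (the `-a` analysis flag is part of the standard Contig command line):

```bat
rem Analyze fragmentation first: reports the number of fragments
rem without modifying the file.
contig -a "D:\CompressedBackups\image.vhd"

rem Defragment the file in place. Contig tries to move it into a
rem single contiguous run of free space; the file stays
rem NTFS-compressed afterwards.
contig "D:\CompressedBackups\image.vhd"
```

Because Contig works on one file at a time, it avoids the full-volume passes that make general-purpose defragmenters so slow on a drive holding a few huge, heavily fragmented images.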
