I'm in the middle of generating a 3 TB drive image with GNU ddrescue, and I've realized it will be too big for the target drive. The empty space is filled with 0xAA instead of 0x00, so a sparse output file won't help; I need actual compression.
The output file is on a btrfs filesystem, which supports per-file compression, but which method should I use?
To apply compression to existing files, use the btrfs filesystem defragment -calg command, where alg is either zlib, lzo or zstd. For example, to re-compress the whole file system with zstd, run the following command:

# btrfs filesystem defragment -r -v -czstd /
That re-compresses existing files, but it seems to be meant for directories full of files, not a single file?
It also says:

Tip: Compression can also be enabled per-file without using the compress mount option; to do so, apply chattr +c to the file. When applied to directories, it will cause new files to be automatically compressed as they come.

It's not clear whether that will re-compress existing files, though.
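As a sketch of the chattr route (the image path /mnt/data/disk.img is an assumption; substitute your own file):

```shell
# Mark the file so that data written to it from now on is compressed.
# On btrfs, chattr +c sets the per-file compression flag; it does not
# rewrite data that is already on disk.
chattr +c /mnt/data/disk.img

# Verify: a 'c' should appear in the attribute column.
lsattr /mnt/data/disk.img
```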
Setting the compression property on a file using btrfs property set <file> compression <zlib|lzo|zstd> will force compression to be used on that file using the specified algorithm.
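For example, a hedged sketch using that property interface (the file path is illustrative):

```shell
# Force lzo compression for all future writes to this one file:
btrfs property set /mnt/data/disk.img compression lzo

# Read the property back to confirm it took effect:
btrfs property get /mnt/data/disk.img compression
```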
This seems to be for forcing compression of files that would not normally be compressed because their data isn't very compressible?
Which command do I want in order to convert the entire existing file to (default LZO) compression, and to keep compressing any further data written to it, without changing the compression of other files on the volume?
Best Answer
btrfs filesystem defragment -calg also works for individual files. (For directories you need the -r flag; otherwise it only defragments the metadata of the subvolume the directory belongs to.)

chattr +c will not re-compress existing files; it only affects data written after the flag is set.

Setting the compression property with btrfs property set disables the mechanism that detects whether a file is compressible, and forces compression on the file regardless. I believe it will still only store the data compressed if it is at least somewhat compressible, though; more specifically, if the compressed data is no larger than the original, IIRC. Also note that the detection mechanism isn't very sophisticated, and sometimes disables compression even on very compressible files.
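Putting that together for the original question (convert one existing file to LZO compression and keep compressing future writes, without touching the rest of the volume), a sketch, with the image path as an assumption:

```shell
# 1) Make future writes to this file compressed, without changing
#    the mount options or any other file on the volume:
btrfs property set /mnt/data/disk.img compression lzo

# 2) Re-compress the data already written; no -r needed, since the
#    target is a single file, not a directory:
btrfs filesystem defragment -v -clzo /mnt/data/disk.img
```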