Creating and splitting large multipage TIFF images

Tags: image manipulation, imagemagick, tiff

I need to both create and split multipage TIFF images, ranging from 2 to almost 100 pages (A4, 300 dpi, 2500×3500 px). The job is performed periodically by a script on an x64 Linux server. Currently I'm using ImageMagick. The smaller cases do not pose any problems, but the larger ones do.

I need to radically reduce the amount of memory used during the operation.

For example, this:

convert *.jpg -compress lzw output.tif

(70 JPEG files) consumes about 4.6 GB of RAM, even though each input is less than 2 MB and the resulting file is less than 250 MB.
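As far as I understand, ImageMagick's -limit options can cap the pixel cache so it spills to disk instead of RAM, but that only trades memory for slow disk I/O rather than avoiding the problem, e.g.:

convert -limit memory 256MiB -limit map 512MiB *.jpg -compress lzw output.tif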

The reverse operation:

convert input.tif output-%04d.png

has similar issues.
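For the splitting direction, one workaround might be libtiff's tiffsplit utility (if it is available), since it copies pages out one at a time; each extracted page can then be converted individually. A rough sketch:

tiffsplit input.tif page_              # writes page_aaa.tif, page_aab.tif, ...
for f in page_*.tif; do
    convert "$f" "${f%.tif}.png"       # only one page in memory per invocation
done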

From what I have read, both problems occur because ImageMagick first loads and decodes all the input images and only then starts encoding them into the output file.
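If that is the case, a possible way around it for the creation step (a sketch, assuming libtiff's tiffcp is installed) would be to convert each JPEG to a single-page TIFF on its own and let tiffcp assemble the multipage file, so only one image is decoded at a time:

mkdir -p pages
for f in *.jpg; do
    convert "$f" -compress lzw "pages/${f%.jpg}.tif"   # one image in memory per call
done
tiffcp -c lzw pages/*.tif output.tif                   # libtiff merges the pages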

How can I create and split multipage TIFF images without such a huge memory footprint? I don't necessarily have to use ImageMagick; any other free tool will be fine.

Best Answer

I had the same problem today while trying to split a 1 GB TIFF file containing 1700 images. 16 GB of memory wasn't enough; I then tried having it cache to disk, but that was slow and it easily exhausted more than 100 GB on the hard drive without accomplishing anything (probably a bug).

Apparently, though, ImageMagick can extract a specific page from the original file without loading it completely, so I was able to split the bigger file with a simple bash script:

# %n prints the total page count once per page, so take the first line
subfiles=$(identify -quiet -format '%n\n' largefile.tif | head -n1)
mkdir -p split
for (( i = 0; i < subfiles; i++ )); do
    # the [$i] index tells convert to read only that single page
    convert largefile.tif[$i] -scene 1 split/smallerfile_$i.tif
done

I have no idea, though, how to create a big file without running out of memory, so maybe this is only half an answer?
