How to split file and save parts to multiple locations

backup file hard-disk split

How do I split a large file into multiple smaller chunks and write each chunk to a separate location?

The split command seems to write all of its output files into a single location.

The context is the following: I need to back up a large hard disk by creating a compressed clone image of it. No external hard disk I have can fit the compressed image as a single piece, so I need some way to split the image and write it across multiple locations.

For cloning and compressing the image, this is what I had in mind:

dd if=/dev/sda conv=sync,noerror bs=64K | gzip -c | split -b 110g - <Multiple locations for each piece>
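To sanity-check a pipeline like this before pointing it at a real disk, it can be rehearsed at toy scale. The sketch below is my own illustration, not part of the question: a 1 MiB scratch file stands in for /dev/sda, the 110g chunk size is shrunk to 256K, and the round trip (concatenate the chunks, gunzip) is verified with cmp.

```shell
#!/bin/sh
# Small-scale rehearsal of the dd | gzip | split pipeline above.
# Sizes are tiny (1 MiB input, 256 KiB chunks) purely for illustration.
set -e
workdir=$(mktemp -d)
cd "$workdir"

# Scratch file standing in for the disk being imaged.
dd if=/dev/urandom of=disk.img bs=64K count=16 status=none

# Clone, compress, and split into numbered chunks (part00, part01, ...).
dd if=disk.img conv=sync,noerror bs=64K status=none \
    | gzip -c \
    | split -b 256K -d - part

# Restoring is the reverse: concatenate the chunks in order and gunzip.
cat part* | gunzip > restored.img
cmp disk.img restored.img && echo "round trip OK"
```

Note that conv=sync pads the final block up to bs with zeros, so on a real disk whose size is not a multiple of 64K the restored image will be slightly larger than the source; here the input is an exact multiple, so cmp passes.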

One option I have is to create a software RAID 0 array out of all the external hard disks connected together and write the compressed image onto it. But it would be nice if a simpler solution existed (using built-in GNU/Linux commands).

I could also dd only a small section of the large hard disk at a time, and repeat that in a loop (using the seek and count arguments of dd). But once I compress the stream with gzip, I wouldn't know in advance how much of the disk would fit onto a single 110 GB external hard disk.
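For what the loop idea could look like: the sketch below (my own illustration, again using a scratch file in place of the disk) images the input in fixed-size slices with dd's skip/count and compresses each slice separately. It shows the mechanism only; it does not solve the stated problem, since each compressed slice still has an unpredictable size.

```shell
#!/bin/sh
# Slice-by-slice imaging with dd skip/count, each slice gzipped on its own.
# Toy sizes: a 1 MiB scratch file, 256 KiB (4 x 64K blocks) per slice.
set -e
workdir=$(mktemp -d)
cd "$workdir"

dd if=/dev/urandom of=disk.img bs=64K count=16 status=none  # fake "disk"

bs=64K
blocks_per_slice=4        # 4 * 64K = 256 KiB of raw data per slice
slice=0
while :; do
    out="slice$(printf '%02d' "$slice").gz"
    dd if=disk.img bs=$bs skip=$((slice * blocks_per_slice)) \
       count=$blocks_per_slice status=none | gzip -c > "$out"
    # Past the end of the input, dd emits an empty stream; stop there.
    if [ "$(gunzip -c "$out" | wc -c)" -eq 0 ]; then
        rm "$out"
        break
    fi
    slice=$((slice + 1))
done

# Reassemble by decompressing the slices in order.
for f in slice*.gz; do gunzip -c "$f"; done > restored.img
cmp disk.img restored.img && echo "slices OK"
```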

Best Answer

I think you can get away with using split's --filter=COMMAND option.

... | split -b <SIZE> -d - part --filter=./split-filter

where ./split-filter is something like

#!/bin/bash

set -e

# split exports the name of the current output file (e.g. "part07")
# in the FILE environment variable; strip the "part" prefix to get
# the chunk number.
n="${FILE#part}"

# Round-robin the chunks across the three target locations by number
# (10#$n forces base 10 so leading zeros aren't read as octal).
case $((10#$n % 3)) in
    0)
        dd bs=64K >"path1/$FILE"
        ;;
    1)
        dd bs=64K >"path2/$FILE"
        ;;
    2)
        dd bs=64K >"path3/$FILE"
        ;;
esac
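Here is a self-contained, toy-scale run of this approach to show it end to end. This is my own demo, not part of the answer: a 1 MiB scratch file stands in for /dev/sda, path1..path3 are local directories standing in for the mounted external disks, and 110g is shrunk to 128K so several parts actually exist. Restoring just means gathering the parts from all locations in numeric order and gunzipping.

```shell
#!/bin/sh
# End-to-end demo of split --filter distributing chunks over three targets.
set -e
workdir=$(mktemp -d)
cd "$workdir"
mkdir path1 path2 path3

# The filter script from the answer, written out verbatim.
cat > split-filter <<'EOF'
#!/bin/bash
set -e
n="${FILE#part}"                 # FILE is exported by split, e.g. "part07"
case $((10#$n % 3)) in           # round-robin over the three targets
    0) dd bs=64K >"path1/$FILE" 2>/dev/null ;;
    1) dd bs=64K >"path2/$FILE" 2>/dev/null ;;
    2) dd bs=64K >"path3/$FILE" 2>/dev/null ;;
esac
EOF
chmod +x split-filter

dd if=/dev/urandom of=disk.img bs=64K count=16 status=none  # fake "disk"

# The pipeline from the question, feeding split through the filter.
dd if=disk.img bs=64K status=none | gzip -c \
    | split -b 128K -d - part --filter=./split-filter

# Restore: collect the parts from all locations, sorted by part name
# (a plain path*/part* glob would group them by directory instead).
cat $(printf '%s\n' path*/part* | sort -t/ -k2) | gunzip > restored.img
cmp disk.img restored.img && echo "filter demo OK"
```

One caveat: this scheme assumes the targets can absorb every third chunk, i.e. roughly equal free space on each disk; for unevenly sized disks the case arms would need a different mapping.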