How to Create Multi Tar Archives for a Huge Folder

filesystems, large-files, linux, tar

I have a large folder with 30M small files. I want to back up the folder into 30 archives, so that each tar.gz file contains 1M files. The reason to split into multiple archives is that untarring one single large archive would take months. Piping tar to split also won't work, because then, to untar, I would have to cat all the archives back together first.

Also, I would prefer not to mv each file to a new dir, because even ls is very painful on this huge folder.

Best Answer

I wrote this bash script to do it. It basically builds an array containing the names of the files that go into each tar, then starts tar in parallel on all of them. It might not be the most efficient way, but it will get the job done as you want. I expect it to consume large amounts of memory, though.

You will need to adjust the options at the start of the script. You might also want to change the tar options cjvf in the last line (e.g. removing the verbose flag v for performance, or changing the compression from j to z, etc.).

Script

#!/bin/bash

# User configuration
#===================
files=(*.log)           # Set the file pattern to be used, e.g. (*.txt) or (*)
num_files_per_tar=5 # Number of files per tar
num_procs=4         # Number of tar processes to start
tar_file_dir='/tmp' # Tar files dir
tar_file_name_prefix='tar' # prefix for tar file names
tar_file_name="$tar_file_dir/$tar_file_name_prefix"

# Main algorithm
#===============
num_tars=$(( (${#files[@]} + num_files_per_tar - 1) / num_files_per_tar ))  # number of tars to create (rounded up so leftover files are not silently dropped)
tar_files=()  # will hold the names of files for each tar

tar_start=0 # gets updated to where each tar's slice starts
# Loop over the files, adding their names to be tarred
for i in $(seq 0 $((num_tars-1)))
do
  tar_files[$i]="$tar_file_name$i.tar.bz2 ${files[@]:tar_start:num_files_per_tar}"
  tar_start=$((tar_start+num_files_per_tar))
done

# Start tar in parallel for each of the strings we just constructed
printf '%s\n' "${tar_files[@]}" | xargs -n$((num_files_per_tar+1)) -P$num_procs tar cjvf

Explanation

First, all the file names that match the selected pattern are stored in the array files. Next, the for loop slices this array and forms strings from the slices. The number of slices is equal to the number of desired tarballs. The resulting strings are stored in the array tar_files. The for loop also prepends the name of the resulting tarball to each string. The elements of tar_files take the following form (assuming 5 files per tarball):

tar_files[0]="tar0.tar.bz2  file1 file2 file3 file4 file5"
tar_files[1]="tar1.tar.bz2  file6 file7 file8 file9 file10"
...
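The slicing used above can be tried on its own; a minimal sketch of bash array slicing, with hypothetical file names (file1 through file7 are placeholders):

```shell
# ${array[@]:start:count} expands to up to `count` elements starting at index `start`
files=(file1 file2 file3 file4 file5 file6 file7)

echo "${files[@]:0:5}"   # first slice: file1 file2 file3 file4 file5
echo "${files[@]:5:5}"   # second slice: only two elements remain: file6 file7
```

Note that when the file count is not a multiple of the slice size, the last slice is simply shorter; bash does not pad or fail.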

In the last line of the script, xargs is used to start multiple tar processes (up to the specified maximum number), each of which processes one element of the tar_files array in parallel.
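The interaction of -n and -P can be seen with echo standing in for the real tar invocation (the file names here are made up): xargs reads whitespace-separated tokens, groups them -n at a time, and runs up to -P commands at once.

```shell
# Each group of 3 tokens becomes one command line; up to 2 run concurrently.
# Output lines may appear in either order because of the parallelism.
printf '%s\n' "out0.tar file1 file2" "out1.tar file3 file4" |
  xargs -n3 -P2 echo would-run: tar cjvf
```

Each printed line shows the exact command that the real script would execute for one tarball.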

Test

List of files:

$ ls

a      c      e      g      i      k      m      n      p      r      t
b      d      f      h      j      l      o      q      s

Generated Tarballs:

$ ls /tmp/tar*
tar0.tar.bz2  tar1.tar.bz2  tar2.tar.bz2  tar3.tar.bz2
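Restoring works the same way in reverse. A sketch (assuming the default /tmp location and bzip2 compression from the script above; adjust tar_file_dir and num_procs to your setup) that extracts all the generated tarballs in parallel using the same xargs pattern:

```shell
#!/bin/bash
# Extract every generated tarball in parallel, one tar process per archive.
tar_file_dir='/tmp'   # must match the creation script's setting
num_procs=4           # number of parallel tar processes

printf '%s\n' "$tar_file_dir"/tar*.tar.bz2 |
  xargs -n1 -P"$num_procs" tar xjf
```

Because the archives are independent, they can also be extracted selectively, or moved to different machines before unpacking.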
