Ubuntu – Copy over a large number of files with new names

bash, command line, cp, python, server

I have a large number of files (tens of thousands) that I need to copy from /dir1 to /dir2, but each file's name must change like so:

OLD NAME —> NEW NAME

filename.txt —> bob_filename_1253.txt

Where bob is the username of the file's owner and 1253 is the time the file was last modified.

I am currently achieving this with a Python script that loops through every file and cp's it to the new destination, creating the new file name through string slicing.

HOWEVER, this is taking a torturous amount of time. Is there a cleaner, faster way to achieve this?

Thanks in advance!

Best Answer

Assuming you are starting with an empty dir2 so that you can copy the files from dir1 to dir2 without conflict and then rename them, I'd try something like this:

printf '%s\0' dir1/*.txt | xargs -r0 cp -np -t dir2/

to perform the copy. The -n option avoids overwriting anything already present in dir2, and -p preserves the files' ownership and timestamps so that the rename step can read the original owner and mtime from the copies (ownership can only be preserved if the copy is run as root, e.g. with sudo). Then

cd dir2/

printf '%s\0' *.txt | xargs -r0 rename -n -- '
  my ($base, $ext) = /^(.*)(\.[^.]+)$/;
  $_ = join("_", ( getpwuid( (stat $_)[4] ) )[0], $base, (stat $_)[9]) . $ext
'

to rename (remove the -n from the rename command once you are happy with the proposed changes).


Although it's most often used for simple s/pattern/replacement/ name changes, the Perl-based rename command available in current versions of Ubuntu can actually rename files based on pretty much arbitrary Perl expressions.
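
For instance, a typical simple use (shown here purely to illustrate the usual s/pattern/replacement/ form, not as part of this solution) would be something like:

rename -n 's/\.txt$/.bak/' *.txt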

In this case we can use Perl's built-in stat to get the numeric UID and mtime, getpwuid to turn the UID into a username, and a small pattern match to split the name into its base and its .txt suffix so that the suffix stays at the end. (stat $_)[9] is the modification time (mtime) in epoch seconds; if you need it in another format you can use POSIX::strftime or one of several other time manipulation modules.
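
For example, here is an untested sketch of the same rename expression using a human-readable YYYYMMDD date in place of the raw epoch seconds (the %Y%m%d format is just an assumption about what you might want; the require POSIX line loads the module inside the expression so that POSIX::strftime is available):

printf '%s\0' *.txt | xargs -r0 rename -n -- '
  require POSIX;
  my ($base, $ext) = /^(.*)(\.[^.]+)$/;
  $_ = join("_",
    ( getpwuid( (stat $_)[4] ) )[0],
    $base,
    POSIX::strftime("%Y%m%d", localtime( (stat $_)[9] ))
  ) . $ext
'

As before, drop the -n once the proposed names look right.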