Assuming you have your desired files in a text file, you could do something like
while IFS= read -r file; do
    mkdir -p /target/"${file%/*}"
    cp /source/"$file" /target/"$file"
done < files.txt
That will read each line of your list, extract the directory and the file name, create the directory and copy the file. You will need to change /source and /target to the actual parent directories you are using. For example, to copy /foo/a/a.txt to /bar/a/a.txt, change /source to /foo and /target to /bar.
I can't tell from your question whether you want to copy all directories and then only specific files or if you just want the directories that will contain files. The solution above will only create the necessary directories. If you want to create all of them, use
cd /source && find ./ -type d -exec mkdir -p /target/{} \;
That will create the directories. Once those are there, just copy the files:
while IFS= read -r file; do
    cp /source/"$file" /target/"$file"
done < files.txt
Update
This little script will copy all the files modified after September 8. It assumes the GNU versions of find and touch; assuming you're using Linux, that's what you will have.
#!/usr/bin/env bash
## Create a file to compare against.
tmp=$(mktemp)
touch -d "September 8" "$tmp"
## Define the source and target parent directories
source=/path/to/source
target=/path/to/target
## move to the source directory
cd "$source"
## Find the files that were modified more recently than $tmp and copy them
find ./ -type f -newer "$tmp" -print0 |
while IFS= read -rd '' file; do
    mkdir -p "$target/${file%/*}"
    cp "$file" "$target/${file%/*}"
done
Strictly speaking, you don't need the tmp file. However, this way, the same script will work tomorrow; if you used find's -mtime instead, you would have to calculate the right number of days every day.
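For comparison, a minimal sketch (GNU date assumed) of the daily recalculation that the reference file avoids:

```shell
# Hedged sketch: with -mtime, the argument would have to be recomputed
# from the calendar date on every run (GNU date's -d is assumed).
days=$(( ( $(date +%s) - $(date -d "September 8" +%s) ) / 86400 ))
echo "$days"
# find ./ -type f -mtime -"$days"   # roughly equivalent to -newer "$tmp"
```

The touched file sidesteps all of this: find compares modification times directly, with no arithmetic.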
Another approach would be to first find the directories, create them in the target and then copy the files (still from within the source directory, so the target is ../bar):
Create all the directories:
find ./ -type d -exec mkdir -p ../bar/{} \;
Find and copy the relevant files:
find ./ -type f -newer "$tmp" -exec cp {} ../bar/{} \;
Remove any directories that remained empty:
find ../bar/ -depth -type d -empty -exec rmdir {} \;
At least the GNU and FreeBSD versions of find have the -ls action, which produces output similar to ls:
$ find . -ls
392815 4 drwxr-xr-x 2 user group 4096 Jul 22 18:39 .
392816 0 -rw-r--r-- 1 user group 0 Jul 22 18:39 ./foo.txt
392818 0 -rw-r--r-- 1 user group 0 Jul 22 18:39 ./bar.txt
GNU find also has very configurable output in the form of the -printf action.
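For example (a sketch; -printf and %T@ are GNU find extensions), you can emit a sortable epoch timestamp and skip ls entirely:

```shell
# Print mtime in seconds since the epoch plus the name, newest first,
# and keep only the top five entries. "." itself is always listed, so
# there is at least one line of output.
out=$(find . -maxdepth 1 -printf '%T@ %p\n' | sort -rn | head -n 5)
echo "$out"
```

sort -rn orders numerically on the timestamp, so no per-file date parsing is needed afterwards.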
That said, I do wonder what makes your ls so slow. Both find and ls need to read the whole directory and call lstat() on all the files to find the dates, so there shouldn't be much of a difference. ls does need to sort the whole list of files, though, so that could make a difference if there is a really large number of files. In that case, you might want to consider spreading the files out over different directories, possibly based on their date. Dropping the -r and using head instead of tail might also help.
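As a sketch of that last suggestion: with head, the pipeline can stop as soon as it has its five lines, instead of tail having to read the whole listing. The directory and file names here are throwaway examples so the snippet is self-contained (GNU touch -d assumed for distinct mtimes):

```shell
# Seven files, f1 newest, f7 oldest.
dir=$(mktemp -d)
for i in 1 2 3 4 5 6 7; do
    touch -d "$i minutes ago" "$dir/f$i"
done
# Instead of: ls -rt "$dir" | tail -n 5
# list newest-first and take the head, so the pipe can stop early:
newest=$(ls -t "$dir" | head -n 5)
echo "$newest"
```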
Best Answer
This will take the recently-used files referenced in ~/.local/share/recently-used.xbel (or rather, ${XDG_DATA_HOME}/recently-used.xbel), and link them all into a directory called ~/recent. This uses XMLStarlet to extract the file URIs from the list of recently-used documents (ignoring other URIs), feeds them to a Python script which replaces newlines with nul characters and then unquotes the escaped URIs (e.g. + or %20 instead of space), and finally feeds that to xargs, which splits all the file names and feeds them to ln (the GNU variant) to create symbolic links.
Note that links will be created regardless of whether the target file still exists; it often happens that the list of recently-used files includes temporary files which have since been deleted.