Is that size as in file size, or size as in image dimensions?
In zsh, to see all .png files in the current directory and its subdirectories, sorted by increasing file size:
echo **/*.png(oL)
There's no convenient glob qualifier for grabbing every N files. Here's a loop that sets the array $a
to contain every 50th file (starting with the largest).
a=() i=0
for x in **/*.png(OL); do
  ((i%50)) || a+=($x)
  ((++i))
done
my-favorite-image-viewer $a
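Without zsh, a similar every-Nth selection can be sketched in plain shell with GNU du, sort and awk. This is only a hedged alternative, and unlike the zsh loop above it assumes no file name contains a newline:

```shell
# Every 50th .png, largest first; assumes no newlines in file names (GNU du).
find . -name '*.png' -exec du -b {} + |
  sort -rn |             # largest first, by exact byte count
  awk 'NR % 50 == 1' |   # keep lines 1, 51, 101, ...
  cut -f2-               # drop the size column, keep the name
```

The result can then be handed to a viewer with, for example, `xargs -d '\n' my-favorite-image-viewer` (GNU xargs).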
Without zsh or GNU find, there's no easy way of sorting find
output by metadata (there's find -ls
or find -exec ls
or find -exec stat
, but they may mishandle file names containing non-printable characters, so I don't like to recommend them). Here's a way to do it in Perl.
find . -name '*.png' |
perl -e '
$, = "\n"; # separate elements by newlines in the output
print # print…
sort {-s $a <=> -s $b} # …sorted by file size…
map {chomp;$_} <> #…the input lines (with the newline bitten off)
'
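With GNU find and coreutils, the same sort can also be done entirely NUL-delimited, which stays safe even for file names containing newlines. A sketch, assuming the GNU-specific -printf, sort -z and cut -z:

```shell
# .png files sorted by increasing size, NUL-safe end to end (GNU tools).
find . -name '*.png' -printf '%s\t%p\0' |  # size, tab, name, NUL terminator
  sort -zn |                               # numeric sort on the leading size
  cut -z -f2- |                            # drop the size column
  tr '\0' '\n'                             # newlines only for final display
```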
And here's a way to view every 50th file (starting with the largest):
find . -name '*.png' |
perl -e '
$, = "\n";
exec "my-favorite-image-viewer",
map {$i++ % 50 ? () : $_} # every 50
sort {-s $b <=> -s $a} map {chomp;$_} <>
'
Another approach would be to create symbolic links in a single directory, with names ordered by file size. In zsh:
mkdir tmp && cd tmp
i=1000000 # the extra 1 on the left ensures alignment
for x in ../**/*.png(oL); do
  ((++i))
  ln -s $x ${i#1}.png
done
With Perl:
mkdir tmp && cd tmp
find .. -name '*.png' |
perl -e '
$, = "\n";
for $x (sort {-s $a <=> -s $b} map {chomp;$_} <>) {
    symlink $x, sprintf("%06d.png", ++$i);
}
'
The command you claim to be executing doesn't match the error message you're getting, but either way this answer should clarify some things.
First, note that if there are any files matching *jpg
in the current directory, the pattern *jpg
will be expanded by the shell before find
ever sees it. You need to quote the pattern to protect it from that.
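A quick demonstration of what goes wrong; the file names here are made up for the example:

```shell
cd "$(mktemp -d)" && touch cat.jpg dog.jpg
# Unquoted, the shell expands *.jpg to "cat.jpg dog.jpg" before find runs,
# so find sees a stray argument and fails with an error like
# "paths must precede expression":
#   find . -iname *.jpg
# Quoted, the pattern reaches find intact and matches both files:
find . -iname '*.jpg'
```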
With the command you've given, what gets executed is something like
cd directory && mv -f file
But mv
expects two arguments: a source file and a destination.
You need to pass a target directory, and since you're using -execdir
, the target directory will be interpreted relative to each directory where there are .jpg
files. Note that the directory must exist. If you want to move all .jpg
files to a single directory, create it first, then run
find -iname '*.jpg' -execdir mv -f {} /common-destination-directory \;
If you want to move the files to a relative path instead, for example into an images
subdirectory of each directory where they are found, you will need to create those directories first:
find -iname '*.jpg' -execdir mkdir -p images \; -execdir mv -f {} images \;
With GNU utilities (i.e. on Linux) you can optimize a little by running mv
only once per directory:
find -iname '*.jpg' -execdir mkdir -p images \; -execdir mv -t images {} +
I don't know of any way other than scanning the directory tree in question to collect the file sizes so that you can determine the largest file. If you know that there's a size threshold, you can instruct find to dismiss files that are below it:
find / -size +50M
This dismisses any files below the size of 50MB. If you know these files are always in a specific location, you can target your find
at that area instead of scanning the entire disk. NOTE: this is a method that I typically employ, since you shouldn't be getting random files outside of
/var
types of directories, typically. As to
du
, you can tell it to output the sizes in human-readable format using its -h
switch. The sort
command knows how to sort these as well, again using its -h
switch.
Example
find / -size +50M -print0 | du -h --files0-from=- | sort -h | tail -1
The above
find
returns the list of files that are > 50MB, using a null (\0
) character as the separator. The du
command takes this list and knows to split on nulls thanks to its --files0-from=-
switch. This output is then sorted by its human-formatted sizes, and tail -1
keeps only the largest. Without the
tail -1
you get the whole sorted list:
find / -size +50M -print0 | du -h --files0-from=- | sort -h
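Scaled down to a 1MB threshold so it can be tried on a throwaway tree (the file names are made up; GNU find and coreutils assumed for -print0, --files0-from=- and sort -h):

```shell
cd "$(mktemp -d)"
dd if=/dev/zero of=big.bin   bs=1M count=2 2>/dev/null  # ~2MB
dd if=/dev/zero of=small.bin bs=1K count=4 2>/dev/null  # ~4KB
# Only big.bin clears the +1M threshold, so it is the sole line of output.
find . -size +1M -print0 | du -h --files0-from=- | sort -h | tail -1
```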