Is that size as in file size, or size as in image dimensions?
In zsh, to see all .png files in the current directory and its subdirectories, sorted by increasing file size:
echo **/*.png(oL)
There's no convenient glob qualifier for grabbing every Nth file. Here's a loop that sets the array $a to contain every 50th file (starting with the largest):
a=() i=0
for x in **/*.png(OL); do
  ((i%50)) || a+=$x
  ((++i))
done
my-favorite-image-viewer $a
Without zsh or GNU find, there's no easy way of sorting find output by metadata (there's find -ls or find -exec ls or find -exec stat, but they might not work with files containing non-printable characters, so I don't like to recommend them). Here's a way to do it in Perl:
find . -name '*.png' |
perl -e '
  $, = "\n";             # separate elements by newlines in the output
  print                  # print…
  sort {-s $a <=> -s $b} # …sorted by file size…
  map {chomp;$_} <>      # …the input lines (with the newline bitten off)
'
And here's a way to view every 50th file (starting with the largest):
find . -name '*.png' |
perl -e '
  $, = "\n";
  exec "my-favorite-image-viewer",
    map {$i++ % 50 ? () : $_}  # every 50th file
    sort {-s $b <=> -s $a} map {chomp;$_} <>
'
Another approach would be to create symbolic links in a single directory, with names ordered by file size. In zsh:
mkdir tmp && cd tmp
i=1000000 # the extra 1 on the left ensures alignment
for x in ../**/*.png(oL); do
  ((++i))
  ln -s $x ${i#1}.png
done
With Perl:
mkdir tmp && cd tmp
find .. -name '*.png' |
perl -e '
  $, = "\n";
  for $x (sort {-s $a <=> -s $b} map {chomp;$_} <>) {
    symlink $x, sprintf("%06d.png", ++$i);
  }
'
You don't need any of the GNUisms here (and you probably want a -mindepth 1 to exclude .), and you don't need to run one chmod per file:
find . ! -name . -prune ! -type l -size +100c -size -1000c -print \
-exec chmod a+r {} + >testfile
(I've also added a ! -type l because -size would check the size of the symlink, while chmod will change the permissions of the target of the symlink, so it doesn't make sense to consider symlinks. Chances are you'd want to go further and only consider regular files (-type f).)
That works here because chmod doesn't output anything on its stdout (which would otherwise end up in testfile). More generally, to avoid that, you'd need to do:
find . ! -name . -prune ! -type l -size +100c -size -1000c -print -exec sh -c '
exec cmd-that-may-write-to-stdout "$@" >&3 3>&-' sh {} + 3>&1 > testfile
So that find's stdout goes to testfile but cmd-that-may-write-to-stdout's stdout goes to the original stdout before redirection (as saved with 3>&1 above).
Note that in your:
find . -maxdepth 1 -size +100c -size -1000c -exec chmod a+r {} \; -print > testfile
testfile would contain the files for which chmod has succeeded (the -print being after -exec means -exec is another condition for that -print, and -exec succeeds only if the executed command returns with a zero exit status).
If you wanted to use xargs (here using GNU syntax), you could use tee and process substitution:
find . ! -name . -prune ! -type l -size +100c -size -1000c -print0 |
tee >(tr '\0' '\n' > testfile) |
xargs -r0 chmod a+r
To save the output of find, with NULs turned into newlines, into testfile. Note however that that tr command is running in the background. Your shell will wait for xargs (at least; most shells will also wait for tee and find), but not for tr. So there's a chance that tr has not yet finished writing the data to testfile by the time the shell runs the next command. If it's more important that testfile be fully written by then than that all the permissions have been modified, you may want to swap the xargs and tr commands above.
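The swapped variant would look like this: testfile is then fully written by the time the pipeline returns, while the chmod commands may still be running in the background.

```shell
find . ! -name . -prune ! -type l -size +100c -size -1000c -print0 |
tee >(xargs -r0 chmod a+r) |
tr '\0' '\n' > testfile
```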
Another option is to wrap the whole code above in:

(<that-code>) 3>&1 | cat

That way, the shell will wait for cat, and that cat will only exit when all the processes that have that file descriptor 3 open on the writing end of the pipe it reads from (which includes tr, find, tee and xargs) have exited.
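Spelled out with the tee pipeline from earlier, that gives (as a sketch):

```shell
# every process in the subshell inherits fd 3 pointing at the pipe,
# so cat only sees EOF (and exits) once they have all finished,
# including the background tr
(
  find . ! -name . -prune ! -type l -size +100c -size -1000c -print0 |
    tee >(tr '\0' '\n' > testfile) |
    xargs -r0 chmod a+r
) 3>&1 | cat
```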
Another option is to use zsh globs here:
files=(./*(L+100L-1000^@))
chmod a+r $files
print -rl $files > testfile
Though you could run into a "too many arguments" error if the list of files is very big. find -exec {} + and xargs work around that by running several chmod commands if needed; you can use zargs in zsh for that.
Best Answer
I know this is a bit overkill, but this will work every time (even if there are spaces in your file names) and regardless of how file displays the information, and it prints the dimensions of the picture and the file name.

Explanation:

- find all files named *.png under . and, for each one, run file on it
- use sed to print only the file name and dimensions, then reorder to print the dimensions first
- use awk to test the first number (the height of the picture), making sure it's greater than 500; if it is, print the dimensions and file name, if not, do nothing
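The command being described isn't shown above; a sketch of what it presumably looks like (the sed pattern assumes file's usual "name: PNG image data, W x H, …" output, so adjust to taste; the 500 threshold and the field order follow the description):

```shell
find . -name '*.png' -exec file {} + |
  # turn "name: PNG image data, W x H, ..." into "H W name"
  sed -n 's/^\(.*\): PNG image data, \([0-9]*\) x \([0-9]*\).*$/\3 \2 \1/p' |
  # keep only pictures whose first number (the height) is greater than 500
  awk '$1 > 500'
```

Because sed captures the whole text before ": PNG image data", spaces in file names are preserved intact.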