If using GNU `mv`, you should rather do:

```sh
find . -type f -exec mv -t . {} +
```
With other `mv`s:

```sh
find . -type f -exec sh -c 'exec mv "$@" .' sh {} +
```
You should never embed `{}` in the `sh` code. That's a command injection vulnerability, as the names of the files are interpreted as shell code (try with a file called `` `reboot` `` for instance).
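To see the difference, compare the two patterns (a sketch; whether `{}` is substituted inside a larger word is implementation-defined, but GNU `find` does it):

```sh
# Dangerous (sketch): the file names become part of the shell code itself,
# so a name like `reboot` (backticks included) runs as a command substitution
find . -type f -exec sh -c 'mv {} .' \;

# Safe: the names are passed as positional parameters, never parsed as code
find . -type f -exec sh -c 'exec mv "$@" .' sh {} +
```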
Good point for quoting the command substitution, but because you used the archaic form (`` `...` `` as opposed to `$(...)`), you'd need to escape the inner double quotes or it won't work in `sh` implementations based on the Bourne shell or AT&T ksh (where `` "`basename "foo bar"`" `` would actually be treated as `` "`basename " `` (with an unmatched `` ` `` which is accepted in those shells) concatenated with `foo` and then `` bar"`" ``).
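For instance (a sketch using the `basename "foo bar"` case from above; `dest` is just an illustrative variable name):

```sh
# archaic form: the inner double quotes have to be escaped for those sh's
dest="`basename \"foo bar\"`"

# modern form: nesting needs no escaping
dest="$(basename "foo bar")"
```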
Also, when you do:

```sh
mv foo/bar bar
```

If `bar` actually existed and was a directory, that would actually be a `mv foo/bar bar/bar`. `mv -t . foo/bar` or `mv foo/bar .` don't have that issue.
Now, to store those several arguments (`-exec`, `sh`, `-c`, `exec mv "$@" .`, `sh`, `{}`, `+`) into a variable, you'd need an array variable. Shells supporting arrays are `(t)csh`, `ksh`, `bash`, `zsh`, `rc`, `es`, `yash`, `fish`.
And to be able to use that variable as just `$FLATTEN` (as opposed to `"${FLATTEN[@]}"` in ksh/bash/yash or `$FLATTEN:q` in `(t)csh`), you'd need a shell with a sane array implementation: `rc`, `es` or `fish`. Also `zsh` here, as it happens that none of those arguments is empty.
In `rc`/`es`/`zsh`:

```sh
FLATTEN=(-exec sh -c 'exec mv "$@" .' sh '{}' +)
```
In `fish`:

```sh
set FLATTEN -exec sh -c 'exec mv "$@" .' sh '{}' +
```
Then you can use:

```sh
find . -type f $FLATTEN
```
Best Answer
The important thing to understand here is that in most shells¹, `[` is just an ordinary command parsed by the shell like any other ordinary command.

Then the shell invokes that `[` (aka `test`) command with a list of arguments, and then it's up to `[` to interpret them as a conditional expression.

At that point, those are just a list of strings and the information about which ones resulted from some form of expansion is lost, even in those shells where `[` is built-in (all Bourne-like ones these days).

The `[` utility used to have a hard time telling which ones of its arguments were operators and which ones were operands (the thing operators work on). It didn't help that the syntax was intrinsically ambiguous. For instance:

- `[ -t ]` used to be (and still is in some shells/`[`s) to test whether stdout is a terminal.
- `[ x ]` is short for `[ -n x ]`: test whether `x` is a non-empty string (so you can see there's a conflict with the above).
- In some `[`s, `-a` and `-o` can be both unary (`[ -a file ]` for accessible file (now replaced by `[ -e file ]`), `[ -o option ]` for "is the option enabled?") and binary operators (and and or). Again, `! -a x` can be either `and(nonempty("!"), nonempty("x"))` or `not(isaccessible("x"))`.
- `(`, `)` and `!` add more problems.

In normal programming languages like C or `perl`, in:
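```perl
if ($a eq $b)
```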
There's no way the content of `$a` or `$b` will be taken as operators because the conditional expression is parsed before those `$a` and `$b` are expanded. But in shells, in:
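```sh
[ "$a" = "$b" ]
```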
The shell expands the variables first². For instance, if `$a` contains `(` and `$b` contains `)`, all the `[` command sees is `[`, `(`, `=`, `)` and `]` arguments. So does that mean `"(" = ")"` (are `(` and `)` lexically equal) or `( -n = )` (is `=` a non-empty string)?

Historical implementations (`test` appeared in Unix V7 in the late 70s) used to fail even in cases where it was not ambiguous just because of the order in which they were processing their arguments.

Here with version 7 Unix in a PDP11 emulator:
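```sh
# illustrative sketch - operands that merely look like operators confuse it
a='('
test "$a" = "$a"   # ought to be true (both operands are "("), but that early
                   # test takes the "(" as the start of a group and errors out
                   # instead of doing the comparison
```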
Most shell and `[` implementations have or have had problems with those or variants thereof. With `bash` 4.4 today:
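```sh
# illustrative sketch: with more than 4 arguments (here via the deprecated -a),
# even a current [ can mis-parse operands that look like operators
a='(' b=')'
[ "$a" = "$a" -a "$b" = "$b" ]   # meant as two comparisons, both true, but [
                                 # may take the "(" ... ")" as a grouping and
                                 # fail with a syntax error instead
```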
POSIX.2 (published in the early 90s) devised an algorithm that would make `[`'s behaviour unambiguous and deterministic when passed at most 4 arguments (beside `[` and `]`) in the most common usage patterns (`[ -f "$a" -o "$b" ]` still unspecified for instance). It deprecated `(`, `)`, `-a` and `-o`, and dropped `-t` without operand. `bash` did implement that algorithm (or at least tried to) in `bash` 2.0.

So, in POSIX compliant `[` implementations, `[ "$a" = "$b" ]` is guaranteed to compare the content of `$a` and `$b` for equality, whatever they are. Without `-o`, we would write:
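```sh
[ -f "$a" ] || [ -n "$b" ]   # e.g. instead of the (unspecified) [ -f "$a" -o "$b" ]
```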
That is, call `[` twice, each time with fewer than 5 arguments.

But it took quite a while for all `[` implementations to become compliant. `bash`'s was not compliant until 4.4 (though the last problem was for `[ '(' ! "$var" ')' ]`, which nobody would really use in real life).

The `/bin/sh` of Solaris 10 and older, which is not a POSIX shell but a Bourne shell, still has problems with `[ "$a" = "$b" ]`:
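```sh
# illustrative sketch - the exact diagnostic depends on the values involved
a='(' b='('
[ "$a" = "$b" ]   # should simply be true, but that old test takes the "(" as
                  # the start of a group and errors out instead of comparing
```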
[ "x$a" = "x$b" ]
works around the problem as there is no[
operator that starts withx
. Another option is to usecase
instead:(quoting is necessary around
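```sh
# a sketch of that approach (the echos are just for illustration)
case $a in
  "$b") echo "equal";;
  *)    echo "different";;
esac
```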
(quoting is necessary around `$b`, not around `$a`).

In any case, it is not and never has been about empty values. People have problems with empty values in `[` when they forget to quote their variables, but that's not a problem with `[` then.
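```sh
a= b='-o x'   # $a empty, $b containing what happen to look like [ operators
[ $a = $b ]
```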
with the default value of `$IFS` becomes:
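```sh
[ = -o x ]
```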
Which is a test of whether `=` or `x` is a non-empty string, but no amount of prefixing will help³ as `[ x$a = x$b ]` will still be `[ x = x-o x ]`, which would cause an error, and it could get a lot worse including DoS and arbitrary command injection with other values like in `bash`:
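```sh
# illustrative sketch (assumed values; reboot stands for any command)
a='x -o -v x[0$(reboot)]' b='x'
[ x$a = x$b ]     # after split+glob, [ is handed operator-looking words such as
                  # -o and -v; bash may then evaluate the array subscript given
                  # to -v, running the embedded command substitution
```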
The correct solution is to always quote:
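```sh
[ "$a" = "$b" ]    # OK in POSIX compliant [ / test implementations
[ "x$a" = "x$b" ]  # also copes with the older, non-compliant ones
```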
Note that `expr` has similar (and even worse) problems. `expr` also has a `=` operator, though it's for testing whether the two operands are equal integers when they look like decimal integer numbers, or sort the same when not.

In many implementations, `expr + = +`, or `expr '(' = ')'` or `expr index = index` don't do equality comparison. `expr "x$a" = "x$b"` would work around it for string comparison, but prefixing with an `x` could affect the sorting (in locales that have collating elements starting with `x` for instance) and obviously can't be used for number comparison. `expr "0$a" = "0$b"` doesn't work for comparing negative integers. `expr " $a" = " $b"` works for integer comparison in some implementations, but not others (for `a=01 b=1`, some would return true, some false).
¹ `ksh93` is an exception. In `ksh93`, `[` can be seen as a reserved word in that `[ -t ]` is actually different from `var=-t; [ "$var" ]`, or from `""[ -t ]` or `cmd='['; "$cmd" -t ]`. That's to preserve backward compatibility and still be POSIX compliant in cases where it matters. The `-t` is only taken as an operator here if it's literal, and `ksh93` detects that you're calling the `[` command.

² ksh added a `[[...]]` conditional expression operator with its own syntax parsing rules (and some problems of its own) to address that (also found in some other shells, with some differences).

³ except in `zsh` where split+glob is not invoked upon parameter expansion, but empty removal still is, or in other shells when disabling split+glob globally with `set -o noglob; IFS=`