Early shells had only a single data type: strings. But it is common to manipulate lists of strings, typically when passing multiple file names as arguments to a program. Another common use case for splitting is when a command outputs a list of results: the command's output is a string, but the desired data is a list of strings. To store a list of file names in a variable, you would put spaces between them. Then a shell script like this

files="foo bar qux"
myprogram $files

called myprogram with three arguments, as the shell split the string $files into words. At the time, spaces in file names were either forbidden or widely considered Not Done.
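That splitting is easy to observe with a stand-in for myprogram (count is a made-up helper that just reports how many arguments it received):

```shell
# count stands in for myprogram and prints its argument count.
count() { echo "$#"; }
files="foo bar qux"
count $files      # unquoted: split into three words -> prints 3
count "$files"    # quoted: one argument -> prints 1
```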
The Korn shell introduced arrays: you could store a list of strings in a variable. The Korn shell remained compatible with the then-established Bourne shell, so bare variable expansions kept undergoing word splitting, and using arrays required some syntactic overhead. You would write the snippet above as:
files=(foo bar qux)
myprogram "${files[@]}"
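In bash or ksh93, the array behaviour can be checked like this (a sketch; printf stands in for myprogram):

```shell
# "${files[@]}" expands to one argument per array element,
# with spaces inside elements preserved (bash/ksh syntax).
bash <<'EOF'
files=("foo bar" qux)
printf '<%s>\n' "${files[@]}"
EOF
```

printf receives exactly two arguments here: foo bar and qux.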
Zsh had arrays from the start, and its author opted for a saner language design at the expense of backward compatibility. In zsh (under the default expansion rules), $var does not perform word splitting; if you want to store a list of words in a variable, you are meant to use an array; and if you really want word splitting, you can write $=var.
files=(foo bar qux)
myprogram $files
These days, spaces in file names are something you need to cope with, both because many users expect them to work and because many scripts are executed in security-sensitive contexts where an attacker may be in control of file names. So automatic word splitting is often a nuisance; hence my general advice to always use double quotes, i.e. write "$foo", unless you understand why you need word splitting in a particular use case. (Note that bare variable expansions undergo globbing as well.)
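Both effects, splitting and globbing, can be seen in a scratch directory (a sketch; the file names are made up):

```shell
# Unquoted $foo undergoes splitting AND globbing; "$foo" does neither.
dir=$(mktemp -d); cd "$dir" || exit 1
touch file1 file2
foo='* extra'
printf '<%s>\n' $foo      # split into "*" and "extra"; "*" globs to file1 file2
printf '<%s>\n' "$foo"    # one literal argument
```

The first printf receives three arguments (file1, file2, extra), the second just the literal string.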
When instructed to echo commands as they are executed ("execution trace"), both bash and ksh add single quotes around any word containing meta-characters (*, ?, ;, etc.). The meta-characters could have gotten into the word in a variety of ways: the word (or part of it) could have been quoted with single or double quotes, the characters could have been escaped with a \, or they could have remained after a failed filename matching attempt. In all cases, the execution trace will contain single-quoted words, for example:
$ set -x
$ echo foo\;bar
+ echo 'foo;bar'
This is just an artifact of the way the shells implement the execution trace; it doesn't alter the way the arguments are ultimately passed to the command. The quotes are added, printed, and discarded. Here is the relevant part of the bash source code, print_cmd.c:
/* A function to print the words of a simple command when set -x is on. */
void
xtrace_print_word_list (list, xtflags)
...
{
...
for (w = list; w; w = w->next)
{
t = w->word->word;
...
else if (sh_contains_shell_metas (t))
{
x = sh_single_quote (t);
fprintf (xtrace_fp, "%s%s", x, w->next ? " " : "");
free (x);
}
As to why the authors chose to do this, the code there doesn't say. But here's some similar code in variables.c, and it comes with a comment:
/* Print the value cell of VAR, a shell variable. Do not print
the name, nor leading/trailing newline. If QUOTE is non-zero,
and the value contains shell metacharacters, quote the value
in such a way that it can be read back in. */
void
print_var_value (var, quote)
...
{
...
else if (quote && sh_contains_shell_metas (value_cell (var)))
{
t = sh_single_quote (value_cell (var));
printf ("%s", t);
free (t);
}
So possibly it's done so that it's easier to copy the command lines from the output of the execution trace and run them again.
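That guess is easy to try out (a sketch; the trace goes to stderr and the "+ " is bash's default PS4 prefix):

```shell
# Capture the xtrace output, strip the "+ " prefix, and feed it back to bash.
trace=$(mktemp)
bash -c 'set -x; echo foo\;bar' 2>"$trace" >/dev/null
cat "$trace"                     # the traced, single-quoted line
sed 's/^+ //' "$trace" | bash    # re-runs the command unchanged
```

The trace line is `+ echo 'foo;bar'`, and re-running it prints foo;bar, just as the original command did.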
Best Answer
Preamble
First, I'd say it's not the right way to address the problem. It's a bit like saying "you should not murder people because otherwise you'll go to jail".
Similarly, you don't quote your variable because otherwise you're introducing security vulnerabilities. You quote your variables because it is wrong not to (but if the fear of the jail can help, why not).
A little summary for those who've just jumped on the train.
In most shells, leaving a variable expansion unquoted (though that, and the rest of this answer, also applies to command substitution (`...` or $(...)) and arithmetic expansion ($((...)) or $[...])) has a very special meaning. The best way to describe it is that it is like invoking some sort of implicit split+glob operator¹: cmd $var in another language would be written something like cmd(glob(split(var))). $var is first split into a list of words according to complex rules involving the $IFS special parameter (the split part), and then each word resulting from that splitting is considered as a pattern which is expanded to a list of files that match it (the glob part).

As an example, if $var contains *.txt,/var/*.xml and $IFS contains ,, cmd would be called with a number of arguments, the first one being cmd and the next ones being the txt files in the current directory and the xml files in /var.

If you wanted to call cmd with just the two literal arguments cmd and *.txt,/var/*.xml, you'd write cmd "$var", which in your other, more familiar language would be cmd(var).
What do we mean by vulnerability in a shell?
After all, it's been known since the dawn of time that shell scripts should not be used in security-sensitive contexts. Surely, OK, leaving a variable unquoted is a bug but that can't do that much harm, can it?
Well, despite the fact that anybody would tell you that shell scripts should never be used for web CGIs, or that thankfully most systems don't allow setuid/setgid shell scripts nowadays, one thing that shellshock (the remotely exploitable bash bug that made the headlines in September 2014) revealed is that shells are still extensively used where they probably shouldn't: in CGIs, in DHCP client hook scripts, in sudoers commands, invoked by (if not as) setuid commands...
Sometimes unknowingly. For instance, system('cmd $PATH_INFO') in a php/perl/python CGI script does invoke a shell to interpret that command line (not to mention the fact that cmd itself may be a shell script and its author may have never expected it to be called from a CGI).

You've got a vulnerability when there's a path for privilege escalation, that is, when someone (let's call him the attacker) is able to do something he is not meant to.
Invariably that means the attacker providing data, that data being processed by a privileged user/process which inadvertently does something it shouldn't be doing, in most of the cases because of a bug.
Basically, you've got a problem when your buggy code processes data under the control of the attacker.
Now, it's not always obvious where that data may come from, and it's often hard to tell if your code will ever get to process untrusted data.
As far as variables are concerned, in the case of a CGI script it's quite obvious: the data are the CGI GET/POST parameters and things like cookies, path, host... parameters.
For a setuid script (running as one user when invoked by another), it's the arguments or environment variables.
Another very common vector is file names. If you're getting a file list from a directory, it's possible that files have been planted there by the attacker.
In that regard, even at the prompt of an interactive shell, you could be vulnerable (when processing files in /tmp or ~/tmp for instance). Even a ~/.bashrc can be vulnerable (for instance, bash will interpret it when invoked over ssh to run a ForcedCommand like in git server deployments, with some variables under the control of the client).

Now, a script may not be called directly to process untrusted data, but it may be called by another command that does. Or your incorrect code may be copy-pasted into scripts that do (by you 3 years down the line or by one of your colleagues). One place where it's particularly critical is in answers on Q&A sites, as you'll never know where copies of your code may end up.
Down to business; how bad is it?
Leaving a variable (or command substitution) unquoted is by far the number one source of security vulnerabilities associated with shell code. Partly because those bugs often translate to vulnerabilities but also because it's so common to see unquoted variables.
Actually, when looking for vulnerabilities in shell code, the first thing to do is look for unquoted variables. It's easy to spot, often a good candidate, generally easy to track back to attacker-controlled data.
There's an infinite number of ways an unquoted variable can turn into a vulnerability. I'll just give a few common trends here.
Information disclosure
Most people will bump into bugs associated with unquoted variables because of the split part (for instance, it's common for files to have spaces in their names nowadays and space is in the default value of IFS). Many people will overlook the glob part. The glob part is at least as dangerous as the split part.
Globbing done upon unsanitised external input means the attacker can make you read the content of any directory.
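For instance (a sketch where a scratch directory stands in for /home):

```shell
# An attacker-chosen value, expanded unquoted, is globbed and
# discloses a directory listing the script never meant to reveal.
home=$(mktemp -d)
mkdir "$home/alice" "$home/bob"
unsanitised_external_input="$home/*"   # imagine this came from a CGI parameter
echo $unsanitised_external_input       # unquoted: globbed, lists the directory
echo "$unsanitised_external_input"     # quoted: just the literal pattern
```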
In something like:

echo $unsanitised_external_input

if $unsanitised_external_input contains /*, that means the attacker can see the content of /. No big deal. It becomes more interesting though with /home/*, which gives you a list of user names on the machine, /tmp/*, /home/*/.forward for hints at other dangerous practises, /etc/rc*/* for enabled services... No need to name them individually. A value of /* /*/* /*/*/*... will just list the whole file system.

Denial of service vulnerabilities
Taking the previous case a bit too far and we've got a DoS.
Actually, any unquoted variable in list context with unsanitized input is at least a DoS vulnerability.
Even expert shell scripters commonly forget to quote things like:

: ${QUERYSTRING=$1}

(: is the no-op command. What could possibly go wrong?) That's meant to assign $1 to $QUERYSTRING if $QUERYSTRING was unset. That's a quick way to make a CGI script callable from the command line as well. That $QUERYSTRING is still expanded though, and because it's not quoted, the split+glob operator is invoked.

Now, there are some globs that are particularly expensive to expand. The /*/*/*/* one is bad enough, as it means listing directories up to 4 levels down. In addition to the disk and CPU activity, it means storing tens of thousands of file paths (40k here on a minimal server VM, 10k of which are directories). Now, /*/*/*/*/../../../../*/*/*/* means 40k x 10k, and /*/*/*/*/../../../../*/*/*/*/../../../../*/*/*/* is enough to bring even the mightiest machine to its knees. Try expanding one of those for yourself (though be prepared for your machine to crash or hang).
Of course, if the code goes on to write the result of that expansion somewhere, as in echo $QUERYSTRING > some_file, then you can fill up the disk.
Just do a google search on shell cgi or bash cgi or ksh cgi, and you'll find a few pages that show you how to write CGIs in shells. Notice how half of those that process parameters are vulnerable.
Even David Korn's own one is vulnerable (look at the cookie handling).
up to arbitrary code execution vulnerabilities
Arbitrary code execution is the worst type of vulnerability, since if the attacker can run any command, there's no limit on what he may do.
That's generally the split part that leads to those. The splitting results in several arguments being passed to commands when only one is expected. While the first of those will be used in the expected context, the others will be in a different context, so potentially interpreted differently. Better with an example:

awk -v foo=$external_input '...'

Here, the intention was to assign the content of the $external_input shell variable to the foo awk variable. But if that input contains blanks, the second word resulting from the splitting of $external_input is not assigned to foo but is considered as awk code (here, code that executes an arbitrary command: uname).

That's especially a problem for commands that can execute other commands (awk, env, sed (the GNU one), perl, find...), especially with the GNU variants (which accept options after arguments). Sometimes, you wouldn't suspect commands of being able to execute others, like ksh, bash or zsh's [ or printf...

Take a test like [ -d $dir ]. If we create a directory called x -o yes, then the test becomes positive, because it's a completely different conditional expression we're evaluating. Worse, if we create a file called x -a a[0$(uname>&2)] -gt 1, then with all ksh implementations at least (which includes the sh of most commercial Unices and some BSDs), that executes uname, because those shells perform arithmetic evaluation on the numerical comparison operators of the [ command. Same with bash for a filename like x -a -v a[0$(uname>&2)].

Of course, if they can't get arbitrary execution, the attacker may settle for lesser damage (which may help to get arbitrary execution). Any command that can write files or change permissions or ownership, or that has any main or side effect, could be exploited.
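The x -o yes trick can be reproduced harmlessly (a sketch; the directory doesn't even have to exist):

```shell
# [ -d $dir ] with dir='x -o yes' becomes [ -d x -o yes ]:
# "-d x" is false, but "-o yes" (non-empty string) makes the whole test true.
dir='x -o yes'
[ -d $dir ] && echo "claims the directory exists"
[ -d "$dir" ] || echo "quoted: correctly false"
```

Both lines are printed: the unquoted test is positive for a directory that was never created, while the quoted one behaves as intended.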
All sorts of things can be done with file names. With a file called something like -R .., an unquoted chmod go+w $file splits into chmod go+w -R .., and you end up making .. writeable (recursively with GNU chmod). Scripts doing automatic processing of files in publicly writable areas like /tmp are to be written very carefully.

What about [ $# -gt 1 ]?
That's something I find exasperating. Some people go to all the trouble of wondering whether a particular expansion may be problematic in order to decide whether they can omit the quotes.
It's like saying: hey, it looks like $# cannot be subject to the split+glob operator, so let's ask the shell to split+glob it. Or: hey, let's write incorrect code just because the bug is unlikely to be hit.

Now, how unlikely is it? OK, $# (or $!, $? or any arithmetic substitution) may only contain digits (or - for some²), so the glob part is out. For the split part to do something, though, all we need is for $IFS to contain digits (or -).

With some shells, $IFS may be inherited from the environment, but if the environment is not safe, it's game over anyway.

Now if you write a function like:

fn() {
  [ $# -eq 2 ] || return
  ...
}

what that means is that the behaviour of your function depends on the context in which it is called. Or in other words, $IFS becomes one of the inputs to it. Strictly speaking, when you write the API documentation for your function, it should include something like "the caller must make sure $IFS doesn't contain digits", and code calling your function needs to guarantee that. All that because you didn't feel like typing those 2 double-quote characters.

Now, for that [ $# -eq 2 ] bug to become a vulnerability, you'd need the value of $IFS to somehow come under the control of the attacker. Conceivably, that would not normally happen unless the attacker managed to exploit another bug.

That's not unheard of though. A common case is when people forget to sanitize data before using it in an arithmetic expression. We've already seen above that it can allow arbitrary code execution in some shells, but in all of them, it allows the attacker to give any variable an integer value.
For instance, with code along the lines of:

n=$(($1 + 1))
[ $# -gt $n ] && echo >&2 "too many arguments" && exit 1

and a $1 with the value (IFS=-1234567890), that arithmetic evaluation has the side effect of setting IFS, and the next [ command fails, which means the check for too many args is bypassed.

What about when the split+glob operator is not invoked?
There's another case where quotes are needed around variables and other expansions: when the expansion is used as a pattern.
Constructs like [[ $a = $b ]] (or case $a in ($b) ...) do not test whether $a and $b are the same (except with zsh) but whether $a matches the pattern in $b. And you need to quote $b if you want to compare them as strings (same thing in "${a#$b}" or "${a%$b}" or "${a##*$b*}", where $b should be quoted if it's not to be taken as a pattern).

What that means is that [[ $a = $b ]] may return true in cases where $a is different from $b (for instance when $a is anything and $b is *) or may return false when they are identical (for instance when both $a and $b are [a]).

Can that make for a security vulnerability? Yes, like any bug. Here, the attacker can alter your script's logical code flow and/or break the assumptions your script is making. For instance, with code that uses [[ $1 = $2 ]] to check that the two arguments are the same, the attacker can bypass the check by passing '[a]' '[a]'.
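A quick bash illustration of both failure directions (a sketch):

```shell
# The right-hand side of [[ ... = ... ]] is a pattern unless quoted.
bash <<'EOF'
a=anything b='*'
[[ $a = $b ]] && echo "1: different strings, but pattern * matches anything"
a='[a]' b='[a]'
[[ $a = $b ]] || echo "2: identical strings, but pattern [a] only matches a"
[[ $a = "$b" ]] && echo "3: quoted: compared as plain strings"
EOF
```

All three lines are printed: the unquoted form misreports both cases, and quoting $b restores plain string comparison.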
Now, if neither that pattern matching nor the split+glob operator apply, what's the danger of leaving a variable unquoted?
I have to admit that I do write things like a=$b without the quotes. There, quoting doesn't harm but is not strictly necessary.

However, one side effect of omitting quotes in those cases (for instance in Q&A answers) is that it can send a wrong message to beginners: that it may be all right not to quote variables. For instance, they may start thinking that if a=$b is OK, then export a=$b would be as well (which it's not in many shells, as there it's in arguments to the export command, so in list context), or env a=$b.
What about zsh?

zsh did fix most of those design awkwardnesses. In zsh (at least when not in sh/ksh emulation mode), if you want splitting, or globbing, or pattern matching, you have to request it explicitly: $=var to split, and $~var to glob or for the content of the variable to be treated as a pattern.

However, splitting (but not globbing) is still done implicitly upon unquoted command substitution (as in echo $(cmd)).
Also, a sometimes unwanted side effect of not quoting a variable is the removal of empties. The zsh behaviour is similar to what you can achieve in other shells by disabling globbing altogether (with set -f) and splitting (with IFS=''). Still, in:

cmd $var

there will be no split+glob, but if $var is empty, instead of receiving one empty argument, cmd will receive no argument at all. That can cause bugs (like the obvious [ -n $var ]). It can possibly break a script's expectations and assumptions and cause vulnerabilities. As an empty variable can cause an argument to be simply removed, the next argument could be interpreted in the wrong context.
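The [ -n $var ] bug looks like this in any Bourne-like shell (a sketch):

```shell
# With $var empty, the unquoted argument disappears entirely, so [ is
# left testing the literal string "-n", which is non-empty: always true.
var=''
[ -n $var ] && echo "claims non-empty"
[ -n "$var" ] || echo "correct: it is empty"
```

Both lines print: the unquoted test always succeeds, regardless of the variable's value.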
As an example, take something like:

printf '%d %s\n' $attacker_supplied1 $attacker_supplied2

If $attacker_supplied1 is empty, then $attacker_supplied2 will be interpreted as an arithmetic expression (for %d) instead of a string (for %s), and any unsanitized data used in an arithmetic expression is a command injection vulnerability in Korn-like shells such as zsh. With a benign numeric $attacker_supplied1 that's fine, but with an empty one and an $attacker_supplied2 containing something like x[$(uname>&2)], the uname arbitrary command is run.

What about when you do need the split+glob operator?
Yes, that's typically when you do want to leave your variable unquoted. But then you need to make sure you tune your split and glob operators correctly before using it. If you only want the split part and not the glob part (which is the case most of the time), then you do need to disable globbing (set -o noglob/set -f) and fix $IFS. Otherwise you'll cause vulnerabilities as well (like David Korn's CGI example mentioned above).

Conclusion
In short, leaving a variable (or command substitution or arithmetic expansion) unquoted in shells can be very dangerous indeed, especially when done in the wrong contexts, and it's very hard to know which those wrong contexts are.
That's one of the reasons why it is considered bad practice.
Thanks for reading so far. If it goes over your head, don't worry. One can't expect everyone to understand all the implications of writing their code the way they write it. That's why we have good practice recommendations, so they can be followed without necessarily understanding why.
(and in case that's not obvious yet, please avoid writing security sensitive code in shells).
And please quote your variables on your answers on this site!
¹ In ksh93 and pdksh and derivatives, brace expansion is also performed unless globbing is disabled (in the case of ksh93 versions up to ksh93u+, even when the braceexpand option is disabled).

² In ksh93 and yash, arithmetic expansions can also include things like 1,2, 1e+66, inf, nan. There are even more in zsh, including # which is a glob operator with extendedglob, but zsh never does split+glob upon arithmetic expansion, even in sh emulation.