Bash – Is it Always Safe to Use `eval echo`?

Using `eval` is often discouraged because it allows execution of arbitrary code. However, if we use `eval echo`, then it looks like the rest of the string will become arguments of `echo`, so it should be safe. Am I correct on this?
Related Solutions
Your code should be safe, as `echo` won't show up in the process table since it's a shell built-in.
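Since the claim hinges on `echo` being a built-in, it's easy to check in your own shell; a minimal sketch:

```shell
# Ask bash what kind of command "echo" is; for a built-in,
# bash reports "echo is a shell builtin".
bash -c 'type echo'
```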
Here's an alternative solution:
```bash
#!/bin/bash
n=20
paste -d : <( seq -f 'student%.0f' 1 "$n" ) \
           <( tr -cd 'A-Za-z0-9' </dev/urandom | fold -w 13 | head -n "$n" ) |
tee secret.txt | chpasswd
```
This creates your student names and passwords, `n` of them, without passing any passwords on the command line of any command.

The `paste` utility glues together several files as columns and inserts a delimiter in between them. Here, we use `:` as the delimiter and give it two "files" (process substitutions). The first one contains the output of a `seq` command that creates 20 student usernames, and the second contains the output of a pipeline that creates 20 random strings of length 13.
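The `paste` pattern is easier to see on toy data; here is a minimal sketch, with two short process substitutions standing in for the username and password streams:

```shell
# Glue two "files" (process substitutions) column-wise, with ":" as delimiter.
paste -d : <(printf 'alice\nbob\n') <(printf 'pw1\npw2\n')
# alice:pw1
# bob:pw2
```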
If you have a file with usernames already generated:

```bash
#!/bin/bash
n=$(wc -l <usernames.txt)
paste -d : usernames.txt \
           <( tr -cd 'A-Za-z0-9' </dev/urandom | fold -w 13 | head -n "$n" ) |
tee secret.txt | chpasswd
```
This saves the usernames and passwords to the file `secret.txt` instead of showing the generated passwords in the terminal.
You don't need `eval` here. You can just use `"$@"`:

```bash
while
    "${@}"
    exitcode="${?}"
    [[ "${exitcode}" -ne 0 && "${try}" -lt "${tryMax}" ]]
do ...
```
`"$@"` will expand to all the arguments to the script as separate "words", respecting the original quoting that prevented word splitting, and then leave you with the first argument as the command waiting to run (`cat`) and the remaining arguments as arguments to `cat` (`docu ment`).
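To see that `"$@"` keeps each original argument as a single word, here is a small hypothetical test script (not from the original answer):

```shell
#!/bin/bash
# printargs.sh (hypothetical): print each argument on its own line,
# wrapped in angle brackets so embedded spaces are visible.
for arg in "$@"; do
    printf '<%s>\n' "$arg"
done
```

Calling it as `./printargs.sh cat 'docu ment'` prints `<cat>` and `<docu ment>`: the space inside `docu ment` survives as part of one word.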
Where this won't work:

- If you want the command passed in to use other higher-level shell constructs, like pipes, function definitions, loops, etc. These are all processed before parameter expansion, and won't be attempted again after `"$@"` is expanded.
- If the command has its return code negated: `! cmd`. `!` is also processed at the start of handling a command, before parameter expansion.
- If the command is multiple commands: `x \; y` or `$'x\ny'` or `x $'\n' y`, or the same with `&&` or `||`. All those are just regular arguments.
- If your command has variable assignments preceding it, like `LD_LIBRARY_PATH=/x foo`. You can put them before the script name, but not before the argument command.
- If the command has redirections (`>foo`, `3<bar`) in it. These may or may not be able to be affixed to the script, since the script has its own logging output.
- If the command has here-documents or here-strings. These will only be readable one time if affixed to the script itself, so depending on exactly when the command fails you might or might not be all right. These would be very difficult to pass in suitably for `eval` anyway.
- If the command is a subshell `( ... )` or command group `{ ... ; }`. These will be treated as commands called `(` and `{`, not as syntactic constructs.
- If the command contains command substitution `$(...)` that needs to be run repeatedly. You can use that (or any other shell construct) to generate the original arguments, but they will all be fixed strings once the script starts running.
- If the command has any other element that should be evaluated repeatedly, like `$RANDOM` or an arithmetic expansion `$((i++))`.
- If the command is `time`. This is a shell reserved word, not a built-in command, so it is also processed before parameter expansion.
Otherwise, however, you can successfully avoid `eval` entirely, and should probably do so. It's very fragile to construct correctly, even ignoring any possible security issues.
Best Answer
Counterexample:
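The counterexample's code is not reproduced here; one hypothetical reconstruction that matches the description is an argument containing a command substitution, which `eval`'s second round of parsing executes:

```shell
# The single quotes stop the substitution on the first parse, but eval
# concatenates its arguments and parses the result again, so
# $(touch foo) runs and creates the file.
eval echo '$(touch foo)'
ls foo
# ls prints: foo
```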
The arbitrary arguments to `echo` could have done something more nefarious than creating a file called `foo`.