`ssh` has an annoying feature: when you run

```
ssh user@host cmd and "here's" "one arg"
```

instead of running that `cmd` with its arguments on `host`, it concatenates the `cmd` and arguments with spaces and runs a shell on `host` to interpret the resulting string (I guess that's why it's called `ssh` and not `sexec`).
Worse, you don't know which shell is going to interpret that string, as that's the login shell of `user`, which is not even guaranteed to be Bourne-like: there are still people using `tcsh` as their login shell, and `fish` is on the rise.
Is there a way around that? Suppose I have a command as a list of arguments stored in a `bash` array, each of which may contain any sequence of non-null bytes. Is there any way to have it executed on `host` as `user` in a consistent way, regardless of the login shell of that `user` on `host` (which we'll assume is one of the major Unix shell families: Bourne, csh, rc/es, fish)?

Another reasonable assumption I should be able to make is that there is a `sh` command on `host`, available in `$PATH`, that is Bourne-compatible.
Example:

```
cmd=(
  'printf'
  '<%s>\n'
  'arg with $and spaces'
  '' # empty
  $'even\n* * *\nnewlines'
  "and 'single quotes'"
  '!!'
)
```
I can run it locally with `ksh`/`zsh`/`bash`/`yash` as:

```
$ "${cmd[@]}"
<arg with $and spaces>
<>
<even
* * *
newlines>
<and 'single quotes'>
<!!>
```

or `env "${cmd[@]}"`, or `xterm -hold -e "${cmd[@]}"`...
How would I run it on `host` as `user` over `ssh`?

`ssh user@host "${cmd[@]}"` obviously won't work.

`ssh user@host "$(printf ' %q' exec "${cmd[@]}")"` would only work if the login shell of the remote user were the same as the local shell (or understood quoting the same way as the local shell's `printf %q` produces it) and ran in the same locale.
Best Answer
I don't think any implementation of `ssh` has a native way to pass a command from client to server without involving a shell.

Now, things can get easier if you can tell the remote shell to only run a specific interpreter (like `sh`, for which we know the expected syntax) and pass the code to execute by some other means. That other means can be, for instance, standard input or an environment variable. When neither can be used, I propose a hacky third solution below.
Using stdin
If you don't need to feed any data to the remote command, that's the easiest solution.

If you know the remote host has an `xargs` command that supports the `-0` option and the command is not too large, you can pipe the arguments to it NUL-delimited, as in `printf '%s\0' "${cmd[@]}" | ssh user@host 'xargs -0 env --'`. That `xargs -0 env --` command line is interpreted the same way by all those shell families: `xargs` reads the null-delimited list of arguments on stdin and passes them as arguments to `env`. That assumes the first argument (the command name) does not contain `=` characters.

Or you can use `sh` on the remote host after having quoted each element using `sh` quoting syntax.
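Concretely, here is a sketch of that `xargs` approach, assuming a local `bash` and a remote `xargs` supporting `-0`. The `ssh` leg is left as a comment (with `user@host` as a placeholder) so the rest runs locally:

```shell
# The command as an array, each element an arbitrary string
cmd=('printf' '<%s>\n' 'arg with $and spaces' '' "and 'single quotes'")

# Send the elements NUL-delimited; xargs -0 rebuilds the argument
# list (preserving empty arguments) and env runs the command.
printf '%s\0' "${cmd[@]}" | xargs -0 env --

# Over ssh it would be (placeholder host, not run here):
# printf '%s\0' "${cmd[@]}" | ssh user@host 'xargs -0 env --'
```

`env --` is used rather than running the command directly so that `xargs` does not re-quote or interpret anything: the argument list arrives on the remote side exactly as it left the array.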
Using environment variables

Now, if you do need to feed some data from the client to the remote command's stdin, the above solution won't work.

Some `ssh` server deployments, however, allow passing arbitrary environment variables from the client to the server. For instance, many OpenSSH deployments on Debian-based systems allow passing variables whose names start with `LC_`.

In those cases, you could have an `LC_CODE` variable, for instance, containing the sh-quoted `sh` code as above, and run `sh -c 'eval "$LC_CODE"'`
on the remote host after having told your client to pass that variable (again, that's a command line that's interpreted the same in every shell).
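A sketch of that approach, assuming a `bash` client (`shquote` is a helper name made up for this sketch, and the `ssh` line is a commented placeholder that additionally requires a matching `AcceptEnv LC_CODE` on the server):

```shell
# Quote each argument in sh single-quote syntax, so that plain sh
# can later eval the result back into the original argument list.
shquote() {
  for a in "$@"; do
    printf "'"
    printf '%s' "$a" | sed "s/'/'\\\\''/g"   # embed ' as '\''
    printf "' "
  done
}

cmd=('printf' '<%s>\n' 'arg with $and spaces' "and 'single quotes'")

# LC_CODE carries the sh-quoted command; the remote command line
# sh -c 'eval " $LC_CODE"' parses identically in all the shell
# families considered here.
LC_CODE=$(shquote "${cmd[@]}")
export LC_CODE

# Locally, this is what the remote sh would end up doing:
sh -c 'eval " $LC_CODE"'

# Over ssh it might look like (not run here):
# ssh -o SendEnv=LC_CODE user@host 'sh -c '\''eval " $LC_CODE"'\'''
```

The leading space in `eval " $LC_CODE"` guards against the code starting with a `-` being taken as an option by some `eval` implementations.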
Building a command line compatible with all shell families

If none of the options above are acceptable (because you need stdin and sshd doesn't accept any variable, or because you need a generic solution), then you'll have to prepare a command line for the remote host that is compatible with all supported shells.

That is particularly tricky because all those shells (Bourne, csh, rc, es, fish) have their own different syntax, in particular different quoting mechanisms, and some of them have limitations that are hard to work around.

Here is a solution I came up with; I describe it further down.
That's a `perl` wrapper script around `ssh`; I call it `sexec`. You call it like `ssh`, as `sexec user@host cmd and its args`, so in your example: `sexec user@host "${cmd[@]}"`. The wrapper turns `cmd` and its args into a command line that all shells end up interpreting as calling `cmd` with its args (regardless of their content).

Limitations:
- If `yash` is the remote login shell, you can't pass a command whose arguments contain invalid characters, but that's a limitation in `yash` that you can't work around anyway.
- Besides `sh`, it also assumes the remote system has the `printf` command.

To understand how it works, you need to know how quoting works in the different shells:
- Bourne: `'...'` are strong quotes with no special character in them; `"..."` are weak quotes where `"` can be escaped with backslash.
- csh: same as Bourne except that `"` cannot be escaped inside `"..."`; also, a newline character has to be entered prefixed with a backslash, and `!` causes problems even inside single quotes.
- rc: the only quotes are `'...'` (strong); a single quote within single quotes is entered as `''` (as in `'...''...'`); double quotes and backslashes are not special.
- es: same as rc except that, outside quotes, backslash can escape a single quote.
- fish: same as Bourne except that backslash escapes `'` inside `'...'`.

With all those constraints, it's easy to see that one cannot reliably quote command-line arguments so that it works with all shells.
Using single quotes around each argument (escaping embedded single quotes as `'\''`) works in all but:

- rc, where the `'\''` escape would not work (backslash is not special there);
- csh, where arguments containing newline or `!` characters would not work;
- fish, where arguments containing backslashes would not work.
However, we should be able to work around most of those problems if we manage to store those problematic characters in variables, like backslash in `$b`, single quote in `$q`, newline in `$n` (and `!` in `$x` for csh history expansion), in a shell-independent way; then something like `'It'$q's here'` would work in all shells. That would still not work for newline in csh, though: if `$n` contains a newline, in csh you have to write it as `$n:q` for it to expand to a newline, and that won't work for the other shells. So what we end up doing here instead is calling `sh` and having `sh` expand those `$n`. That also means having to do two levels of quoting: one for the remote login shell, and one for `sh`.
The `$preamble` in that code is the trickiest part. It makes use of the various different quoting rules in all those shells to have some sections of the code interpreted by only one of the shells (while commented out for the others), each of which just defines those `$b`, `$q`, `$n`, `$x` variables for its respective shell.

Here's the shell code that would be interpreted by the login shell of the remote user on `host` for your example; that code ends up running the same command when interpreted by any of the supported shells.
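To illustrate the variable trick in Bourne syntax alone (a sketch; the real preamble has to define these same variables in every shell's own syntax simultaneously, which is the hard part):

```shell
# Store the troublesome characters in variables (sh syntax only).
q=\'     # single quote
b=\\     # backslash
n='
'        # newline
x=!      # exclamation mark (a problem character for csh history expansion)

# Arguments are then assembled by expanding the variables instead of
# writing the problematic characters literally inside the quotes:
printf '<%s>\n' "and ${q}single quotes${q}" "back${b}slash" "two${n}lines"
```

That `printf` prints `<and 'single quotes'>`, `<back\slash>`, and a `<two`/`lines>` pair split across two lines, without a literal quote, backslash, or newline ever appearing inside the quoted source text.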