Bash: echo the command line executed, at the command line itself (not in a script)

bash, command-line, echo, io-redirection

For documentation purposes, I want to redirect the stdout and stderr of a command I execute to a file.
For instance, I would run (my command is less trivial than the alias ll, but that probably doesn't matter):

$ ll > out-err.dat 2>&1
$ cat out-err.dat
drwxr-xr-x 39 us00001 us00001    4096 jul 31 14:57 ./
drwxr-xr-x  3 root    root       4096 feb  2 06:06 ../
-rw-------  1 us00001 us00001   62226 jul 31 11:56 .bash_history
...

Also for documentation purposes, I want to store the command line I used in the same output file.
The intended behaviour and output are:

$ [NEW COMMAND LINE]?
$ cat out-err.dat
[NEW COMMAND LINE]   <- This first line would contain the command line used
drwxr-xr-x 39 us00001 us00001    4096 jul 31 14:57 ./
drwxr-xr-x  3 root    root       4096 feb  2 06:06 ../
-rw-------  1 us00001 us00001   62226 jul 31 11:56 .bash_history
...

How can this be done?
I know I could write a bash script and execute it, so the command would remain documented separately.
I could go further and write a script that echoes the command line to the file and then executes the command with redirection to the same file (sketched below for illustration).
I am looking for a possible script-less solution.
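
For illustration, such a script could look something like this (the name log_run.sh and its argument order are just my own sketch):

#!/bin/bash
# log_run.sh: write the command line to the log file,
# then run the command with stdout and stderr appended to the same file.
logfile=$1; shift
printf '%s\n' "$*" > "$logfile"
"$@" >> "$logfile" 2>&1

It would be invoked as, for example, ./log_run.sh out-err.dat ls -la.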


EDIT:
Feedback on a nice answer.
This wouldn't fit as a comment.

I tested with the command echo_command ll echo_command.sh ../dir > out-err.dat 2>&1.

The script echo_command.sh, which I source, contains the function definitions.

../dir is a non-existent directory, to force some output to stderr.

Method 1:
It works nicely, except for two issues:

  • It doesn't expand aliases (ll in this case; replacing it with ls worked). A possible workaround is sketched after this list.

  • It doesn't record the redirection part.
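
A possible workaround for the alias issue (my own sketch, assuming ll stands for something like ls -alF): define ll as a shell function instead of an alias, so that "${@}" inside echo_command can find and run it.

# Hypothetical replacement for the ll alias; adjust the options to match your alias.
ll() { ls -alF "$@"; }

echo_command ll echo_command.sh ../dir > out-err.dat 2>&1

This only addresses the alias issue; the redirection part is still not recorded.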

Method 2:
It doesn't work as well: the redirection part is now also printed, but the command line is printed to the screen instead of being redirected to the file.
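
A possible variation I could use with Method 2 (my own sketch, not part of the answer): append from both the outer call and the inner redirection, so the Command line: header and the command's output end up in the same file, in order (ls is used here to sidestep the alias issue).

: > out-err.dat                       # start with an empty log file
echo_command 'ls echo_command.sh ../dir >> out-err.dat 2>&1' >> out-err.dat 2>&1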


EDIT:
Feedback on a posted comment about the script utility.
It is quite versatile, even more so with scriptreplay.

script can be called on its own, which starts an interactive shell (it doesn't keep the recent history of the parent shell).

It can also be called as script -c <command> <logfile>. This last form matches the objective of the OP, but it doesn't store the command itself in the log file. It produces (at least in simple cases) the same output as <command> > <logfile> 2>&1.
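
For reference, that form looks like this with the files from my test above (the log file name is just an example):

script -c 'ls -la echo_command.sh ../dir' out-err.dat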

So it seems this is not useful here.

Best Answer

You could use a function like this:

# Print all the arguments on one line, then execute them as a command.
echo_command() { printf '%s\n' "${*}"; "${@}"; }

Example:

$ echo_command echo foo bar baz
echo foo bar baz
foo bar baz
$ echo_command uname
uname
Linux
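
Applied to the question's use case, the whole call can itself be redirected, so that the printed command line lands in the same file as the output (a sketch; ls is used because aliases such as ll are not expanded inside the function):

$ echo_command ls -la > out-err.dat 2>&1
$ head -n 1 out-err.dat
ls -la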

As Tim Kennedy said, there's also a very useful script command:

$ script session.log
Script started, file is session.log
$ echo foo
foo
$ echo bar
bar
$ echo baz
baz
$ exit
exit
Script done, file is session.log
$ cat session.log
Script started on 2018-07-31 16:30:31-0500
$ echo foo
foo
$ echo bar
bar
$ echo baz
baz
$ exit
exit

Script done on 2018-07-31 16:30:43-0500

Update

If you also need to log the redirections, and basically any other shell syntax (note that I added a little Command line: prefix to easily identify the command being executed):

echo_command() {
  local arg
  # Each argument is treated as a complete command line:
  # print it first, then let the shell evaluate it (including any redirections).
  for arg; do
    printf 'Command line: %s\n' "${arg}"
    eval "${arg}"
  done
}

Just take into account that you have to be very careful with quoting, since eval is used:

$ echo_command 'echo foo > "file 2"; cat "file 2"'
Command line: echo foo > "file 2"; cat "file 2"
foo
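
For contrast, if the command string is not quoted as a single argument, each word becomes a separate argument and the redirection applies to echo_command itself instead of being recorded (a hypothetical run; the error about foo not being a valid command goes to the terminal):

$ echo_command echo foo > "file 2"
$ cat "file 2"
Command line: echo

Command line: foo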

It also accepts many commands at once instead of only one:

$ echo_command 'echo foo' 'echo bar' 'echo baz'
Command line: echo foo
foo
Command line: echo bar
bar
Command line: echo baz
baz