You can load a custom file into your Bash history using the history builtin. Running history -r will read the file you name in as your command history, or the file named in HISTFILE if you don't name one:
history -r ~/custom
After this you can use any of the ordinary actions that access the history with the data from your custom file, including Ctrl-R reverse search, ! history expansion, and so forth.
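As a minimal sketch of the round trip (the file name and the two sample commands here are illustrative, not from the original):

```shell
# Build a small custom history file; 'df -h' and 'uptime' are just examples.
printf '%s\n' 'df -h' 'uptime' > ~/custom
# Load it into the current session's in-memory history.
history -r ~/custom
```

After the history -r, pressing Ctrl-R and typing "upt" would find the uptime entry.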
If you only want this enabled temporarily, you can save your current history, read the custom file, and then read the ordinary file back at the end:
$ history -w # Or -a if you prefer
$ history -r ~/custom
...
$ history -cr
The final history -cr clears the history in memory and reads it in fresh from the standard history file, $HISTFILE. You can also make aliases to shorten this process.
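Such aliases might look like the following ~/.bashrc fragment; the names are hypothetical, pick whatever suits you:

```shell
# Hypothetical aliases for switching history files; names are illustrative.
alias histsave='history -w'           # write the current session history out
alias histload='history -r ~/custom'  # pull the custom file into memory
alias histreset='history -cr'         # clear and reload from $HISTFILE
```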
Alternatively, you can add computed values to the history using history -s. The following is equivalent to just reading the file (IFS= and read -r keep whitespace and backslashes in each line intact):
while IFS= read -r command
do
    history -s "$command"
done < ~/custom
You can add as many arbitrary history entries as you want in this way, computed from whatever data you have available (e.g., read a list of server names and ports and use history -s "ssh -p $port $servername"). This works well in a function. The same options to write, clear, and restore history apply here too.
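A sketch of that function idea follows; the function name and the "hostname port" file format are assumptions for illustration:

```shell
# Hypothetical: turn a file of "hostname port" lines into ready-made
# ssh commands in the session history.
load_ssh_history() {
    local servername port
    while IFS=' ' read -r servername port; do
        history -s "ssh -p $port $servername"
    done < "${1:-servers.txt}"   # default file name is an assumption
}
```

Call it as load_ssh_history myservers.txt, then recall the entries with Ctrl-R or the arrow keys.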
As a final option, if you set HISTFILE in the environment when starting bash, it will honour that file as your history. You can launch a new shell from a script with HISTFILE set appropriately to let the user pick a command:
HISTFILE=~/custom bash
With careful construction you could make that shell terminate immediately after running the selected command.
Moving outside Bash itself, rlwrap is a tool that wraps any other command-line tool in Readline. rlwrap -H ~/custom cmd... will run the command with the given file as its history. You can build that invocation to do exactly what you want, including selecting only the server name with user input and building the command up afterwards.
Just add the base64 encoding of a newline (Cg==) after each file name and pipe the whole thing to base64 -d:
find . -name "*_*" -printf "%f\n" |
sed -n 's/_....-..-..\.pdf$/Cg==/p' |
base64 -d
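To see the trick in isolation, here is the same sed and base64 -d stage fed two made-up names in place of the find output (Zm9v and YmFy are the base64 encodings of "foo" and "bar"; the dates are arbitrary). GNU base64 -d ignores the newlines and decodes the concatenated chunks:

```shell
# Stand-in for find's output: base64-encoded names with a date suffix.
printf '%s\n' 'Zm9v_2024-01-15.pdf' 'YmFy_2024-02-01.pdf' |
  sed -n 's/_....-..-..\.pdf$/Cg==/p' |   # swap the suffix for an encoded newline
  base64 -d                               # decode names, newline-separated
```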
With your approach, that would have to be something like:
find . -name "*_*" -printf "%f\0" |
sed -zn 's/_....-..-..\.pdf$//p' |
xargs -r0 sh -c '
for i do
echo "$i" | base64 -d
done' sh
as you need a shell to create those pipelines. But that would mean running several commands per file, which would be quite inefficient.
Best Answer
You can get the last command in your history with Bash's !! history expansion and use echo -n to print that command without a newline character at the end:
echo -n !!
The !! will expand to the actual command string, and -n makes sure the output contains no trailing newline character.