This really depends on who you are trying to prevent from reading the script and what resources you are expecting the system to have.
One option is to simply use many different programs to do different parts of your script: shell, awk, sed, perl, etc. as well as lots of obscure parameters of tools, forcing the reader to constantly refer to man pages.
Even within a shell, you can create unnecessary functions and variables, making them interdependent in confusing ways. And, of course, give them misleading names.
More complicated: you can append binary data to the end of your shell script and have the script extract and execute the binary. I believe nVidia's Linux drivers and Sun's JDK are installed this way (the binary data is an RPM, which the shell extracts and installs). Another example I just downloaded the other day is the soapUI program.
In that vein, it is possible to have a text file that can be compiled or interpreted in multiple languages, so it could start as a shell script, compile itself as a C program and execute the result. The IOCCC has some examples.
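A minimal sketch of the self-extracting trick described above. Everything specific here (the `__ARCHIVE__` marker, the `install.sh` entry point, using a gzipped tarball as the payload) is made up for illustration; real installers like the nVidia one use their own conventions:

```shell
#!/bin/sh
# Sketch of a self-extracting script. Everything after the __ARCHIVE__
# marker line is binary payload (a gzipped tarball in this example);
# the shell part finds the marker and pipes the rest into tar.
line=$(awk '/^__ARCHIVE__$/ { print NR + 1; exit }' "$0")
tmpdir=$(mktemp -d)
tail -n +"$line" "$0" | tar -xzf - -C "$tmpdir"
sh "$tmpdir/install.sh"   # run whatever was packed inside
exit 0                    # never let the shell parse the binary data
__ARCHIVE__
```

A build step then appends the archive after the marker, e.g. `cat extractor.sh payload.tar.gz > installer.sh`.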
The uninterpreted shell arguments are $1, $2, etc. You need to put their expansion in double quotes in most contexts, to avoid the value of the parameter being expanded further. "$@" gives you the list of all parameters.
For example, if you want to pass an argument of the shell script to your function, call it like this:
first_argument_as_filename_in_unix_syntax=$(posix "$1")
The double quotes are necessary. If you write posix $1, then what you're passing is not the value of the first parameter but the result of performing word splitting and globbing on the value of the first parameter. You will need to use proper quoting when calling the script, too. For example, if you write this in bash:
myscript c:\path with spaces\somefile
then the actual, uninterpreted arguments to myscript will be c:path, with and spacessomefile. So don't do this.
Your posix function is wrong, again because it lacks double quotes around $1. Always put double quotes around variable and command substitutions: "$foo", "$(foo)". It's easier to remember this rule than the exceptions where you don't actually need the quotes.
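A quick demonstration of what the quotes change (the path is a made-up example). printf prints each of its arguments between angle brackets, so you can count how many arguments the command actually received:

```shell
f='c:\path with spaces\somefile'
printf '<%s>\n' $f     # unquoted: word splitting -> <c:\path> <with> <spaces\somefile>
printf '<%s>\n' "$f"   # quoted: one argument -> <c:\path with spaces\somefile>
```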
echo does its own processing in some cases, and calling external processes is slow (especially on Windows). You can do the whole processing inside bash.
posix () {
  p="${1//\\//}"                  # replace every backslash with a slash
  case "$p" in
    ?:*)                          # leading drive letter, e.g. c:/...
      drive="${p:0:1}"
      drive="${drive,}"           # lowercase the drive letter (bash >= 4)
      p="/$drive${p:2}";;         # c:/foo -> /c/foo
  esac
  printf %s "$p"
}
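A self-contained check of the conversion (the function is repeated here with one consistent variable name so the snippet runs on its own; the ${drive,} lowercasing needs bash 4 or later):

```shell
# Convert a Windows path (c:\foo) to a Unix-style path (/c/foo),
# assuming the path after the drive letter starts with a backslash.
posix () {
  p="${1//\\//}"
  case "$p" in
    ?:*) drive="${p:0:1}"; drive="${drive,}"; p="/$drive${p:2}";;
  esac
  printf '%s\n' "$p"
}
posix 'c:\path with spaces\somefile'   # prints /c/path with spaces/somefile
```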
The zsh feature that jw013 alluded to doesn't do what you seem to think it does. You can put noglob in front of a command, and zsh does not perform globbing (i.e. filename generation, i.e. expansion of wildcards) on the arguments. For example, in zsh, if you write noglob locate *foo*bar*, then locate is called with the argument *foo*bar*. You'd typically hide the noglob builtin behind an alias. This feature is irrelevant for what you're trying to do.
Best Answer
It's impossible to generically “bypass the pipe”, as in, send output to wherever the output would go if there wasn't a pipe. However, it's possible to bypass the pipe in the sense of redirecting output to another place of your choice, such as the terminal. The special file /dev/tty always represents the current terminal.
It's also possible to bypass the pipe if you “save” the original location and pass it down via a file descriptor. But you can't do that from the ps_wrapper function.
There are many ways to avoid creating a temporary file. I'll mention a few. Unless otherwise indicated, the solutions in this answer work in bash if you add double quotes around variable and command substitutions.
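A sketch of the file-descriptor approach: duplicate the original standard output onto another descriptor before the pipeline starts, then write to that descriptor from inside it. The tr stage here just stands in for whatever consumes the pipe; note that the relative order of the two output lines is not guaranteed:

```shell
exec 3>&1                  # fd 3 now points wherever stdout pointed
{ echo piped; echo bypassed >&3; } | tr a-z A-Z
exec 3>&-                  # close the saved descriptor
# "PIPED" went through the pipe; "bypassed" skipped it entirely.
```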
If you're willing to change the way you call the function, you can call head and grep successively on the right-hand side of the pipe. head will read and print the first line, and leave the rest for its successor to consume.
You can use a process substitution with either tee or zsh's built-in tee-like feature (multios) to duplicate the output, sending one stream to head -n 1 and another to the command of your choice. However, if you just pipe each stream to a command, there'll be a race between the two, and if head isn't quick enough, the first line may not end up at the top. It'll probably often work because head is pretty quick, but there's no guarantee, for example if grep is in the disk cache but head isn't.
You can use awk to display the first line, then pass the rest through.
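A sketch of such an awk filter (ssh is an arbitrary example pattern):

```shell
# Print the header (line 1) unconditionally, then only the lines
# matching the pattern.
ps aux | awk 'NR == 1 {print; next}
              /ssh/ {print}'
```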
Explanations: NR is the line number. print with no argument prints the input line.
Another approach would be to build the filter functionality into a single command. Pick a string that won't appear in the arguments to ps to use as a separator, for example |.
Instead of using grep, you can use ps's matching facilities such as -C to match a process by its command name.
Instead of using grep, you can use pgrep's matching facilities.
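For illustration, matching a background sleep process by name (the -C option is specific to procps ps on Linux; pgrep is more widely available):

```shell
sleep 30 &                      # throwaway process to look for
pid=$!

ps -C sleep -o pid,user,args    # ps's own matching (procps/Linux):
                                # select by exact command name, so there
                                # is no grep process to match itself
pgrep -l sleep                  # pgrep: same idea, prints PID and name

kill "$pid"                     # clean up
```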