This looks like a job for GNU parallel:
parallel bash ::: script_*
The advantage is that you don't have to group your scripts by cores, parallel
will do that for you.
Of course, if you don't want to babysit the SSH session while the scripts are running, you should use nohup or screen.
As you wish
Nearly right. The correct syntax is:
(command11; command12; command13) &
(command21; command22; command23) &
(command31; command32; command33) &
(command41; command42; command43) &
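A minimal runnable sketch of this pattern (the echo commands stand in for your real commands; wait is the bash builtin that blocks until all background jobs have finished):

```shell
# Each parenthesised group runs its commands sequentially,
# but the groups themselves run in parallel as background jobs.
(echo a1; echo a2) &
(echo b1; echo b2) &
wait              # block until both background groups have finished
echo "all done"
```

The output of the two groups may interleave, but "all done" is always printed last, after both groups have exited.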
As you wish, but better
Or, if you want the remaining commands in a group to be skipped once one of them fails, the syntax is:
command11 && command12 && command13 &
command21 && command22 && command23 &
command31 && command32 && command33 &
command41 && command42 && command43 &
Note that "&&" has a very different meaning from "&". "&&" means that the command runs only after the previous one has finished, and only if it succeeded (i.e. its exit code was zero). "&" means that the command before it runs in parallel with the main execution line, and its failure or success doesn't matter.
Using the xargs tool
However, these solutions have the disadvantage that all of the groups start at once: you cannot limit how many run simultaneously. To do that, we need a little more trickery. In your specific case, the required command would be:
for i in 1 2 3 4 5 6 7 8 9 10
do
echo "command${i}1 && command${i}2 && command${i}3"
done | xargs -P 10 -I CMD bash -c CMD
It uses the very useful xargs tool. You can pipe anything into it; for example, if you want to process thousands of items in 10 parallel processes, it will handle that too.
It works by invoking bash -c for every line of input, in parallel, but so that at most 10 child processes run at the same time (that is what -P 10 does). The xargs command exits only after all of the bash processes have finished.
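As a smaller, self-contained illustration of the same mechanism (the echo one-liners stand in for your real commands): xargs reads one line at a time, substitutes it for CMD, and keeps at most two bash -c children running at once.

```shell
# Run three one-line shell commands, at most two of them in flight at a time.
printf '%s\n' 'echo one' 'echo two' 'echo three' |
    xargs -P 2 -I CMD bash -c CMD
```

All three lines are printed, though not necessarily in order, since the jobs run concurrently.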
Parallel tool (see other answers)
GNU has also written a tool named parallel. As far as I know, its syntax is a little clearer than that of xargs, and it has more features, although it is not installed by default on as many systems. Other answers explain it.
NodeJs "parallel" module
If you are working in the Node.js ecosystem, there is also a commonly used tool (npm install parallel), used mainly for parallelizing automated build tasks, but it can easily be used in shell scripts as well. It is not a very good idea in most environments, because npm packages don't integrate well with the Unix package-management ecosystem. Although its syntax is far simpler, its features fall far behind all of the other solutions.
In the case of your specific problem, I would likely choose the second solution in a simple case, and the xargs-based solution in a more complex one.
Best Answer
You can use the GNU version of parallel. For example, to call echo with the arguments 1 through 10, dispatching each echo command to one of server.example.com or server2.example.net:
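A sketch of such a command, using GNU parallel's --sshlogin (-S) option (the server names are the ones mentioned above; this assumes GNU parallel is installed on both hosts and that passwordless SSH to them is set up):

```shell
# Distribute the ten echo jobs across the two servers over SSH;
# parallel reads the arguments 1..10 from stdin, one job per line.
seq 10 | parallel --sshlogin server.example.com,server2.example.net echo
```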