Run a script in multiple folders in parallel
I have several sub-directories within one high-level directory. Each sub-directory contains several files and the same for-loop shell script. I want to go into each sub-directory and run that script, all of them in parallel, in several terminals.
I tried this, but it seems to run them serially (one after another), whereas I want them all to run in parallel:
find dir_* -type f -execdir sh for_loop.sh \;
bash shell-script files parallelism
asked May 29 '17 at 7:07 by user233520
edited May 29 '17 at 23:03 by Gilles
5 Answers
Answer by dr01 (score 4), answered May 29 '17 at 8:47
Probably the perfect tool for this is GNU Parallel:
parallel ::: dir_*/for_loop.sh
GNU Parallel not only runs the jobs in parallel, it also groups their output so they won't interfere with each other.
From its man page:
GNU parallel is a shell tool for executing jobs in parallel using one or more computers. A job can be a single command or a small script that has to be run for each of the lines in the input. The typical input is a list of files, a list of hosts, a list of users, a list of URLs, or a list of tables. A job can also be a command that reads from a pipe. GNU parallel can then split the input into blocks and pipe a block into each command in parallel.
If you use xargs and tee today you will find GNU parallel very easy to use as GNU parallel is written to have the same options as xargs. If you write loops in shell, you will find GNU parallel may be able to replace most of the loops and make them run faster by running several jobs in parallel.
GNU parallel makes sure output from the commands is the same output as you would get had you run the commands sequentially. This makes it possible to use output from GNU parallel as input for other programs.
Answer by ChristophS (score 3), answered May 29 '17 at 7:25, edited May 29 '17 at 9:57
find won't do that for you.
Create a script that locates your for_loop.sh scripts and executes them, like so:
#!/bin/bash
for theScript in $(find dir_* -name for_loop.sh); do
"$theScript" &
done
If the script has to be run inside its sub-directory, cd into it first, e.g. cd "$(dirname "$theScript")" && . "$(basename "$theScript")".
My examples are not tested in detail and are not error-tolerant.
Edit 1:
As Satō Katsura correctly commented, the script above breaks if there are spaces in the directory names, so I changed the loop to read:
#!/bin/bash
find dir_* -name for_loop.sh | while IFS= read -r theScript; do
"$theScript" &
done
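The two pieces above (find the scripts, then cd before running them) can be combined into one sketch. This is untested and assumes, as in the rest of this answer, that each for_loop.sh sits directly inside a dir_* sub-directory:

```shell
#!/bin/sh
# Launch every dir_*/for_loop.sh in parallel, each from its own directory.
for d in dir_*/; do
    [ -f "${d}for_loop.sh" ] || continue   # skip directories without the script
    ( cd "$d" && sh for_loop.sh ) &        # subshell: the cd does not leak out
done
wait                                       # block until all background jobs finish
```

Because the glob is quoted when used, directory names with spaces are handled, and the final wait makes the wrapper return only after every script has finished.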
This breaks if you have spaces in the names of directories. – Satō Katsura, May 29 '17 at 8:01
It would still break on spaces; you need IFS= in the while read. And filenames with newlines are still not handled properly. – user218374, May 29 '17 at 9:56
You are right, but note that the answer addresses the concrete question of executing the script for_loop.sh inside each directory in parallel, not of handling each file. The file handling inside the directories is up to the corresponding script. But thanks for editing anyway. – ChristophS, May 29 '17 at 10:12
I voted this answer up because it does not require anything extra and is more complete than mine. – M4rty, May 29 '17 at 20:22
Answer by user218374 (score 2), answered May 29 '17 at 8:07
You should pass find's output to xargs, running in parallel mode:
find dir_*/ -type f -name for_loop.sh -print0 | xargs -0 -r -n 1 -P 3 -t sh
Here we ask find to locate all files named for_loop.sh recursively under the directories whose names begin with dir_, and to hand them to xargs one file at a time (-n 1), in parallel mode, running no more than 3 processes at any given time (-P 3).
The null delimiter is used both by find when printing the filenames (-print0) and by xargs when splitting them (-0), so unusual filenames are handled safely.
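If each for_loop.sh additionally needs to run from inside its own directory (the -execdir behaviour of the question's command, which the plain xargs call above does not reproduce), a small sh -c wrapper can be inserted. This is an untested sketch built on the same find | xargs pipeline:

```shell
# As above, but cd into each script's directory before running it.
# xargs hands one path per invocation to the wrapper (as $1); the trailing
# "sh" only fills $0. At most 3 scripts run at any given time (-P 3).
find dir_*/ -type f -name for_loop.sh -print0 |
    xargs -0 -r -n 1 -P 3 sh -c 'cd "${1%/*}" && sh "${1##*/}"' sh
```

`${1%/*}` strips the script name to get the directory and `${1##*/}` strips the directory to get the script name, so no dirname/basename processes are needed.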
Answer by Ole Tange (score 2), answered May 29 '17 at 18:21, edited Oct 1 at 13:28
Assuming this does the right thing, only serially:
find dir_* -type f -execdir sh for_loop.sh \;
Then you should be able to replace that with:
find dir_* -type f | parallel 'cd {//} && sh for_loop.sh'
To run it in multiple terminals, GNU Parallel supports tmux and can run each command in its own tmux pane:
find dir_* -type f | parallel --tmuxpane 'cd {//} && sh for_loop.sh'
It defaults to one job per CPU core. In your case you might want to run one more job than you have cores:
find dir_* -type f | parallel -j+1 --tmuxpane 'cd {//} && sh for_loop.sh'
GNU Parallel is a general parallelizer and makes it easy to run jobs in parallel on the same machine or on multiple machines you have ssh access to.
If you have 32 different jobs you want to run on 4 CPUs, a straightforward way to parallelize is to run 8 jobs on each CPU.
GNU Parallel instead spawns a new process whenever one finishes, keeping the CPUs active and thus saving time.
Installation
For security reasons you should install GNU Parallel with your package manager, but if GNU Parallel is not packaged for your distribution, you can do a personal installation, which does not require root access. It can be done in 10 seconds by doing this:
(wget -O - pi.dk/3 || curl pi.dk/3/ || fetch -o - http://pi.dk/3) | bash
For other installation options see http://git.savannah.gnu.org/cgit/parallel.git/tree/README
Learn more
See more examples: http://www.gnu.org/software/parallel/man.html
Watch the intro videos: https://www.youtube.com/playlist?list=PL284C9FF2488BC6D1
Walk through the tutorial: http://www.gnu.org/software/parallel/parallel_tutorial.html
Sign up for the email list to get support: https://lists.gnu.org/mailman/listinfo/parallel
Off-topic: how did you produce the graphics? – Nikos Alexandris, Aug 17 '17 at 18:23
LibreOffice Draw. – Ole Tange, Aug 18 '17 at 5:53
Answer by M4rty (score 0), answered May 29 '17 at 7:26
You can do this from your top-level directory:
for D in `find . -maxdepth 1 -type d`
do
$D/<yourScriptName>.sh &
done
The "&" runs them in the background.
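Beyond just backgrounding with &, you usually also want to know when the jobs are done and whether any of them failed. A POSIX-sh sketch of the same loop that records each PID, waits on all of them, and reports an overall status (the script name for_loop.sh is taken from the question; adapt as needed):

```shell
#!/bin/sh
# Start every */for_loop.sh in the background, remembering each PID,
# then wait on them all and fail if any single job failed.
pids=""
for D in */; do
    [ -f "${D}for_loop.sh" ] || continue
    sh "${D}for_loop.sh" &
    pids="$pids $!"          # $! is the PID of the job just started
done
fail=0
for pid in $pids; do
    wait "$pid" || fail=1    # wait returns that job's exit status
done
[ "$fail" -eq 0 ]            # overall result: true only if every job succeeded
```

Waiting on explicit PIDs (rather than a bare wait) is what lets the wrapper propagate a failure exit status from any of the parallel jobs.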
Like the similar answer by ChristophS, this breaks if the output from find contains whitespace or shell wildcard characters which end up not matching themselves and only themselves. – tripleee, May 29 '17 at 13:06