How to pipe the stdout of a command, depending on the result of the exit code
Expanding from this question, we have a use case where we want to pipe the stdout of a command depending on whether that command succeeded or failed.

We start with a basic pipe:

command | grep -P "foo"

However, we notice that sometimes command does not output anything to stdout but still exits with code 0. We want to ignore this case and apply the grep only when the exit code is 1.
For a working example, we could implement it as a script like this:

OUTPUT=$(command)   # exit code is 0 or 1
RESULT=$?
if [ "$RESULT" -eq 0 ]; then
    exit "$RESULT"  # exit code 0: simply pass it forward
else
    grep -P "foo" <<< "$OUTPUT"  # otherwise check whether the stdout contains "foo"
fi
but this has a number of disadvantages, chiefly that it has to be written as a script, meaning you can't just type it in the console. It also seems somewhat amateurish.

For a more concise syntax, I'm imagining a fictional ternary operator that pipes if the exit code is 1 and otherwise passes the exit code forward:
command |?1 grep -P "foo" : $?
Is there a series of operators and utils that will achieve this result?
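For reference, the if/else script above can be collapsed into a single line that is typeable at the prompt; a minimal sketch, where mycmd is a hypothetical stand-in for the real command (here simulating the failure case: some stdout plus a non-zero exit status):

```shell
#!/usr/bin/env bash
# One-liner version of the script above, typeable at the prompt.
# `mycmd` is a hypothetical stand-in that fails after producing output.
mycmd() { echo "foo bar"; return 1; }

# The assignment's exit status is mycmd's, so grep only runs on failure.
out=$(mycmd) || grep -P "foo" <<< "$out"
# prints: foo bar
```

Note that, unlike the fictional |?1 operator, this runs the grep on any non-zero exit code, not just 1; on success nothing is printed and the line's status is 0.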
Tags: bash, control-flow
asked 13 mins ago by I'll Eat My Hat
1 Answer
Commands in a pipeline run concurrently; that's the whole point of pipes, an inter-process communication mechanism. In:

cmd1 | cmd2

cmd1 and cmd2 are started at the same time, and cmd2 processes the data that cmd1 writes as it comes.
If you wanted cmd2 to be started only if cmd1 had failed, you'd have to start cmd2 after cmd1 has finished and reported its exit status. So you couldn't use a pipe; you'd have to use a temporary file that holds all the data cmd1 has produced:

cmd1 > file || cmd2 < file; rm -f file

Or store it in memory as in your example, but that has a number of other issues: $(...) removes all trailing newline characters, most shells can't cope with NUL bytes in there, and it doesn't scale to large outputs.
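As a runnable sketch of the temp-file approach, using mktemp to create the file safely; cmd1 here is a hypothetical stand-in that fails after producing output, and grep plays the role of cmd2:

```shell
#!/usr/bin/env bash
# Temp-file variant: cmd1's complete output lands in a file first;
# cmd2 (here: grep) runs on that file only if cmd1 exited non-zero.
cmd1() { echo "foo bar"; return 1; }   # hypothetical failing command

tmp=$(mktemp) || exit 1
cmd1 > "$tmp" || grep -P "foo" < "$tmp"
status=$?
rm -f -- "$tmp"
exit "$status"
```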
On Linux, and with shells like zsh or bash that store here-documents and here-strings in temporary files, you could do:

cmd1 > /dev/fd/3 3<<< ignored

to let the shell deal with the temp-file creation and clean-up.
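Putting the pieces together, a sketch assuming Linux /dev/fd semantics (opening /dev/fd/3 re-opens whatever the shell placed on fd 3), with cmd1 as a hypothetical stand-in that fails after producing output:

```shell
#!/usr/bin/env bash
# The here-string makes the shell create and clean up a temp file on fd 3.
# cmd1 truncates and writes it via /dev/fd/3; if cmd1 fails, grep re-opens
# /dev/fd/3 and reads the captured output from the start.
cmd1() { echo "foo bar"; return 1; }   # hypothetical failing command

{ cmd1 > /dev/fd/3 || grep -P "foo" /dev/fd/3; } 3<<< ignored
```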
answered 8 mins ago, edited 2 mins ago, by Stéphane Chazelas