Pump command output as function argument
I have this extremely simple function in my script:
# Used for debug tracing.
log() {
    echo "log: $1"
}
The idea is to be able to customize/turn off logging at a single place. Very crude.
Now I want my script to produce absolutely no output when in release configuration. The only solution I have thought of is extremely unDRY:
TMPFILE='/tmp/tempfilewithpossiblyuniquename'
cmd 1>"$TMPFILE" 2>"$TMPFILE"
cat "$TMPFILE" | xargs log
rm "$TMPFILE"
for every single command. How to improve on this?
EDIT: I want to collect all output to stdout and stderr and channel it through log(). Then log() can choose to disregard everything, to log to a file, to print, etc.
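For what it's worth, the tempfile pattern above can at least be made a little safer. A sketch (cmd here is a stand-in placeholder; mktemp avoids predictable names, and the trap cleans up on any exit; logging line by line avoids xargs word-splitting the output):

```shell
#!/bin/bash
# Single place to customize/turn off logging, as in the question.
log() {
    echo "log: $1"
}

# Placeholder command producing output on both streams.
cmd() { echo hello; echo world >&2; }

TMPFILE=$(mktemp) || exit 1
trap 'rm -f "$TMPFILE"' EXIT     # clean up even on early exit

cmd >"$TMPFILE" 2>&1             # capture both streams into one file

# Feed each captured line through log(), preserving whitespace.
while IFS= read -r line; do
    log "$line"
done < "$TMPFILE"
```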
Tags: linux bash logs function output
asked May 29 at 15:53 by Vorac (edited May 29 at 16:53)
2 Answers
Firstly, that function will only log the first "word" of anything sent to it, since you use $1 rather than "$*".
Secondly, there are (as is oft the case with POSIX) myriad ways to do this sort of thing. I would probably go with something like:
log() {
    cat - >> "$logfile"
}

do_stuff | log
But you could also:
(
do_stuff
do_more_stuff
) >> "$logfile"
As for completely suppressing all output: that is something that is usually best left to the invoking environment (e.g. ./thing >/dev/null 2>&1) rather than locked down "in code", as it were. That said:
squashout="true" # comment this out to stop killing output
if [[ "true" = "${squashout:-false}" ]]; then
# Redirect stdout and stderr to the null device.
exec 1> /dev/null
exec 2> /dev/null
fi
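If you want that suppression to be reversible later in the script, one option is to save copies of the original descriptors before redirecting. A sketch (quiet_run and the flag argument are illustrative, not from the answer):

```shell
#!/bin/bash
# Sketch: reversible output suppression by saving the original fds first.
# quiet_run is a hypothetical wrapper; its first argument plays the role
# of the squashout flag above.
quiet_run() {
    local squash="$1"; shift
    exec 3>&1 4>&2                       # keep copies of stdout/stderr
    if [[ "$squash" = "true" ]]; then
        exec 1>/dev/null 2>/dev/null     # silence everything
    fi
    "$@"                                 # run the wrapped command
    exec 1>&3 2>&4 3>&- 4>&-             # restore and close the copies
}

quiet_run true  echo "suppressed"        # goes to /dev/null
quiet_run false echo "visible"           # prints "visible"
```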
answered May 29 at 16:02 by DopeGhoti (edited May 29 at 16:07)
Absolutely no output is simple: just redirect the script's stdout and stderr to /dev/null:
exec >/dev/null 2>&1
That will affect the shell itself, and any commands it executes afterwards. Run that conditionally to choose where the output is redirected.
if [ "$output_to_file" = 1 ]; then
exec > "$outputfilename" 2>&1
elif [ "$output_suppress" = 1 ]; then
exec > /dev/null 2>&1
fi
Note that suppressing all output probably isn't a good idea. The user is very likely to want some notification of errors.
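Acting on that note, one option is to silence stdout while leaving stderr alone, scoped to a subshell so the redirection doesn't leak into the rest of the script. A minimal sketch:

```shell
#!/bin/bash
# Sketch: suppress routine output but keep errors visible.
# The subshell keeps the exec redirection scoped to this section.
(
    exec >/dev/null              # stdout gone for the rest of the subshell
    echo "routine chatter"       # suppressed
    echo "oops" >&2              # still reaches the user via stderr
)
echo "back to normal output"     # stdout is untouched out here
```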
If you insist on passing output through the function (and are running Bash/ksh/Zsh), you could use process substitution:
#!/bin/bash
mangle_output() {
    # do something smarter here
    while read -r line; do
        echo "output: $line"
    done
}
# redirect stdout and stderr to the function
exec > >(mangle_output) 2>&1
echo something that produces output
Though note that processing the output with a shell loop isn't a very good idea; at the least, it's slow. See: Why is using a shell loop to process text considered bad practice?. If all you want is redirection to a file, or to /dev/null, just use exec to set up the redirections.
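Putting the pieces together, a single log() can be the sink for both streams and decide per line whether to discard, print, or append to a file. A sketch (LOGMODE and logfile are invented knobs, not from the answer):

```shell
#!/bin/bash
# Sketch: one log() function decides the fate of every line piped to it.
LOGMODE="stdout"                 # "off" | "file" | "stdout" (invented knob)
logfile="/tmp/myscript.log"

log() {
    while IFS= read -r line; do
        case "$LOGMODE" in
            off)  : ;;                                    # discard silently
            file) printf 'log: %s\n' "$line" >>"$logfile" ;;
            *)    printf 'log: %s\n' "$line" ;;           # print to stdout
        esac
    done
}

main() {
    echo "doing work"
    echo "warning!" >&2
}

main 2>&1 | log    # prints "log: doing work" then "log: warning!"
```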
answered May 29 at 16:01 by ilkkachu (edited May 29 at 16:43)

Sorry for the ambiguous question. What I wanted to say is that all output should be routed through log(). – Vorac, May 29 at 16:33

@Vorac, mmhm. Edited. – ilkkachu, May 29 at 16:44