Using grep after using find to get the files
So I'm relatively new to the command line. I was able to use find to get a list of files from multiple directories, since there was no specific place these would be (I'm sure this can be shortened):

find ./ -name filename1.ext && find ./ -name filename2.ext && find ./ -name filename3.ext

That gave me the list I was wanting, but now that I've found the files in question, I want to grep them for information.

Tags: linux, shell-script, grep, find
asked Jun 27 at 17:08 by trazinaz (edited Jun 27 at 17:09 by Jeff Schaller)
5 Answers
You can group all the -name primaries in a single find statement, then have find execute grep:

find . \( -name filename1.ext -o \
         -name filename2.ext -o \
         -name filename3.ext \) \
       -exec grep 'pattern' {} \;
Thank you so much for your help, that makes a ton more sense to do it that way! – trazinaz, Jun 27 at 17:29
You might want to consider -exec grep 'pattern' {} +, which bunches the filenames found and calls grep with multiple filenames, improving performance. That will affect the output (by default the matched lines are prefixed with the filename) -- if you don't want that you can add the -h option to grep. – glenn jackman, Jun 27 at 17:32
Yes, that is way better for sorting the files the way I wanted the information. – trazinaz, Jun 27 at 17:56
@glennjackman "improving performance" Not necessarily. It would if there are many, many files that are quickly matched, so that calling grep for each one is somewhat significant. However, it might be more likely that ./ is a big tree and that there'll be long pauses between finding the different files. In that case, using ; instead of + would improve performance, as you'd get partial results as each file is found. – JoL, Jun 27 at 18:35
Also look at the options to grep: -i ignores the case of letters, -l just outputs the filename (of files with a match), -H prefixes matching lines with the filename (to see which line(s) belong to which file). Also look at fgrep and egrep. – Baard Kopperud, Jun 27 at 18:49
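Putting this answer together with the suggestions in the comments, one combined invocation could look like the sketch below (the pattern and filenames are placeholders; -H keeps the filename prefix even when a batch happens to contain a single file):

find . \( -name filename1.ext -o -name filename2.ext -o -name filename3.ext \) -exec grep -H 'pattern' {} +

With + instead of \;, find hands the collected filenames to as few grep invocations as possible rather than running grep once per file.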
Try this:

find ./ -type f \( -name filename1.ext -o -name filename2.ext -o -name filename3.ext \) -exec grep 'string' {} \;

-type f: since you're looking for files, it's better to specify the type to get the result faster.
-o: means OR; it enables you to add more filenames to the search.
I appreciate it; using this helped me make my combined command! – trazinaz, Jun 27 at 17:30
Besides the correct answer, which is indeed to use -o and -exec, here's a general way to capture the output of a previous command and parse it line by line:

(find .... && find ... && cat ... && ls ... && ...) | while read line; do grep string "$line"; done
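If any of the found filenames might contain spaces or other special characters, a more defensive variant of the same loop idea is sketched below (an assumption on my part: bash, and a find that supports -print0; the filenames are placeholders):

find . -type f \( -name filename1.ext -o -name filename2.ext \) -print0 |
while IFS= read -r -d '' file; do
    # read -r -d '' consumes NUL-delimited names and "$file" is quoted,
    # so paths containing spaces or glob characters survive intact
    grep 'string' "$file"
done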
Using bash:
shopt -s globstar
grep 'pattern' ./**/filename[123].ext
With the globstar shell option enabled, the ** pattern behaves like *, but matches across / in pathnames. This will work unless the pattern matches thousands of files, in which case you are likely to get an "Argument list too long" error from the shell. This also does not check whether the matched pathnames are regular files or not, like a find would do with its -type f test. Also, if the pattern does not match anything, it will remain unexpanded.
In a loop, which solves all three of the issues mentioned above:
shopt -s globstar
for pathname in ./**/filename[123].ext; do
[ -f "$pathname" ] && grep 'pattern' /dev/null "$pathname"
done
All the limitations you mention (and the fact that it skips hidden files) are easily addressed with zsh, where that **/ syntax comes from. – Stéphane Chazelas, Jun 27 at 21:08
Also note that it would fail for files whose path starts with -, and the loop approach won't print file names (for the non-loop one, that will depend on whether one or more files are found). – Stéphane Chazelas, Jun 27 at 21:10
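For reference, the zsh approach those comments allude to might look like the sketch below (an assumption on my part: zsh with its glob qualifiers, where N avoids an error when nothing matches, D includes hidden files, and . keeps only regular files):

# zsh sketch, not part of the original answer
grep -H 'pattern' ./**/filename[123].ext(ND.)

Note that with N and no matching files, grep would be invoked with no file operands and wait on standard input, so in a script you might prefer to drop N and handle the "no matches found" error instead.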
In general there are at least three ways to do a find + grep combination:

1. grep pattern `find dir find-specifiers -print`
2. find dir find-specifiers -exec grep pattern {} \;
3. find dir find-specifiers -print | xargs grep pattern

And of course there's nothing special about grep here; these same three patterns could be used for find plus any command.
Number 1 is, in a sense, the oldest and most basic way, since backquotes have always been the way to capture the output of one command and use it on the command line of another. (These days, I get the impression that there's a newer bashism that's better than backquotes and that all the cool kids use, but I guess I'm an old-timer.) The disadvantage of number 1 is that if find finds lots of files, you may get the error "Command line too long".
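The newer alternative alluded to is presumably the $( ) form of command substitution, which behaves like backquotes but nests cleanly. A sketch of number 1 rewritten that way, keeping the author's dir and find-specifiers placeholders and the same caveats about long command lines and whitespace in filenames:

# same idea as number 1, using $( ) instead of backquotes
grep pattern $(find dir find-specifiers -print)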
Number 2 is a special feature built into find for doing a find + command combination. It's fine as far as it goes, but it has two disadvantages: (1) it re-invokes the auxiliary command (grep or whatever) for each file found, so it can be slow, and (2) if the auxiliary command is grep, since each invocation of grep sees one filename, it won't list the filenames in the match, although you can work around that by doing -exec grep pattern /dev/null {} \; or, these days, -exec grep -H pattern {} \;.
And then there's number 3. As far as I know, xargs was invented to get around the limitations of the first two. Although xargs is theoretically a general-purpose program, I suspect in practice it's hardly ever used with any pair of programs other than find and grep. It thoroughly works around the disadvantage of #1; it'll work with arbitrary numbers of found files. It's efficient, although if you're unlucky it'll occasionally invoke grep on just one last filename, meaning that you'll still want to use the /dev/null or -H trick. And it's got a disadvantage of its own: it doesn't work if any of the found filenames contain whitespace. But there's a way around that, too:
find dir find-specifiers -print0 | xargs -0 grep pattern
(I dearly wish xargs had been written to accept newline-separated filenames on its input by default instead of whitespace, but that's a rant for another day.)
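As a GNU-specific aside, not part of the portable forms above: GNU xargs can at least be told to split on newlines only, which behaves like the wished-for default as long as no filename itself contains a newline:

# GNU xargs only; filenames containing newlines still need -print0 / -0
find dir find-specifiers -print | xargs -d '\n' grep pattern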