Limiting recursive 'aws s3 ls' searches by number of items in the folder
Here's the scenario:
I need to recursively search through tons and tons of folders and subfolders and write the results to a log file using the ls command, BUT I need to stop searching a folder once it has more than ~10 objects. The reason: once I have a sample of 10 items in a folder, I know what's in the folder, and since some folders contain tens of thousands of objects, this will save a lot of time.
Why am I limited to ls? Because I am searching S3, using the command aws s3 ls. So far aws s3 ls --summarize --recursive does what I need; I just need a way to limit the search based on the number of items in a folder.
I have tried using aws s3api list-buckets and list-objects and so forth, but even with the --max-items option it doesn't do what I need. Thanks for your help.
ls recursive aws amazon-s3
It's not really ls, is it. – roaima, Aug 3 at 17:41
edited Aug 3 at 18:20 by L. Scott Johnson
asked Aug 3 at 17:30 by Sidereal
1 Answer
roaima is correct: "aws s3 ls" is named that way so people know the kind of work it does (listing files). It is not really the ls command; it only mimics ls at a very surface level.
I believe the best way to do what you want is to write a script using the boto3 API: s3.client.list_objects() has a MaxKeys argument that you can use to limit retrievals within a path.
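A minimal sketch of that approach, using list_objects_v2 (the current form of the call mentioned above). It takes at most `limit` keys per "folder" (prefix) via MaxKeys, then recurses into each sub-prefix. The bucket name and limit in the usage comment are placeholders, not from the original post, and running it for real requires boto3 and configured AWS credentials:

```python
def sample_prefix(s3, bucket, prefix="", limit=10):
    """Yield up to `limit` keys directly under `prefix`, then recurse
    into each sub-folder (common prefix)."""
    # MaxKeys caps this single response, so we never fetch more than
    # `limit` objects for the sample. (Keys and common prefixes share
    # the cap, so folder-heavy prefixes may yield fewer keys here.)
    resp = s3.list_objects_v2(
        Bucket=bucket, Prefix=prefix, Delimiter="/", MaxKeys=limit
    )
    for obj in resp.get("Contents", []):
        yield obj["Key"]

    # Paginate separately so we discover *all* sub-folders,
    # not just the ones that fit under the MaxKeys cap above.
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix, Delimiter="/"):
        for sub in page.get("CommonPrefixes", []):
            yield from sample_prefix(s3, bucket, sub["Prefix"], limit)

# Usage (requires boto3 and AWS credentials; names are placeholders):
#   import boto3
#   s3 = boto3.client("s3")
#   with open("sample.log", "w") as log:
#       for key in sample_prefix(s3, "my-bucket", limit=10):
#           print(key, file=log)
```

Because the function only calls list_objects_v2 and get_paginator on whatever client it is handed, it can be exercised against a stub without touching AWS.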
answered Aug 3 at 18:04 by Pat Gunn