Limiting recursive 'aws s3 ls' searches by number of items in the folder

Here's the scenario:

I need to recursively search through tons and tons of folders and subfolders and write the results to a log file using the ls command, BUT I need to stop searching a folder once it has more than ~10 objects. The reason is that once I have a sample of 10 items from a folder, I know what's in it, and since some folders contain tens of thousands of objects, stopping early will save a lot of time.

Why am I limited to 'ls'? Because I am searching S3, using the command aws s3 ls. So far aws s3 ls --summarize --recursive does what I need; I just need a way to limit the search based on the number of items in a folder.

I have tried using aws s3api list-buckets and list-objects and so forth, but even with the --max-items flag they don't do what I need. Thanks for your help.







  • It's not really ls, is it? – roaima, Aug 3 at 17:41 (score 2)
















edited Aug 3 at 18:20 by L. Scott Johnson (1,182)

asked Aug 3 at 17:30 by Sidereal (1)







1 Answer






roaima is correct: aws s3 ls is named that way so people know the kind of work it does (listing files). It is not really the ls command; it only mimics ls behavior at a very surface level.

I believe the best way to do what you want is to write a script using the boto3 API: s3.client.list_objects() has a MaxKeys argument that you can use to limit retrievals within a path.
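A minimal sketch of that approach (using list_objects_v2, the current variant of that call; the bucket name in the usage comment is a placeholder, and boto3 itself is only needed at the call site, so the helpers can be tried out without AWS credentials). It walks the bucket one "folder" (common prefix) at a time and asks S3 for at most `limit` keys per folder via MaxKeys:

```python
def sample_prefix(client, bucket, prefix, limit=10):
    """Return up to `limit` object keys directly under `prefix`,
    plus the sub-prefixes ("subfolders") found there."""
    # MaxKeys caps how much S3 returns for this folder. Note that with a
    # Delimiter, CommonPrefixes count toward the cap too, so a folder with
    # many subfolders may need a larger cap or paginated prefix discovery.
    resp = client.list_objects_v2(
        Bucket=bucket, Prefix=prefix, Delimiter="/", MaxKeys=limit
    )
    keys = [obj["Key"] for obj in resp.get("Contents", [])]
    subdirs = [cp["Prefix"] for cp in resp.get("CommonPrefixes", [])]
    return keys, subdirs

def walk(client, bucket, prefix="", limit=10):
    """Depth-first walk of the bucket, yielding at most `limit` keys per
    folder instead of listing everything, then recursing into subfolders."""
    keys, subdirs = sample_prefix(client, bucket, prefix, limit)
    yield from keys
    for sub in subdirs:
        yield from walk(client, bucket, sub, limit)

# Usage (requires boto3 and AWS credentials; "my-bucket" is a placeholder):
#   import boto3
#   s3 = boto3.client("s3")
#   with open("s3_sample.log", "w") as log:
#       for key in walk(s3, "my-bucket"):
#           print(key, file=log)
```

Because only one list_objects_v2 page is fetched per folder, a folder with tens of thousands of objects costs a single request instead of dozens of paginated ones.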






answered Aug 3 at 18:04 by Pat Gunn (1,212)