Copy files to aws s3 bucket using Ansible

My plan is to copy a file from an EC2 instance to an S3 bucket using Ansible. I wrote the playbook below, but I'm getting an error:



copy2s3.yml



---
- name: Copy to s3
  s3:
    aws_access_key: "{{ lookup('env','aws_key') }}"
    aws_secret_key: "{{ lookup('env','aws_secret') }}"
    bucket: "{{ aws_packages_bucket }}"
    object: "/JI79IML/my_part_X86_64_c7.15.tar.gz"
    dest: "/data/parts/JI79IML/my_part_X86_64_c7.15.tar.gz"
    mode: get
    overwrite: no


I get the following error:



$ ansible-playbook copy2s3.yml -i 172.18.2.12,

ERROR! 's3' is not a valid attribute for a Play

The error appears to have been in '/home/ubuntu/bk/copy2s3.yml': line 2, column 3, but may
be elsewhere in the file depending on the exact syntax problem.

The offending line appears to be:

---
- name: Copy to s3
^ here









Tags: aws ansible
asked Apr 3 '17 at 13:27 by Nullpointer




















4 Answers
































The module name (s3) should be at the same indentation level as name:

- name: Copy to s3
  s3:
    aws_access_key: "{{ lookup('env','aws_key') }}"
    aws_secret_key: "{{ lookup('env','aws_secret') }}"
    bucket: "{{ aws_packages_bucket }}"
    object: "/JI79IML/my_part_X86_64_c7.15.tar.gz"
    dest: "/data/parts/JI79IML/my_part_X86_64_c7.15.tar.gz"
    mode: get
    overwrite: no

answered Apr 3 '17 at 14:20 by Marko Živanović




























• I used the same indentation but I'm still getting the same error, even though the YAML syntax checks out. Do I need to install an S3 plugin for Ansible?

  – Nullpointer
  Apr 4 '17 at 4:33

• If the syntax is OK, then what other error are you getting? The S3 module should be available by default. Also, try updating your Ansible and boto libraries: pip install -U ansible boto.

  – Marko Živanović
  Apr 4 '17 at 9:20

• I resolved the issue with the above, but I needed to add hosts, tasks, connection and gather_facts at the top of the playbook; then it works.

  – Nullpointer
  Apr 4 '17 at 9:25

• Yes, the above is just a single task; if you want to run it as a playbook, you'll need the parameters you described.

  – Marko Živanović
  Apr 4 '17 at 9:29
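Based on those comments, a full playbook needs the play-level keys (hosts, and optionally connection and gather_facts) plus a tasks: list above the task. A minimal sketch, assuming the task runs on the control node itself (the host and variable names are illustrative, not the asker's exact file):

```yaml
---
- hosts: localhost        # run on the control node
  connection: local       # no SSH hop needed for a local play
  gather_facts: no        # facts are not used by this task, so skip gathering
  tasks:
    - name: Copy to s3
      s3:
        aws_access_key: "{{ lookup('env','aws_key') }}"
        aws_secret_key: "{{ lookup('env','aws_secret') }}"
        bucket: "{{ aws_packages_bucket }}"
        object: "/JI79IML/my_part_X86_64_c7.15.tar.gz"
        dest: "/data/parts/JI79IML/my_part_X86_64_c7.15.tar.gz"
        mode: get
        overwrite: no
```

It can then be run with ansible-playbook copy2s3.yml (no inventory argument is needed for a localhost-only play).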
































To copy an object from the local server to S3 with the Ansible s3 module, use

mode: put

mode: get is used to download the object.

Reference

edited Aug 11 '17 at 13:00 by Stephen Rauch; answered Aug 11 '17 at 12:40 by Naveen Kumar
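The two directions use different file parameters: put takes the local file as src, while get writes the download to dest. A sketch of both, reusing the question's paths (the bucket name is illustrative):

```yaml
# Upload: local file -> S3 object
- name: Upload package to S3
  s3:
    bucket: my-bucket      # illustrative bucket name
    object: "/JI79IML/my_part_X86_64_c7.15.tar.gz"
    src: "/data/parts/JI79IML/my_part_X86_64_c7.15.tar.gz"
    mode: put

# Download: S3 object -> local file
- name: Download package from S3
  s3:
    bucket: my-bucket
    object: "/JI79IML/my_part_X86_64_c7.15.tar.gz"
    dest: "/data/parts/JI79IML/my_part_X86_64_c7.15.tar.gz"
    mode: get
```

This is why the question's task, which combined mode: get with a dest path, was downloading rather than uploading.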




















































I had a similar issue when using aws_s3, the replacement module for s3.

Check that you have boto (for s3 and aws_s3) and boto3 (for aws_s3) correctly installed.

I had both boto and boto3 installed but, after playing with virtual environments, they were only installed for Python 3.5 and no other Python version. So the Python that Ansible uses (Python 2.7 on my setup) could not import them and failed with this rather cryptic error message.

To verify that everything is correctly installed, run python on the command line and try to import both libraries manually:

13:20 $ python
Python 2.7.12 (default, Nov 19 2016, 06:48:10)
[GCC 5.4.0 20160609] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import boto
>>> import boto3
>>>
13:21 $ python3
Python 3.5.2 (default, Sep 14 2017, 22:51:06)
[GCC 5.4.0 20160609] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import boto
>>> import boto3
>>>

If the import fails in Python, it will fail in Ansible too.

answered Oct 28 '17 at 12:47 by Norm1710
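The same check can be scripted instead of typed interactively. A small sketch (the library names are the ones from the answer above) that reports whether the current interpreter can import each one:

```python
import importlib.util


def module_available(name):
    """Return True if this interpreter can import the named top-level module."""
    return importlib.util.find_spec(name) is not None


if __name__ == "__main__":
    # Check the AWS libraries the s3/aws_s3 modules depend on.
    for mod in ("boto", "boto3"):
        status = "found" if module_available(mod) else "MISSING"
        print(f"{mod}: {status}")
```

Run it with the same interpreter Ansible uses (see the ansible_python_interpreter setting) so you are testing the environment that actually matters.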


















































Add hosts: and tasks: above - name, like this:

---
- hosts: localhost
  tasks:
    - name: Copy to s3
      s3:
        aws_access_key: "{{ lookup('env','aws_key') }}"
        aws_secret_key: "{{ lookup('env','aws_secret') }}"
        bucket: "{{ aws_packages_bucket }}"
        object: "/JI79IML/my_part_X86_64_c7.15.tar.gz"
        dest: "/data/parts/JI79IML/my_part_X86_64_c7.15.tar.gz"
        mode: put
        overwrite: no

edited Mar 15 at 10:02 by Stephen Kitt; answered Mar 15 at 9:42 by Anubhav





























