Copy files to AWS S3 bucket using Ansible
My plan is to copy a file from an EC2 instance to an S3 bucket using Ansible. I've written the playbook below, but I'm getting an error.
copy2s3.yml
---
- name: Copy to s3
  s3:
    aws_access_key: "{{ lookup('env','aws_key') }}"
    aws_secret_key: "{{ lookup('env','aws_secret') }}"
    bucket: "{{ aws_packages_bucket }}"
    object: "/JI79IML/my_part_X86_64_c7.15.tar.gz"
    dest: "/data/parts/JI79IML/my_part_X86_64_c7.15.tar.gz"
    mode: get
    overwrite: no
Running the playbook produces this error:
$ ansible-playbook copy2s3.yml -i 172.18.2.12,
ERROR! 's3' is not a valid attribute for a Play
The error appears to have been in '/home/ubuntu/bk/copy2s3.yml': line 2, column 3, but may
be elsewhere in the file depending on the exact syntax problem.
The offending line appears to be:
---
- name: Copy to s3
^ here
Tags: aws, ansible
asked Apr 3 '17 at 13:27 by Nullpointer
4 Answers
The module name (s3) should be at the same indentation level as name:
- name: Copy to s3
  s3:
    aws_access_key: "{{ lookup('env','aws_key') }}"
    aws_secret_key: "{{ lookup('env','aws_secret') }}"
    bucket: "{{ aws_packages_bucket }}"
    object: "/JI79IML/my_part_X86_64_c7.15.tar.gz"
    dest: "/data/parts/JI79IML/my_part_X86_64_c7.15.tar.gz"
    mode: get
    overwrite: no
answered Apr 3 '17 at 14:20 by Marko Živanović
I use the same layout but I'm still getting the same error, even though it reports the YAML syntax as OK. Do I need to install an S3 plugin for Ansible? – Nullpointer, Apr 4 '17 at 4:33
If the syntax is OK, then what error are you getting now? The S3 module should be available by default. Also, try updating your Ansible and boto libraries: pip install -U ansible boto. – Marko Živanović, Apr 4 '17 at 9:20
I resolved the issue with the above, but I had to add hosts, tasks, connection and gather_facts at the top of the script; then it works. – Nullpointer, Apr 4 '17 at 9:25
Yes, the above is just a single task; if you want to run it as a playbook, you'll need the parameters you described. – Marko Živanović, Apr 4 '17 at 9:29
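Putting that comment together with the original task, a complete playbook would look roughly like the sketch below. This is an illustration rather than the poster's actual file: the hosts pattern and the gather_facts setting are assumptions, chosen to match the -i 172.18.2.12, invocation from the question.
---
# Sketch: wrap the task in a play so Ansible no longer treats `s3` as a play attribute
- hosts: all              # assumption: the EC2 host supplied via -i on the command line
  gather_facts: no        # the commenter mentions setting gather_facts explicitly
  tasks:
    - name: Copy to s3
      s3:
        aws_access_key: "{{ lookup('env','aws_key') }}"
        aws_secret_key: "{{ lookup('env','aws_secret') }}"
        bucket: "{{ aws_packages_bucket }}"
        object: "/JI79IML/my_part_X86_64_c7.15.tar.gz"
        dest: "/data/parts/JI79IML/my_part_X86_64_c7.15.tar.gz"
        mode: get
        overwrite: no
With the task nested under tasks:, the top-level list item is a play (which does accept hosts, tasks, and so on), so Ansible no longer complains that s3 is not a valid attribute for a Play.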
To copy an object from the local server to S3 with the Ansible module, use mode: put; mode: get is used to download the object.
Reference
answered Aug 11 '17 at 12:40 by Naveen Kumar, edited Aug 11 '17 at 13:00 by Stephen Rauch
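As a rough, non-authoritative illustration of that upload direction (EC2 to S3), the task could look like this; note that with mode: put the s3 module reads the local file from src, whereas dest only applies to mode: get. The bucket variable and paths are taken from the question.
- name: Upload the archive to the bucket
  s3:
    aws_access_key: "{{ lookup('env','aws_key') }}"
    aws_secret_key: "{{ lookup('env','aws_secret') }}"
    bucket: "{{ aws_packages_bucket }}"
    object: "/JI79IML/my_part_X86_64_c7.15.tar.gz"
    src: "/data/parts/JI79IML/my_part_X86_64_c7.15.tar.gz"   # local file to upload
    mode: put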
I had a similar issue when using aws_s3, the replacement module for s3.
Check that you have boto (for s3 and aws_s3) and boto3 (for aws_s3) correctly installed.
I had both boto and boto3 installed but, because I had been experimenting with virtual environments, they were only installed for Python 3.5 and not for any other Python version. The Python that Ansible uses (Python 2.7 on my setup) therefore could not import them and failed with this rather esoteric error message.
To confirm that everything is correctly installed, run python on the command line and try to import both libraries manually:
13:20 $ python
Python 2.7.12 (default, Nov 19 2016, 06:48:10)
[GCC 5.4.0 20160609] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import boto
>>> import boto3
>>>
13:21 $ python3
Python 3.5.2 (default, Sep 14 2017, 22:51:06)
[GCC 5.4.0 20160609] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import boto
>>> import boto3
>>>
If the import fails in Python, you will get the same failure in Ansible.
answered Oct 28 '17 at 12:47 by Norm1710
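If the libraries turn out to be installed for only one interpreter, one possible workaround (an assumption about your setup, not something from this thread) is to point Ansible at that interpreter explicitly via ansible_python_interpreter. A minimal sketch; the /usr/bin/python3 path is a placeholder for whichever Python actually has boto and boto3:
---
- hosts: localhost
  connection: local
  gather_facts: no
  vars:
    # placeholder path: set this to the interpreter that has boto/boto3 installed
    ansible_python_interpreter: /usr/bin/python3
  tasks:
    - name: Check that boto and boto3 import cleanly under that interpreter
      command: "{{ ansible_python_interpreter }} -c 'import boto, boto3'"
      changed_when: false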
Add hosts: and tasks: above - name, like this:
---
- hosts: localhost
  tasks:
    - name: Copy to s3
      s3:
        aws_access_key: "{{ lookup('env','aws_key') }}"
        aws_secret_key: "{{ lookup('env','aws_secret') }}"
        bucket: "{{ aws_packages_bucket }}"
        object: "/JI79IML/my_part_X86_64_c7.15.tar.gz"
        src: "/data/parts/JI79IML/my_part_X86_64_c7.15.tar.gz"
        mode: put
        overwrite: no
answered Mar 15 at 9:42 by Anubhav, edited Mar 15 at 10:02 by Stephen Kitt
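On Ansible 2.4 and later the module was renamed aws_s3 (the replacement module mentioned in the boto answer above) and requires boto3; under that assumption, and using src for the upload as in the playbook above, the task would read:
- name: Copy to s3
  aws_s3:                 # renamed module, Ansible >= 2.4; needs boto3 installed
    aws_access_key: "{{ lookup('env','aws_key') }}"
    aws_secret_key: "{{ lookup('env','aws_secret') }}"
    bucket: "{{ aws_packages_bucket }}"
    object: "/JI79IML/my_part_X86_64_c7.15.tar.gz"
    src: "/data/parts/JI79IML/my_part_X86_64_c7.15.tar.gz"
    mode: put
    overwrite: no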