File from wget in script not available for next command
I am wget-ing a file and, to preserve its original file name, I do a find
for its known extension.
Everything works fine in the terminal, but in a script the file is never there for the find command.
I guess that after wget completes the download, the file is not yet persisted to the file system...
How can I wait for the file to be available?
wget -q -N "$BASEURL/$VERSION"
echo $(find . -maxdepth 1 -name *.jar)
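One likely culprit in the script above is the unquoted glob: the shell expands *.jar before find ever runs, so the pattern find receives depends on what already sits in the current directory. A minimal sketch (with hypothetical file names) showing the quoted form:

```shell
# Work in a throwaway directory with two hypothetical jars.
cd "$(mktemp -d)"
touch a.jar b.jar
# Unquoted, *.jar would expand to "a.jar b.jar" here and find would error out;
# quoting hands the pattern to find itself.
JAR=$(find . -maxdepth 1 -name '*.jar' | sort | head -n 1)
echo "$JAR"   # ./a.jar
```

With the pattern quoted, the result no longer depends on whatever the shell happens to match in the working directory.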
First, I'd use the -O option to specify a file path/name to output to – this way, you skip the find step. If the file really isn't there until wget has finished, then adding a sleep 3 (or however many seconds you want) after the wget and before your next command(s) should do it. Be sure to quote the URL you are wget-ing in case it has ampersands, etc. in it (which would send wget to the background and let the script continue, which means the file wouldn't be fully there).
– ivanivan
Mar 6 at 13:20
sleep works, but it is very ugly. I don't want -O because I want to preserve the original file name, and the files are behind redirects.
– Pali
Mar 6 at 13:22
Can you please paste the script you have tried so far?
– Thushi
Mar 6 at 13:25
@Thushi updated the question.
– Pali
Mar 6 at 13:36
#!/bin/sh
wget -q -N google.com
echo $(find . -maxdepth 1 -name *.html)
This works for me! Can you please paste the error text you are getting?
– Thushi
Mar 7 at 11:56
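If a fixed sleep feels too ugly, the "wait for the file" part of the question can be sketched as a polling loop with a timeout instead. This is a hedged sketch, not the asker's script: BASEURL/VERSION are placeholders from the question, and the download is simulated here with a background touch so the loop can be shown self-contained.

```shell
cd "$(mktemp -d)"
# Stand-in for: wget -q -N "$BASEURL/$VERSION"
# (a background job that produces a hypothetical app.jar after ~1s)
( sleep 1; touch app.jar ) &

# Poll for the file instead of a blind sleep; give up after ~20s.
tries=0
JAR=""
while [ "$tries" -lt 20 ]; do
    JAR=$(find . -maxdepth 1 -name '*.jar' | head -n 1)
    [ -n "$JAR" ] && break
    tries=$((tries + 1))
    sleep 1
done
echo "found: $JAR"
```

Note that wget itself is synchronous, so in the normal case the file is already on disk when it returns; the loop only helps in the failure mode ivanivan describes, where an unquoted URL containing & sends wget to the background.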