Can I wget or download files that are not linked to from a server?
I have a URL such as myurl.com/contents and I am trying to fetch all the contents of that folder. Some files, such as myurl.com/contents/unlisted-file.1, are not linked to from anywhere. These files are otherwise accessible; if I wget them explicitly or type the address into the browser, I can see them.

Now I am running wget myurl.com/contents -r -np to fetch all the contents of the folder, but it is not downloading the unlisted files.
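For reference, the invocation spelled out (a sketch; myurl.com is the question's placeholder host):

    # Recursive download (-r), without ascending to the parent
    # directory (-np, i.e. --no-parent).
    wget -r -np http://myurl.com/contents/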
If my understanding is correct, this is by design: wget only downloads files that are linked to.

Is there any way to download all files, linked and unlinked, in a server folder, whether through wget or some other tool?

Tags: wget, curl

asked Dec 22 '17 at 7:45 by alexcs
Only if opening myurl.com/contents lists all the files in that folder can we find a way to download them. – Arpit Agarwal, Dec 22 '17 at 9:22
You cannot access files that are not linked to and whose complete path you don't know. This is intentional in the design of HTTP: if the server doesn't generate a directory listing (it has to be configured to do so), the contents are effectively 'hidden'. – ridgy, Dec 22 '17 at 17:13
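To check whether the server generates such a listing, one could fetch the index and look at the links it contains (a sketch; the URL is the question's placeholder):

    # Fetch the index page and extract the href targets.
    # An auto-generated listing (e.g. Apache's) links every file;
    # truly unlisted files will not appear here.
    curl -s http://myurl.com/contents/ | grep -o 'href="[^"]*"'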
HTTP doesn't have any concept of directories, and doesn't provide a way to list all the files in a directory. So there's no way for wget to get the names of other files if they're not linked from somewhere. – Barmar, Dec 22 '17 at 19:32
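That leaves name-guessing as the only client-side workaround. Since the example file is unlisted-file.1, a numbered series could be probed directly (a sketch; the naming pattern and range are assumptions, not anything the server advertises):

    # Request unlisted-file.1, unlisted-file.2, ... until one is
    # missing; wget exits non-zero on a 404, which ends the loop.
    for i in $(seq 1 100); do
        wget -q "http://myurl.com/contents/unlisted-file.$i" || break
    done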
You could use something like scp or rsync, assuming you have SSH access to the server. – Barmar, Dec 22 '17 at 19:33
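With SSH access, that might look like the following (a sketch; the username and the server-side path /var/www/contents are hypothetical):

    # Mirror the whole directory over SSH, unlisted files included.
    # -a recurses and preserves attributes; -v reports progress.
    rsync -av user@myurl.com:/var/www/contents/ ./contents/

    # Or a one-shot recursive copy with scp.
    scp -r user@myurl.com:/var/www/contents ./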