Can I wget or download files that are not linked to from a server?

I have a URL such as myurl.com/contents and I am trying to fetch all of the contents of that folder.

Some files, such as myurl.com/contents/unlisted-file.1, are not linked to from anywhere. These files are otherwise accessible; if I wget them explicitly or type the address into a browser, I can see them.

Now I am running wget myurl.com/contents -r -np to fetch all of the contents of the folder, but it does not download the unlisted files.

If my understanding is correct, this is by design: wget only downloads files that are linked to.

Is there any way to download all files - linked and unlinked - in a server folder, whether it's through wget or some other tool?
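For reference, a minimal sketch of the command I am running (with the http:// scheme spelled out; myurl.com is a placeholder):

    # Recursive fetch that stays below /contents (-np = no parent).
    # wget only discovers files by following links in the pages it downloads,
    # so anything that is never linked to is never requested.
    wget -r -np http://myurl.com/contents/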







asked Dec 22 '17 at 7:45 by alexcs











  • Only if opening myurl.com/contents lists all the files in that folder can we find a way to download them.
    – Arpit Agarwal
    Dec 22 '17 at 9:22
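A quick way to perform the check suggested above is to dump whatever the server returns for the directory URL itself (hostname as in the question):

    # Print the response for the directory URL to stdout. If it is an
    # auto-generated index listing every file, a recursive wget has links to
    # follow; if it is a 403/404 or a hand-written page, unlisted files will
    # never be discovered.
    wget -q -O - http://myurl.com/contents/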










  • You cannot reach files that are not linked to and whose complete path you don't know. This is intentional in the design of HTTP: if the server doesn't generate a directory listing (it has to be configured to do so), the contents stay 'hidden'.
    – ridgy
    Dec 22 '17 at 17:13
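If you control the server, the directory listing mentioned above can usually be switched on. A sketch only, assuming an Apache server whose AllowOverride settings permit it, with /var/www/myurl.com/contents as a hypothetical document path:

    # Enable an auto-generated index for this directory so a recursive wget
    # has links to follow (on nginx the rough equivalent is "autoindex on;").
    echo 'Options +Indexes' >> /var/www/myurl.com/contents/.htaccess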










  • HTTP doesn't have any concept of directories, and doesn't provide a way to list all the files in a directory. So there's no way for wget to get the names of other files if they're not linked from somewhere.
    – Barmar
    Dec 22 '17 at 19:32










  • You could use something like scp or rsync, assuming you have SSH access to the server.
    – Barmar
    Dec 22 '17 at 19:33
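Along the lines of the last comment, a minimal sketch of copying the whole directory over SSH instead of HTTP; the user name and server-side path are hypothetical placeholders:

    # Mirror the directory tree with rsync over SSH
    # (-a preserve attributes, -v verbose, -z compress in transit).
    rsync -avz user@myurl.com:/var/www/contents/ ./contents/

    # Or the same with scp (-r = recursive).
    scp -r user@myurl.com:/var/www/contents ./contents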















