GET via wget or cURL gives a limited response
I'm banging my head against a simple HTTP GET request.
Using a tool like Postman, I get the full and expected response: an XML document of over 1,000 lines, listing the details of my 60 albums.
For this GET request, I provide:
- 4 parameters embedded in the URL
- The default header "Content-Type: application/x-www-form-urlencoded"
- 1 cookie containing my session ID, to authenticate my identity
Using wget, however, I get only a 505-line XML document, even though, as far as I can tell, I provide the same cookies, parameters, and header:
wget -o logfile.log -O response.xml --load-cookies cookies.txt https://app.viewbook.com/albums/?sort=date&order=desc&page=1&per_page=500
The response.xml I get is a well-formed XML document, but only 505 lines long, containing just 25 albums.
I thought there might be a built-in limit in wget, so I tried curl.
With curl, I get exactly the same truncated response: 25 albums, 505 lines:
curl -D filelog.log -o response.xml -H "Content-Type: application/x-www-form-urlencoded" -b "_accountservice3_session=ZFp6UU..." https://app.viewbook.com/albums/?sort=date&order=desc&page=1&per_page=500
I'm out of ideas as to why both wget and curl, while not failing outright, don't return the complete result I get with Postman.
Am I missing some parameters in my GET call?
Where could this limited-length response be coming from?

curl wget http

asked Feb 14 at 2:45 by Yoric, edited Feb 14 at 5:33
If you used exactly the lines you showed, the ampersands in the unquoted URL are treated by all common Unix shells as terminators: the first command ends at the first &, and the subsequent "commands" are run in parallel. (Windows CMD also treats & as a terminator, though it runs the parts serially rather than in parallel, and it would give an error for the command order=desc etc.)
– dave_thompson_085, Feb 14 at 6:49
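(To make the failure mode concrete, here is a minimal sketch assuming bash or another POSIX shell. With the URL unquoted, the shell splits the line at each & and backgrounds each piece, so wget is invoked with only the first query parameter; the 25-album result suggests the server then falls back to its default page size.)

# What the shell actually executes, given the unquoted URL:
wget -o logfile.log -O response.xml --load-cookies cookies.txt https://app.viewbook.com/albums/?sort=date &
order=desc &    # split off as a separate background "command"
page=1 &        # likewise
per_page=500    # likewise

# The fix: quote the URL so the shell passes it through whole.
wget -o logfile.log -O response.xml --load-cookies cookies.txt "https://app.viewbook.com/albums/?sort=date&order=desc&page=1&per_page=500"
curl -D filelog.log -o response.xml -H "Content-Type: application/x-www-form-urlencoded" -b "_accountservice3_session=ZFp6UU..." "https://app.viewbook.com/albums/?sort=date&order=desc&page=1&per_page=500"

# Alternatively, curl can assemble the query string itself (-G sends the
# --data-urlencode pairs as a GET query), which sidesteps the quoting issue:
curl -G -D filelog.log -o response.xml -b "_accountservice3_session=ZFp6UU..." --data-urlencode "sort=date" --data-urlencode "order=desc" --data-urlencode "page=1" --data-urlencode "per_page=500" https://app.viewbook.com/albums/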
@dave_thompson_085 I'm stunned! That fixed it! I forgot that the ampersand has a special meaning in bash, and therefore I need to quote the whole URL string. Putting the URL into double quotes fixed my issue: I now have the 1,000+ line file I was looking for! Thank you so much. (You may add this as an answer so I can accept it.) By the way, I'm not sure why I didn't get any error on order=desc, as you said I would.
– Yoric, Feb 14 at 7:05
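(A likely answer to that last puzzle, under the same POSIX-shell assumption: a word of the form name=value at the start of a command line is a valid shell variable assignment, so each backgrounded fragment succeeds silently instead of failing as an unknown command. Windows CMD has no such syntax, which is why it would complain there.)

# Each fragment is a legal, silent shell command:
order=desc      # assigns the shell variable "order"; exit status 0, no output
echo "$order"   # prints: desc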