How far do radio waves travel?



I have read that radio waves just keep traveling, but the signal gets weaker because of how the wave spreads. OK, I get this, but what confuses me is that satellites emit signals from space that our phones and GPS modules pick up regardless of distance, whereas the signals from walkie-talkies and WiFi routers don't travel nearly as far as GPS and phone signals do. If GPS modules can pick up waves from space, why can't my laptop reach my WiFi signal from far away? And why are walkie-talkies not able to transmit and receive at much greater distances? Do all radio waves potentially travel the same distance? Or does the distance depend on the power of the signal? What determines the power of the signal: wavelength or frequency?










Tags: propagation, physics






asked Jan 24 at 17:42 by z Eyeland
edited Jan 24 at 18:08 by Kevin Reid AG6YO

  • Marcus Müller (Jan 25 at 12:12): "Regardless of distance": no, not at all. Move the satellite 1.5 times further away, and you won't be able to do anything useful with the received signal. And: the data rate of GPS is, roughly rounded, 0; that of Iridium satellite phones is a couple of kb/s; that of your LTE phone can be multiple MB/s.

















2 Answers






Answer (12 votes) – Kevin Reid AG6YO, answered Jan 24 at 18:37

Radio waves don't stop at some distance; they just get weaker. You've read this correctly. The reason that communications stop working at some distance is that the signals are too weak to be understood.



Besides distance (and being absorbed or reflected by objects in the path) causing the signal to be weak in an absolute sense (how much power there is), there is also the question of signal-to-noise ratio. That is, there are other radio waves, from other transmitters, natural sources, and even unintentional noise sources inside the receiver itself, all of which “drown out” the desired signal just like acoustic noise can make it hard to hold a conversation.
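To put rough numbers on the noise side of this, here is a minimal sketch (Python) of the thermal noise floor $kTB$ a receiver sees and the resulting signal-to-noise ratio; the 20 MHz bandwidth and the -85 dBm received power are illustrative assumptions, not figures from this answer:

    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    def noise_floor_dbm(bandwidth_hz, temperature_k=290.0):
        """Thermal noise power k*T*B collected in the given bandwidth, in dBm."""
        noise_watts = k_B * temperature_k * bandwidth_hz
        return 10 * math.log10(noise_watts / 1e-3)

    def snr_db(received_dbm, bandwidth_hz):
        """Signal-to-noise ratio against the thermal floor alone (no interference)."""
        return received_dbm - noise_floor_dbm(bandwidth_hz)

    # Illustrative numbers: a 20 MHz WiFi channel and a -85 dBm received signal.
    print(noise_floor_dbm(20e6))   # about -101 dBm
    print(snr_db(-85, 20e6))       # about 16 dB

Real receivers add their own noise on top of this floor, and interference from other transmitters adds more still, so the usable SNR is lower than this idealized figure.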




OK, I get this, but what confuses me is that satellites emit signals from space that our phones and GPS modules pick up regardless of distance, whereas the signals from walkie-talkies and WiFi routers don't travel nearly as far as GPS and phone signals do.




There are several factors here, including:



  • The GPS system is predictable by the receivers. If you've ever used a dedicated GPS receiver, you may notice that it takes longer to get a location fix the first time it's turned on or if it's been off for a while. This is because it's using the information about where it last was, and what time it is, and the last satellite-orbit information it copied from the transmissions, to make good guesses about what it expects to receive. This allows GPS to work with a very poor signal-to-noise ratio.

  • (Almost) nobody else is transmitting on the GPS frequencies, because that's illegal. They're reserved for the purpose. In WiFi, there are lots and lots of devices all using the same few channels; if two transmit at the same time on the same channel (and distance/obstacles don't make one significantly stronger) then neither will get through, for that one packet.

  • GPS is sending a lot less information per second than WiFi. The Shannon-Hartley theorem tells us that there is a maximum rate of information transfer across any channel (here, a limited range of radio frequencies) depending on the signal-to-noise ratio. So WiFi is doing a harder task; a rough numeric sketch of this follows the list.

  • Your phone does not just use GPS to obtain its location; it also detects nearby WiFi devices and cell towers, and constructs a best guess from all of these information sources.
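Here is the rough numeric sketch promised above for the Shannon-Hartley point (Python; the bandwidths and signal-to-noise ratios are placeholder values chosen for illustration, not measured figures):

    import math

    def capacity_bps(bandwidth_hz, snr_linear):
        """Shannon-Hartley channel capacity C = B * log2(1 + S/N), in bits per second."""
        return bandwidth_hz * math.log2(1 + snr_linear)

    # Narrowband link with a signal far below the noise (a GPS-like situation):
    print(capacity_bps(2e6, 0.001))     # ~2.9 kbit/s upper bound
    # Wideband link with a strong signal (a WiFi-like situation):
    print(capacity_bps(20e6, 1000.0))   # ~200 Mbit/s upper bound

The point is only the scaling: with a very poor SNR you can still move a trickle of data (GPS needs only tens of bits per second), while the high data rates of WiFi demand both wide bandwidth and a strong signal.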


Do all radio waves potentially travel the same distance?




There is no limit on distance. In a vacuum, with nothing else around, a wave simply loses power with distance. On Earth, with atmosphere and trees and buildings and so on, different wavelengths/frequencies will be reflected and absorbed differently. Generally, longer wavelengths (lower frequencies) can be used at greater distances, because absorption tends to increase with frequency.



Also, in the "HF" regions of the spectrum, below 30 MHz, signals are actually refracted off the ionosphere allowing them to propagate around the curve of the earth, whereas higher frequencies usually pass through the ionosphere — which is better if you want to talk to satellites!




Or does the distance depend on the power of signal?




If you increase power from the transmitter, then any receiver receives proportionally more power. Therefore, the signal-to-noise ratio improves (unless the power is so high as to cause overload). So more power means a larger usable range.
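To make that scaling concrete, here is a minimal sketch (Python) under a free-space, inverse-square assumption only, ignoring obstacles, ground, and the ionosphere; the reference power per watt at 1 m is a made-up placeholder:

    def received_power(p_tx_watts, distance_m, p_rx_at_1m_per_watt=1e-6):
        """Inverse-square spreading (free space, idealized antennas): received power
        at distance_m, given how much power one transmitted watt delivers at 1 m."""
        return p_tx_watts * p_rx_at_1m_per_watt / distance_m ** 2

    # Doubling the distance quarters the received power:
    print(received_power(1.0, 100.0) / received_power(1.0, 200.0))  # 4.0

    # Doubling transmit power doubles received power at every distance, so the
    # distance at which a receiver's minimum usable power is reached grows by sqrt(2):
    print(2 ** 0.5)  # ~1.41x more free-space range for 2x transmit power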




What determines the power of signal, wavelength or frequency?




Neither; they're completely independent. If you have a transmitter that can produce a power of $x$ watts at a frequency of $y$ MHz, then you can always reduce its power output to some lesser value. This is done routinely for any non-broadcast communication; reducing output power saves battery and lets other users use the same frequency at a distance without "overhearing" each other as much (just the same whether these are 'walkie-talkie' voice communications or several WiFi networks or anything else).



If you get into fundamental physics, you may hear that the energy of a photon is proportional to its frequency, and that radio waves are made of photons. This is all true, but practically useful radio transmissions are made up of many photons. So changing the transmitter power changes the number of photons emitted per second, but each photon still has the same energy.
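As a back-of-the-envelope sketch of that "many photons" point (Python; the 1 W transmitter at 146 MHz is just an assumed example, not anything from the question):

    h = 6.62607015e-34  # Planck constant, J*s

    def photons_per_second(power_watts, frequency_hz):
        """Photons emitted per second: each photon carries energy h*f,
        so the rate is P / (h*f)."""
        return power_watts / (h * frequency_hz)

    # A 1 W transmitter on the 2 m band (about 146 MHz):
    print(photons_per_second(1.0, 146e6))  # roughly 1e25 photons per second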



(Also note that wavelength and frequency are the same thing just measured reciprocally: you can convert one to the other using $\lambda = \frac{c}{f}$, where $\lambda$ is the wavelength, $f$ is the frequency, and $c$ is the constant speed of light.)
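For example, converting a couple of familiar frequencies (chosen for illustration) with $\lambda = \frac{c}{f}$ in Python:

    c = 299_792_458.0  # speed of light in m/s

    def wavelength_m(frequency_hz):
        """Convert frequency to wavelength using lambda = c / f."""
        return c / frequency_hz

    print(wavelength_m(2.4e9))      # ~0.125 m: a 2.4 GHz WiFi channel
    print(wavelength_m(1575.42e6))  # ~0.19 m: the GPS L1 carrier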






  • z Eyeland (Jan 24 at 19:19): Great explanations to my questions, thank you. One more thing: how should I visualize an increase or decrease in a transmitter's power? My intuition says a power increase would speed up the wave, but I am sure that is not the case. You mentioned that more power only increases the usable range. Should I visualize a higher-powered wave as more tightly packed photons that take longer to spread?







  • Kevin Reid AG6YO (Jan 24 at 19:27): @zEyeland The wave has greater amplitude (it's "louder", or "taller"). Considered as photons, there are more in the same space, so you could say they're more tightly packed, but they don't interact (superposition principle), so it's not that they behave differently but that at any given location there will be more for a receiver to collect. So if you draw the spherical boundary of "minimum useful power", that boundary will be bigger, but defining it depends on the performance of your receiver, not on any kind of physical distance limit.










  • z Eyeland (Jan 24 at 21:44): Thanks for the information. So if I increase the amplitude of a radio wave, the wave is stronger/louder/taller but I still cannot hear it, because of the frequency; whereas if a sound wave's amplitude is increased, I will noticeably hear the loudness. Correct?






  • Kevin Reid AG6YO (Jan 24 at 22:07): @zEyeland There can be radio waves and sound waves of the same frequency (low-frequency radio waves and high-frequency sound might both be found in the kHz range). You can't hear radio waves because they're waves in the electromagnetic field, not in the pressure of the air. (Comments aren't a good place to ask questions; let's wrap this up. Chat is also an option.)











  • Cecil - W5DXP (Jan 26 at 19:27): The FAST radio telescope in China has received radio signals from 5,870,000,000,000 miles away.


















Answer (3 votes) – hotpaw2, answered Jan 28 at 16:38

The curvature of the Earth also makes a difference between how a signal will propagate near the ground versus in outer space.
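To give a sense of the scale of that curvature effect, here is a small sketch (Python) of the standard line-of-sight radio horizon approximation $d \approx \sqrt{2 k R h}$, where $k \approx 4/3$ models typical atmospheric refraction; the antenna heights are example values, not taken from this answer:

    import math

    EARTH_RADIUS_M = 6_371_000.0

    def radio_horizon_m(antenna_height_m, k_factor=4/3):
        """Distance to the radio horizon for an antenna at the given height,
        using d = sqrt(2 * k * R * h). k_factor ~ 4/3 accounts for typical
        atmospheric refraction; use 1.0 for the purely geometric horizon."""
        return math.sqrt(2 * k_factor * EARTH_RADIUS_M * antenna_height_m)

    # Two handheld radios held ~1.5 m above flat ground can only "see" each other
    # over roughly the sum of their horizons:
    print(2 * radio_horizon_m(1.5) / 1000)                      # ~10 km
    # Raise one antenna onto a 30 m mast and the combined horizon grows a lot:
    print((radio_horizon_m(1.5) + radio_horizon_m(30)) / 1000)  # ~28 km

A satellite, by contrast, is above the horizon for a huge area at once, which is part of why line-of-sight paths to space behave so differently from paths along the ground.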



And at a large enough distance, the random quantum behavior of the atoms and electrons in your receiver radio will drown out their reaction to any incoming radio waves.





