Can we estimate the loss of entropy when applying an N-bit hash function to an N-bit random input? [duplicate]
















This question already has an answer here:



  • If a SHA256 hash with high entropy is then hashed with one made from low entropy, is the resulting hash higher/same/lower entropy?

    3 answers



Someone recently pointed out to me that a cryptographic hash function "is not designed as a bijective mapping from N-bit input to N-bit output".



So if I feed an N-bit cryptographic hash function with N bits of random input, there's a loss of entropy between the input and output of the hash function.



Considering the MD5 hash function, is there a way to estimate that loss of entropy? And is this loss cumulative, so that if I apply the hash function enough times, I end up with a 50% loss of entropy?
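For context, the loss can be measured empirically at toy scale. Below is a minimal sketch that models an $N$-bit hash by truncating MD5 to 16 bits (the truncation is my own modelling assumption, not a property claimed of full MD5): it feeds every possible 16-bit input through the hash and compares the output distribution's Shannon entropy to the full 16 bits.

```python
import hashlib
import math
from collections import Counter

N = 16  # toy output size: a 16-bit "hash" built by truncating MD5 (assumption)

def h(x: int) -> int:
    # Hash the 2-byte encoding of x and keep the first 2 bytes of the digest
    return int.from_bytes(hashlib.md5(x.to_bytes(2, "big")).digest()[:2], "big")

# Feed all 2^N possible N-bit inputs through the hash
counts = Counter(h(x) for x in range(2 ** N))

# Shannon entropy of the output distribution (the input was uniform: N bits)
total = 2 ** N
H = -sum(c / total * math.log2(c / total) for c in counts.values())

print(f"distinct outputs: {len(counts)} of {total}")  # for a random function, ~ (1 - 1/e) * 2^N
print(f"output entropy:   {H:.4f} of {N} bits")       # for a random function, loss ~ 0.83 bit
```

On this toy model the measured loss is a fraction of a bit, which matches the random-function heuristic discussed in the answer below rather than anything close to a 50% loss.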
























marked as duplicate by Squeamish Ossifrage, Maeher, Maarten Bodewes Feb 18 at 15:55


This question has been asked before and already has an answer. If those answers do not fully address your question, please ask a new question.






















hash entropy

edited Feb 10 at 20:21
asked Feb 10 at 20:15

Sylvain Leroux
1235
1 Answer































          Actually, no. If it is a good hash, you should have roughly $N-k$ bits of output entropy, for some $k$ of much lower order than $N$.



          The problem arises when the input is much longer than $N$ bits.



          One way to estimate the entropy loss of such a hash applied to $N$-bit inputs is to model it as a randomly chosen function on $N$ bits. This was first done by Flajolet and Odlyzko. There is a nice review with updated results here.



          Let $\tau_m$ be the image size of the $m$-th iterate of the function. The entropy can be related to its behaviour.



          If the function is a permutation, $\tau_m = 2^N$ for all $m \geq 1$, and there is no entropy loss.



          Edit: see the comment and link by @fgrieu, which gives an estimate of what I called $\tau_1$. He is saying that
          $$
          \tau_1 \approx 2^{128 - 0.8272\cdots}
          $$
          for $N = 128$.
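The random-function model can be checked numerically. A minimal sketch (my own illustration, not taken from the review): it evaluates the constant $\eta$ quoted in the comments, and simulates $\tau_m$ for a random function on a small domain against the Flajolet–Odlyzko prediction $\tau_m / 2^N \approx 1 - t_m$, with $t_0 = 0$ and $t_{m+1} = e^{t_m - 1}$.

```python
import math
import random

# Asymptotic entropy loss of one application of a random function
# (the constant from the comments): eta = (1/e) * sum_{j>=1} j*log2(j)/j!
eta = sum(j * math.log2(j) / math.factorial(j) for j in range(2, 50)) / math.e
print(f"eta ~ {eta:.4f} bit")  # expected ~ 0.8272

# Simulate tau_m, the image size of the m-th iterate, for a random
# function on a small domain (2^14 points stands in for 2^N).
random.seed(1)
n = 2 ** 14
f = [random.randrange(n) for _ in range(n)]

image = set(range(n))
t = 0.0
fracs = []
for m in range(1, 5):
    image = {f[x] for x in image}
    t = math.exp(t - 1)  # recurrence: tau_m / n -> 1 - t_m
    fracs.append(len(image) / n)
    print(f"m={m}: tau_m/n = {fracs[-1]:.4f}  (predicted {1 - t:.4f})")
```

The simulated image fractions shrink with each iterate and track the recurrence closely, which is the sense in which repeated hashing keeps losing entropy, though far more slowly than a fixed percentage per application.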

























          • 3

            Is there any estimate for the $k$?
            – kelalaka
            Feb 10 at 21:12






          • 1

            Except for small $N$, the $k$ depends very little on $N$, and for $N=128$ it is already very close to its asymptotic value $\eta = \frac{1}{e}\sum_{j=1}^{\infty}\frac{j\,\log_2 j}{j!} = 0.8272\dots$ bit. See this.
            – fgrieu
            Feb 10 at 22:14


















          edited Feb 10 at 22:30
          answered Feb 10 at 20:50

          kodlu
          9,040