Bayesian Probability question — Pointwise Probability



I am stuck on this question:



If $a = 1$ then $m \sim U(0.2, 1)$; if $a = 0$ then $m \sim U(0, 0.5)$. The question is: if $m = 0.3$, what is the probability that $a$ equals $1$?



My thought is to compute $p(a=1 \mid m=0.3)$ and $p(a=0 \mid m=0.3)$, and whichever class gives the higher probability is the answer. However, when I carry out this plan I have a problem computing $p(m=0.3 \mid a=1)$, which is supposed to be zero since $m$ given $a=1$ follows the continuous distribution $U(0.2, 1)$. I feel like I can use the density function in place of this probability, but I am not sure why that would be justified.










probability bayesian

asked Feb 8 at 19:45 by prony











  • Big hint: Bayes Rule. Think of the probability you want to compute, and the probabilities you already know.
    – Cliff AB
    Feb 8 at 20:39










  • As I mentioned, I am already implementing Bayes' rule, but I have to compute $p(m=0.3 \mid a=1)$, where $m$ given $a=1$ follows $U(0.2,1)$. In other words, the probability that a continuous random variable equals exactly $0.3$, which is zero. But I know it should not be zero.
    – prony
    Feb 8 at 20:42











  • Oh sorry, I get your question now. I'll write up an answer.
    – Cliff AB
    Feb 8 at 21:05















3 Answers



















If $a=1$ then the density function of $m$ is $\displaystyle f_m(x) = \begin{cases} 1/(1-0.2) = 1.25 & \text{if } 0.2\le x\le 1, \\ 0 & \text{if } x<0.2 \text{ or } x>1. \end{cases}$



If $a=0,$ it is $\displaystyle f_m(x) = \begin{cases} 1/(0.5-0) = 2 & \text{if } 0\le x\le 0.5, \\ 0 & \text{if } x<0 \text{ or } x>0.5. \end{cases}$



Thus the likelihood function is
$$
\begin{cases} L(1 \mid m=0.3) = 1.25, \\ L(0 \mid m=0.3) = 2. \end{cases}
$$



Hence the posterior probability distribution is
$$
\begin{cases} \Pr(a=1\mid m=0.3) = c\times 1.25\times\Pr(a=1), \\[5pt] \Pr(a=0\mid m=0.3) = c\times 2\times\Pr(a=0). \end{cases} \tag 1
$$

The normalizing constant is
$$
c = \frac{1}{1.25\Pr(a=1) + 2\Pr(a=0)}
$$

(that is what $c$ must be to make the sum of the two probabilities in line $(1)$ above equal $1$).



So for example, if $\Pr(a=1)=\Pr(a=0) = \dfrac{1}{2}$ then $\Pr(a=1\mid m=0.3) = \dfrac{5}{13}.$
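A minimal numerical sketch of this posterior computation (the equal priors $\Pr(a=1)=\Pr(a=0)=0.5$ are an assumption used only for illustration, as in the example above):

    from scipy.stats import uniform

    # Assumed priors, for illustration only: Pr(a=1) = Pr(a=0) = 0.5
    prior_a1, prior_a0 = 0.5, 0.5

    # Conditional densities of m evaluated at the observed value 0.3
    lik_a1 = uniform(loc=0.2, scale=0.8).pdf(0.3)  # U(0.2, 1) -> 1.25
    lik_a0 = uniform(loc=0.0, scale=0.5).pdf(0.3)  # U(0, 0.5) -> 2.0

    posterior_a1 = lik_a1 * prior_a1 / (lik_a1 * prior_a1 + lik_a0 * prior_a0)
    print(posterior_a1)  # 0.3846... = 5/13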






answered Feb 8 at 22:04 by Michael Hardy












  • Thanks a lot for your answer. Can you elaborate on how you constructed the likelihood from $f_m(x)$? I mean, how does $P(m=0.3 \mid a=1) = f_m(0.3 \mid a=1)$ become equal to $L(1 \mid m=0.3)$?
    – prony
    Feb 8 at 23:01







  • @prony : You have the density function $x\mapsto f_m(x \mid a=\alpha),$ where $\alpha\in\{0,1\},$ which is a function of $x$ with $\alpha$ fixed, and the likelihood function $\alpha\mapsto L(\alpha\mid m = x) = f_m(x\mid a=\alpha),$ which is a function of $\alpha\in\{0,1\}$ with $x$ fixed at the observed value, which is $0.3.$
    – Michael Hardy
    Feb 8 at 23:23




















Instead of Bayes' rule, we should look closely at the definition of conditional probability, as Bayes' rule is simply a small transformation of that definition.



With discrete probabilities, it is simple to define



$$P(A = a \mid B = b) = \frac{P(A = a \cap B = b)}{P(B = b)}$$



As you pointed out, this would be ill-defined if both $A$ and $B$ were continuous, as it would result in $0/0.$



Instead, let's think about $P(A \in a \pm \varepsilon \mid B \in b \pm \varepsilon)$ for a continuous distribution. Then the value



$$ \frac{P(A \in a \pm \varepsilon \cap B \in b \pm \varepsilon)}{P(B \in b \pm \varepsilon)} $$



is properly defined for all $\varepsilon$, as long as $\int_{b - \varepsilon}^{b + \varepsilon} f_B(x) \, dx > 0$. Finally, we just define the conditional distribution of $A \mid B$ as



$$\lim_{\varepsilon \rightarrow 0} \frac{P(A \in a \pm \varepsilon \cap B \in b \pm \varepsilon) / \varepsilon}{P(B \in b \pm \varepsilon) / \varepsilon} $$



By definition, this is



$$\frac{f_{A, B}(a, b)}{f_B(b)}$$
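As a sanity check on this limiting argument for the original problem (discrete $a$, continuous $m$), here is a small Monte Carlo sketch; the equal priors are purely an illustrative assumption:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 2_000_000

    # Assumed priors, for illustration: Pr(a=1) = Pr(a=0) = 0.5
    a = rng.integers(0, 2, size=n)
    m = np.where(a == 1, rng.uniform(0.2, 1.0, size=n), rng.uniform(0.0, 0.5, size=n))

    # Condition on the shrinking window |m - 0.3| < eps instead of the event m = 0.3
    for eps in (0.1, 0.01, 0.001):
        window = np.abs(m - 0.3) < eps
        print(eps, a[window].mean())

Since both conditional densities are constant near $0.3$, every window already gives roughly $5/13 \approx 0.3846$ up to Monte Carlo noise, matching the density-ratio answer obtained in the limit.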






answered Feb 8 at 21:16 by Cliff AB





















The answer given by Michael Hardy is correct. I just want to post a different interpretation of the same answer in case anyone finds it easier to follow.

I would like to compute $p(a=1 \mid m=0.3)$:

$p(a=1 \mid m=0.3) = \dfrac{p(m=0.3 \mid a=1)\,p(a=1)}{p(m=0.3)}$.

In this equation, $p(a=1)$ is a discrete probability which I can compute. The problem is computing $p(m=0.3)$ and $p(m=0.3 \mid a=1)$, since $m$ is a continuous random variable. Here is the solution:

For any continuous random variable $X$ with density $f_X(x)$, we can write the probability of $X$ landing in an infinitesimal interval around any constant $c$ as follows:

$p(X = c) \approx f_X(c)\,dx$, the infinitesimally small area under the density that stands in for the probability we want to compute.

Applying this logic to the equations above, we get:

$p(m=0.3 \mid a=1) \approx f_{m \mid a=1}(0.3)\,dm$ and $p(m=0.3) \approx f_{m \mid a=1}(0.3)\,dm\,p(a=1) + f_{m \mid a=0}(0.3)\,dm\,p(a=0)$.

Putting everything together:

$p(a=1 \mid m=0.3) = \dfrac{p(m=0.3 \mid a=1)\,p(a=1)}{p(m=0.3)} = \dfrac{f_{m \mid a=1}(0.3)\,dm\,p(a=1)}{f_{m \mid a=1}(0.3)\,dm\,p(a=1) + f_{m \mid a=0}(0.3)\,dm\,p(a=0)} = \dfrac{f_{m \mid a=1}(0.3)\,p(a=1)}{f_{m \mid a=1}(0.3)\,p(a=1) + f_{m \mid a=0}(0.3)\,p(a=0)}$

As can be seen, after the $dm$ cancels this equals the answer given by Michael Hardy.
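The cancelled-$dm$ formula translates directly into code. Here is a minimal sketch; the function name and the priors passed to it are illustrative assumptions, not part of the answers above:

    def posterior_a1(prior_a1, f1, f0):
        """Pr(a=1 | m=0.3) from the formula above.

        f1 and f0 are the conditional densities f_{m|a=1}(0.3) and f_{m|a=0}(0.3);
        the prior Pr(a=1) is whatever the caller assumes."""
        prior_a0 = 1.0 - prior_a1
        return f1 * prior_a1 / (f1 * prior_a1 + f0 * prior_a0)

    # With equal priors this reproduces Michael Hardy's 5/13:
    print(posterior_a1(0.5, 1.25, 2.0))  # 0.3846...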






answered Feb 11 at 16:39 by prony











