Uncorrelatedness + Joint Normality = Independence. Why? Intuition and mechanics

Two variables that are uncorrelated are not necessarily independent, as is simply exemplified by the fact that $X$ and $X^2$ are uncorrelated but not independent. However, two variables that are uncorrelated AND jointly normally distributed are guaranteed to be independent. Can someone explain intuitively why this is true? What exactly does joint normality of two variables add to the knowledge of zero correlation between two variables, which leads us to conclude that these two variables MUST be independent?










Tags: correlation, normal-distribution, independence, joint-distribution

Asked by ColorStatistics; edited by Michael Hardy.
Comments:

  • It is not generally the case that $X$ and $X^2$ are uncorrelated (unless you put particular conditions on $X$ that would make them uncorrelated, but you mention none). – Glen_b

  • First, going back to the fact that correlation refers to linear relationships, please explain how $X^2$ is linearly related to $X$. Second, you seem to be stating that not only can $X^2$ and $X$ be linearly related, but that they are linearly related more often than not, given the use of the word "generally". Please explain. Thank you. – ColorStatistics

  • @Glen_b is spot on: $X$ and $X^2$ are only uncorrelated if you specifically stipulate the range of $X$. For example, Pearson's $r \approx 0.98$ for $X$ and $X^2$ when restricting a sample of $X \sim \mathcal{N}(0,1)$ to values of $X$ greater than 1. Check it out yourself (R): X <- rnorm(n=10000); X2 <- X*X; cor(X[X>1], X2[X>1]) – Alexis

  • @Alexis It's not just the range, but how the probabilities distribute over the values within the range. If you change the distribution, you change the correlation. – Glen_b

  • @ColorStatistics Correlation is the degree of linear relationship, yes. However, the linear projection of $X^2$ onto $X$ may involve a substantial linear component. If you want to see an example with high linear correlation between a variable and its square, let $X$ take the values 0 and 1 with equal probability (e.g. record the number of heads in the toss of a single fair coin). Then $\operatorname{corr}(X,X^2)=1$ (!). If you're free to specify the distribution of $X$, you can make the correlation between $X$ and $X^2$ take any value between $-1$ and $1$. ... ctd – Glen_b
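A quick empirical check of the points raised in these comments (a minimal R sketch, base R only; the sample sizes are arbitrary):

    set.seed(1)
    # Glen_b's coin example: X takes only the values 0 and 1, so X^2 equals X
    # and the correlation is exactly 1.
    X <- rbinom(10000, size = 1, prob = 0.5)
    cor(X, X^2)                  # 1

    # Symmetric case: for X ~ N(0,1), cov(X, X^2) = E[X^3] = 0, so X and X^2
    # are uncorrelated even though they are clearly dependent.
    Z <- rnorm(100000)
    cor(Z, Z^2)                  # close to 0

    # Alexis's truncation example: restricting to X > 1 breaks the symmetry.
    cor(Z[Z > 1], (Z[Z > 1])^2)  # close to 0.98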

















2 Answers

















Accepted answer (score 5), by a_statistician:
The joint probability density function (pdf) of the bivariate normal distribution is
$$f(x_1,x_2)=\frac{1}{2\pi\sigma_1\sigma_2\sqrt{1-\rho^2}}\exp\left[-\frac{z}{2(1-\rho^2)}\right],$$

where

$$z=\frac{(x_1-\mu_1)^2}{\sigma_1^2}-\frac{2\rho(x_1-\mu_1)(x_2-\mu_2)}{\sigma_1\sigma_2}+\frac{(x_2-\mu_2)^2}{\sigma_2^2}.$$

When $\rho = 0$,
$$\begin{align}
f(x_1,x_2) &= \frac{1}{2\pi\sigma_1\sigma_2}\exp\left[-\frac{1}{2}\left\{\frac{(x_1-\mu_1)^2}{\sigma_1^2}+\frac{(x_2-\mu_2)^2}{\sigma_2^2}\right\}\right]\\
&= \frac{1}{\sqrt{2\pi}\,\sigma_1}\exp\left[-\frac{1}{2}\frac{(x_1-\mu_1)^2}{\sigma_1^2}\right]\cdot\frac{1}{\sqrt{2\pi}\,\sigma_2}\exp\left[-\frac{1}{2}\frac{(x_2-\mu_2)^2}{\sigma_2^2}\right]\\
&= f(x_1)\,f(x_2).
\end{align}$$

So they are independent.
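The factorization can be sanity-checked numerically. A minimal R sketch (base R only; the helper biv_norm_pdf and the grid of test points are ad hoc, standardized to means 0 and variances 1):

    # Standardized bivariate normal pdf (mu = 0, sigma = 1) with correlation rho
    biv_norm_pdf <- function(x1, x2, rho) {
      z <- x1^2 - 2 * rho * x1 * x2 + x2^2
      exp(-z / (2 * (1 - rho^2))) / (2 * pi * sqrt(1 - rho^2))
    }

    g <- expand.grid(x1 = seq(-2, 2, 0.5), x2 = seq(-2, 2, 0.5))

    # With rho = 0 the joint density equals the product of the marginals:
    max(abs(biv_norm_pdf(g$x1, g$x2, 0) - dnorm(g$x1) * dnorm(g$x2)))    # ~ 0
    # With rho != 0 it does not:
    max(abs(biv_norm_pdf(g$x1, g$x2, 0.7) - dnorm(g$x1) * dnorm(g$x2)))  # clearly > 0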






Comments:

  • I was two lines slower than you! (+1) – jbowman

  • Thank you all. Elegant proof. It's clear now. It seems to me that given the flow of the proof, I should have asked what knowledge of zero correlation adds to knowledge of joint normality and not the other way around. – ColorStatistics

















Answer (score 1), by Michael Hardy:
Joint normality of two random variables $X, Y$ can be characterized in either of two simple ways:

  • For every pair $a,b$ of (non-random) real numbers, $aX+bY$ has a univariate normal distribution.

  • There are random variables $Z_1, Z_2 \sim \text{i.i.d. } \operatorname{N}(0,1)$ and real numbers $a,b,c,d$ such that
$$\begin{align} X & = aZ_1+bZ_2 \\ \text{and } Y & = cZ_1 + dZ_2. \end{align}$$

That the first of these follows from the second is easy to show. That the second follows from the first takes more work, and maybe I'll post on it soon . . .

If the second one is true, then $\operatorname{cov}(X,Y) = ac + bd.$

If this covariance is $0,$ then the vectors $(a,b)$ and $(c,d)$ are orthogonal to each other. Then $X$ is a scalar multiple of the orthogonal projection of $(Z_1,Z_2)$ onto $(a,b)$, and $Y$ of its projection onto $(c,d).$

Now conjoin this orthogonality with the circular symmetry of the joint density of $(Z_1,Z_2)$ to see that the distribution of $(X,Y)$ must be the same as the distribution of two random variables, one of which is a scalar multiple of the orthogonal projection of $(Z_1,Z_2)$ onto the $x$-axis, i.e. a scalar multiple of $Z_1$, and the other of which is similarly a scalar multiple of $Z_2.$
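A minimal R illustration of this construction (base R only; the orthogonal coefficient vectors $(3,4)$ and $(-4,3)$ are an arbitrary choice): with $ac+bd=0$, even nonlinear functions of $X$ and $Y$ come out uncorrelated, as full independence predicts.

    set.seed(42)
    n <- 100000
    Z1 <- rnorm(n); Z2 <- rnorm(n)

    # (a,b) = (3,4) and (c,d) = (-4,3) are orthogonal: ac + bd = -12 + 12 = 0
    X <-  3 * Z1 + 4 * Z2
    Y <- -4 * Z1 + 3 * Z2

    cor(X, Y)            # ~ 0: uncorrelated by construction
    cor(X^2, Y^2)        # ~ 0: consistent with independence, not mere uncorrelatedness
    cor(abs(X), abs(Y))  # ~ 0 again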





