Calculating Fisher Information for a Bernoulli RV
Let $X_1,\dots,X_n$ be Bernoulli distributed with unknown parameter $p$.
My objective is to calculate the information contained in the first observation of the sample.
I know that the pdf of $X$ is given by $$f(x\mid p)=p^x(1-p)^{1-x},$$ and my book defines the Fisher information about $p$ as
$$I_X(p)=E_p\left[\left(\frac{d}{dp}\log\left(p^x(1-p)^{1-x}\right)\right)^2\right].$$
After some calculations, I arrive at
$$I_X(p)=E_p\left[\frac{x^2}{p^2}\right]-2E_p\left[\frac{x(1-x)}{p(1-p)}\right]+E_p\left[\frac{(1-x)^2}{(1-p)^2}\right].$$
I know that the Fisher information about $p$ of a Bernoulli RV is $\frac{1}{p(1-p)}$, but I don't know how to get rid of the $X$-values, since I'm calculating an expectation with respect to $p$, not $X$. Any clues?

statistics probability-distributions expected-value
asked Sep 16 at 15:07 by DavidS; edited Sep 16 at 16:12 by StubbornAtom

$+1$ for showing your work to derive the correct $I_X(p)$. – Ahmad Bazzi, Sep 16 at 15:56
2 Answers
Accepted answer (answered Sep 16 at 15:54 by Ahmad Bazzi; edited Sep 17 at 23:13 by Alejandro Nasif Salum):
\begin{equation}
I_X(p)=E_p\left[\frac{X^2}{p^2}\right]-2E_p\left[\frac{X-X^2}{p(1-p)}\right]+E_p\left[\frac{X^2-2X+1}{(1-p)^2}\right].\tag{1}
\end{equation}
For a Bernoulli RV, we know
\begin{align}
E(X) &= 0\cdot\Pr(X=0)+1\cdot\Pr(X=1)=p,\\
E(X^2) &= 0^2\cdot\Pr(X=0)+1^2\cdot\Pr(X=1)=p.
\end{align}
Substituting these into $(1)$, we get
\begin{equation}
I_X(p)=\frac{p}{p^2}-2\,\frac{p-p}{p(1-p)}+\frac{p-2p+1}{(1-p)^2}
=\frac{1}{p}-\frac{p-1}{(1-p)^2}
=\frac{1}{p}-\frac{1}{p-1}
=\frac{1}{p(1-p)}.
\end{equation}
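If you want to sanity-check this numerically, here is a minimal Monte Carlo sketch (assuming NumPy; the seed, the sample size, and the choice $p=0.3$ are arbitrary illustrations): it draws Bernoulli samples, evaluates the score $\frac{d}{dp}\log f(X\mid p)=\frac{X}{p}-\frac{1-X}{1-p}$, and compares the mean squared score with $\frac{1}{p(1-p)}$.

```python
import numpy as np

# Monte Carlo check that E_p[(d/dp log f(X|p))^2] = 1/(p(1-p))
# for a single Bernoulli observation. p = 0.3 and the sample size
# are arbitrary choices for illustration.
rng = np.random.default_rng(0)
p = 0.3
x = rng.binomial(1, p, size=1_000_000)   # draws of X ~ Be(p)

score = x / p - (1 - x) / (1 - p)        # d/dp log(p^X (1-p)^(1-X))

print(score.mean())         # ~ 0: the score has mean zero
print((score ** 2).mean())  # ~ 4.76, Monte Carlo estimate of I_X(p)
print(1 / (p * (1 - p)))    # 4.7619..., the exact value
```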
thanks @MichaelHardy – Ahmad Bazzi, Sep 16 at 18:00

I always thought that when an expectation is written as $E_p$, for example, it means we treat the other variables (not $p$) as constants. Apparently not? – DavidS, Sep 16 at 18:23

Yes, I understand what you mean. I usually do not write $E_p(\cdot)$; I write $E_{X\mid p}(\cdot)$, where the expectation is taken with respect to the samples given the parameters you want to estimate. – Ahmad Bazzi, Sep 16 at 18:24

The expectation is taken with respect to the random variable (whether it is $X$, $Y$ or whatever other r.v.). $E_p$ only intends to remark that your model for the distribution of $X$ is not fully specified, but is uncertain up to a parameter $p$, and thus the corresponding expectations may depend on that value of $p$. – Alejandro Nasif Salum, Sep 16 at 18:38

Thanks @AlejandroNasifSalum for the edit. – Ahmad Bazzi, Sep 17 at 23:28
Answer (answered Sep 16 at 15:46 by Alejandro Nasif Salum; edited Sep 17 at 14:43 by Michael Hardy):
Actually, the Fisher information of $X$ about $p$ is
$$I_X(p)=E_p\left[\left(\frac{d}{dp}\log f(X\mid p)\right)^2\right],$$
that is,
$$I_X(p)=E_p\left[\left(\frac{d}{dp}\log\left(p^X(1-p)^{1-X}\right)\right)^2\right].$$
I've only changed every $x$ to $X$, which may seem like a subtlety, but then you get
$$I_X(p)=E_p\left(\frac{X^2}{p^2}\right)-2E_p\left(\frac{X(1-X)}{p(1-p)}\right)+E_p\left(\frac{(1-X)^2}{(1-p)^2}\right).$$
The expectation is there because $X$ is a random variable. So, for instance,
$$E_p\left(\frac{X^2}{p^2}\right)=\frac{E_p\left(X^2\right)}{p^2}=\frac{p}{p^2}=\frac{1}{p}.$$
Here I used the fact that $E_p(X^2)=p$, which can easily be seen from
$$E_p(X^2)=0^2\cdot p_X(0)+1^2\cdot p_X(1)=0^2(1-p)+1^2p=p,$$
or from the observation that $X\sim\operatorname{Be}(p)\implies X^n\sim\operatorname{Be}(p)$ as well. Then you can go on with the remaining terms; a mechanical check is sketched below.
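If you'd rather let a computer algebra system handle all three terms at once, a short SymPy sketch (purely illustrative; the symbol names are my own) evaluates the expectation exactly by summing the squared score over the two support points $x=0,1$, weighted by $f(x\mid p)$:

```python
import sympy as sp

# Exact computation of E_p[(d/dp log f(X|p))^2] for X ~ Be(p):
# sum the squared score over the support {0, 1}, weighted by f(x|p).
p, x = sp.symbols('p x', positive=True)
f = p**x * (1 - p)**(1 - x)
score = sp.diff(sp.log(f), p)   # = x/p - (1-x)/(1-p)

I = sum((score**2 * f).subs(x, k) for k in (0, 1))
print(sp.simplify(I))           # simplifies to 1/(p*(1 - p))
                                # (SymPy may print it as -1/(p*(p - 1)))
```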
Additionally, an equivalent formula can be proved for $I_X(p)$, provided the second derivative of $\log f$ is well defined:
$$I_X(p)=-E_p\left(\frac{d^2}{dp^2}\log f(X\mid p)\right),$$
and it often yields simpler expressions. In this case, for instance, you get
$$I_X(p)=-E_p\left(\frac{d^2}{dp^2}\log\left(p^X(1-p)^{1-X}\right)\right)
=-E_p\left(-\frac{X}{p^2}-\frac{1-X}{(1-p)^2}\right)=\frac{E_p(X)}{p^2}+\frac{E_p(1-X)}{(1-p)^2}$$
$$=\frac{p}{p^2}+\frac{1-p}{(1-p)^2}=\frac{1}{p}+\frac{1}{1-p}=\frac{1}{p(1-p)},$$
as desired.
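The second-derivative route is just as easy to verify mechanically. A companion SymPy sketch (again only an illustration): since $\frac{d^2}{dp^2}\log f$ is linear in $X$, taking the expectation amounts to substituting $E_p(X)=p$ for $x$:

```python
import sympy as sp

# Check of I_X(p) = -E_p[d^2/dp^2 log f(X|p)] for X ~ Be(p).
p, x = sp.symbols('p x', positive=True)
logf = x * sp.log(p) + (1 - x) * sp.log(1 - p)

d2 = sp.diff(logf, p, 2)   # = -x/p**2 - (1 - x)/(1 - p)**2
I = -d2.subs(x, p)         # linear in x, so E_p just sets x = E_p(X) = p
print(sp.simplify(I))      # simplifies to 1/(p*(1 - p))
```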