Computing the expectation of the number of balls in a box
- There are $r$ boxes and $n$ balls.
- Each ball is placed in a box with equal probability, independently of the other balls.
- Let $X_i$ be the number of balls in box $i$, $1 \leq i \leq r$.
- Compute $\mathbb{E}\left[X_i\right]$ and $\mathbb{E}\left[X_i X_j\right]$.

I am preparing for an exam, and I have no idea how to approach this problem. Can someone push me in the right direction?

probability-theory
edited Apr 11 at 17:50 by Felix Marin
asked Apr 11 at 17:25 by 631
Are there any restrictions on $j$? – Sean Lee, Apr 11 at 17:32

@SeanLee In the question, no. I'm guessing it would have the same restrictions as $i$. – 631, Apr 11 at 17:34

Computationally, the answer to the second part appears to be $\frac{n^2}{r^2}$. – Sean Lee, Apr 11 at 18:17

Have you studied covariance matrices, or vector-valued random variables, at all? That would seem to me to provide the most compact notation for solving this problem. – Daniel Schepler, Apr 11 at 23:52
3 Answers
Since there are $r$ boxes and $n$ balls, and each ball is placed in a box with equal probability, we have:
$$ \mathbb{E}[X_i] = \frac{n}{r}. $$
Now, we would like to know what $\mathbb{E}[X_i X_j]$ is.
We begin by making the following observation:
$$ X_i = n - \sum_{j \neq i} X_j, $$
which gives us:
$$ X_i \sum_{j \neq i} X_j = n X_i - X_i^2. $$
Now, fix $i$ (we can do this because of the symmetry in the question), and thus we have:
\begin{align}
\mathbb{E}[X_i X_j] &= \frac{1}{r}\Big(\mathbb{E}\Big[X_i \sum_{j \neq i} X_j\Big] + \mathbb{E}[X_i^2]\Big) \\
&= \frac{1}{r}\, \mathbb{E}[n X_i] \\
&= \frac{n^2}{r^2}.
\end{align}

edited Apr 11 at 18:19; answered Apr 11 at 18:01 by Sean Lee
If indeed $E(X_i X_j) = E(X_i) E(X_j)$ for $i \ne j$, then that implies zero correlation. I would expect a bit of negative correlation. (And indeed, my preliminary calculation based on the decomposition from VHarisop's answer seems to result in $E(X_i X_j) = \frac{n(n-1)}{r^2}$ for $i \ne j$ and $E(X_i^2) = \frac{n}{r} + \frac{n(n-1)}{r^2}$.) – Daniel Schepler, Apr 11 at 22:48

Yeah, it seemed a little strange to me initially, but it's consistent with your results: $\frac{1}{r}\big[(r-1)E(X_i X_j) + E(X_i^2)\big] = \frac{n^2}{r^2}$. – Sean Lee, Apr 11 at 23:01

I've now expanded VHarisop's answer with my calculations for part two of the question. – Daniel Schepler, Apr 11 at 23:23
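Spelling out the arithmetic behind Sean Lee's consistency check, using the fixed-pair values quoted in Daniel Schepler's comment:
$$
\frac{1}{r}\Big[(r-1)\,\frac{n(n-1)}{r^2} + \frac{n}{r} + \frac{n(n-1)}{r^2}\Big]
= \frac{1}{r}\Big[\frac{n(n-1)}{r} + \frac{n}{r}\Big]
= \frac{n^2}{r^2}.
$$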
For the first part, you can use linearity of expectation to compute $\mathbb{E}[X_i]$.
Specifically, you know that for a fixed box, the probability of putting a ball in it is $\frac{1}{r}$. Let
$$
Y_k^{(i)} = \begin{cases}
1 & \text{if ball } k \text{ was placed in box } i, \\
0 & \text{otherwise,}
\end{cases}
$$
which satisfies $\mathbb{E}\big[Y_k^{(i)}\big] = \mathbb{P}\big(Y_k^{(i)} = 1\big) = \frac{1}{r}$.
Then you can write
$$
X_i = \sum_{j=1}^n Y_j^{(i)} \;\Rightarrow\; \mathbb{E}[X_i] = \sum_{j=1}^n \frac{1}{r} = \frac{n}{r}.
$$
For the second part, you can proceed similarly: $X_i = \sum_{k=1}^n Y_k^{(i)}$ and $X_j = \sum_{\ell=1}^n Y_\ell^{(j)}$, so
$$
X_i X_j = \sum_{k=1}^n \sum_{\ell=1}^n Y_k^{(i)} Y_\ell^{(j)} \implies
\mathbb{E}(X_i X_j) = \sum_{k=1}^n \sum_{\ell=1}^n \mathbb{E}\big(Y_k^{(i)} Y_\ell^{(j)}\big).
$$
We will first treat the case where $i \ne j$. Then, for each term in the sum such that $k = \ell$, we must have $Y_k^{(i)} Y_\ell^{(j)} = Y_k^{(i)} Y_k^{(j)} = 0$, since it is impossible for ball $k$ to be placed both in box $i$ and in box $j$. On the other hand, if $k \ne \ell$, then the placements of balls $k$ and $\ell$ are independent, which implies that $Y_k^{(i)}$ and $Y_\ell^{(j)}$ are independent random variables. Therefore, in this case,
$$
\mathbb{E}\big(Y_k^{(i)} Y_\ell^{(j)}\big) = \mathbb{E}\big(Y_k^{(i)}\big)\, \mathbb{E}\big(Y_\ell^{(j)}\big) = \frac{1}{r} \cdot \frac{1}{r}.
$$
In summary, if $i \ne j$, then
$$
\mathbb{E}(X_i X_j) = \sum_{k=1}^n \sum_{\ell=1}^n \delta_{k \ne \ell} \cdot \frac{1}{r^2} = \frac{n(n-1)}{r^2},
$$
where $\delta_{k \ne \ell}$ is the indicator that equals $1$ when $k \ne \ell$ and $0$ when $k = \ell$.
For the case $i = j$, I will leave the similar computation of $\mathbb{E}(X_i^2)$ to you, with just the hint that the difference is in the value of $\mathbb{E}\big(Y_k^{(i)} Y_\ell^{(j)}\big)$ for the case $k = \ell$.

edited Apr 11 at 23:19 by Daniel Schepler; answered Apr 11 at 17:48 by VHarisop
I decided to add my solution to part two, using your notation, to your answer to avoid having an answer split between your part and a part I would post separately. Feel free to edit it more to your liking, or even revert the addition if you prefer. – Daniel Schepler, Apr 11 at 23:21

@DanielSchepler: Looks good, thank you! – VHarisop, Apr 12 at 16:27
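Following the hint at the end of that answer, the remaining case $i = j$ works out as follows (a short sketch; the value agrees with the one quoted in the comments under Sean Lee's answer). For $k = \ell$, the product $Y_k^{(i)} Y_k^{(i)} = \big(Y_k^{(i)}\big)^2 = Y_k^{(i)}$, because an indicator only takes the values $0$ and $1$, so its expectation is $\frac{1}{r}$ rather than $0$. Hence
$$
\mathbb{E}(X_i^2) = \sum_{k = \ell} \frac{1}{r} + \sum_{k \ne \ell} \frac{1}{r^2} = \frac{n}{r} + \frac{n(n-1)}{r^2}.
$$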
Think of placing a ball in box $i$ as a success and not placing it as a failure.
This situation can be represented using the hypergeometric distribution:
$$
P(X = k) = \frac{\binom{K}{k} \binom{N-K}{n-k}}{\binom{N}{n}},
$$
where
- $N$ is the population size (the number of boxes, $r$),
- $K$ is the number of success states in the population (just $1$, because success is defined as placing the ball in box $i$),
- $n$ is the number of draws (the number of balls, $n$),
- $k$ is the number of observed successes (the number of balls in box $i$).
The expectation of the hypergeometric distribution is $n\frac{K}{N}$, hence the mean of your variable is
$$ E[X_i] = n \cdot \frac{1}{r} = \frac{n}{r}. $$

answered Apr 11 at 18:00 by RScrlli
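The closed forms above are easy to sanity-check numerically. Below is a minimal Monte Carlo sketch, assuming NumPy is available and using arbitrary illustrative values $n = 10$, $r = 4$; it estimates $\mathbb{E}[X_i]$, $\mathbb{E}[X_i X_j]$ for a fixed pair $i \ne j$, and $\mathbb{E}[X_i^2]$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 10, 4            # illustrative values: n balls, r boxes
trials = 200_000        # number of independent experiments

# For each trial, assign each of the n balls to a uniformly random box.
assignments = rng.integers(0, r, size=(trials, n))

# X_i = number of balls landing in box i; track two distinct boxes (0 and 1).
X_i = (assignments == 0).sum(axis=1)
X_j = (assignments == 1).sum(axis=1)

print("E[X_i]     ~", X_i.mean(),         "  exact:", n / r)
print("E[X_i X_j] ~", (X_i * X_j).mean(), "  exact:", n * (n - 1) / r**2)
print("E[X_i^2]   ~", (X_i ** 2).mean(),  "  exact:", n / r + n * (n - 1) / r**2)
```

With these values the estimates should land close to $2.5$, $5.625$, and $8.125$ respectively; in particular, the fixed-pair product sits visibly below $\frac{n^2}{r^2} = 6.25$, reflecting the slight negative correlation Daniel Schepler anticipated in the comments.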