Computing the expectation of the number of balls in a box



  • There are $r$ boxes and $n$ balls.

  • Each ball is placed in a box with equal probability, independently of the other balls.

  • Let $X_i$ be the number of balls in box $i$, $1 \leq i \leq r$.

  • Compute $\mathbb{E}\left[X_i\right]$ and $\mathbb{E}\left[X_i X_j\right]$.

I am preparing for an exam, and I have no idea how to approach this problem. Can someone push me in the right direction?

probability-theory

asked 6 hours ago by 631, edited 6 hours ago by Felix Marin










  • Are there any restrictions on $j$?
    – Sean Lee, 6 hours ago

  • @SeanLee In the question, no. I'm guessing it would have the same restrictions as $i$.
    – 631, 6 hours ago

  • Computationally, the answer to the second part appears to be $\frac{n^2}{r^2}$.
    – Sean Lee, 5 hours ago

  • Have you studied covariance matrices, or vector-valued random variables, at all? That would seem to me to provide the most compact notation for solving this problem.
    – Daniel Schepler, 21 mins ago















3 Answers

Since there are $r$ boxes and $n$ balls, and each ball is placed in a box with equal probability, we have

$$ \mathbb{E}[X_i] = \frac{n}{r}. $$

Now, we would like to know what $\mathbb{E}[X_i X_j]$ is.

We begin by making the following observation:

$$ X_i = n - \sum_{j\neq i} X_j, $$

which gives us

$$ X_i \sum_{j\neq i} X_j = nX_i - X_i^2. $$

Now, fix $i$ (we can do this because of the symmetry in the question), and thus we have

\begin{align}
\mathbb{E}[X_i X_j] &= \frac{1}{r}\Big(\mathbb{E}\Big[X_i \sum_{j\neq i} X_j\Big] + \mathbb{E}[X_i^2]\Big) \\
&= \frac{1}{r}\, \mathbb{E}[nX_i] \\
&= \frac{n^2}{r^2}.
\end{align}

answered 6 hours ago by Sean Lee, edited 5 hours ago






  • If indeed $E(X_i X_j) = E(X_i)\,E(X_j)$ for $i \ne j$, then that implies zero correlation. I would expect a bit of negative correlation. (And indeed, my preliminary calculation based on the decomposition from VHarisop's answer seems to result in $E(X_i X_j) = \frac{n(n-1)}{r^2}$ for $i \ne j$ and $E(X_i^2) = \frac{n}{r} + \frac{n(n-1)}{r^2}$.)
    – Daniel Schepler, 1 hour ago

  • Yeah, it seemed a little strange to me initially, but it's consistent with your results btw: $\frac{1}{r}\left[(r-1)E(X_i X_j) + E(X_i^2)\right] = \frac{n^2}{r^2}$
    – Sean Lee, 1 hour ago

  • I've now expanded VHarisop's answer with my calculations for part two of the question.
    – Daniel Schepler, 49 mins ago
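As a quick sanity check, substituting the values quoted in the first comment into the identity from the second comment confirms that the two sets of results are consistent:

$$
\frac{1}{r}\left[(r-1)\,\frac{n(n-1)}{r^2} + \frac{n}{r} + \frac{n(n-1)}{r^2}\right]
= \frac{1}{r}\left[\frac{n(n-1)}{r} + \frac{n}{r}\right]
= \frac{n^2}{r^2}.
$$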



















For the first part, you can use linearity of expectation to compute $\mathbb{E}[X_i]$.
Specifically, you know that for a fixed box, the probability of putting a ball in it
is $\frac{1}{r}$. Let

$$
Y_k^{(i)} = \begin{cases}
1 &, \text{ if ball } k \text{ was placed in box } i \\
0 &, \text{ otherwise,}
\end{cases}
$$

which satisfies $\mathbb{E}[Y_k^{(i)}] = \mathbb{P}(Y_k^{(i)} = 1) = \frac{1}{r}$.
Then you can write

$$
X_i = \sum_{j=1}^n Y_j^{(i)} \Rightarrow \mathbb{E}[X_i] = \sum_{j=1}^n \frac{1}{r} = \frac{n}{r}.
$$

For the second part, you can proceed similarly: $X_i = \sum_{k=1}^n Y_k^{(i)}$ and $X_j = \sum_{\ell=1}^n Y_\ell^{(j)}$, so
$$
X_i X_j = \sum_{k=1}^n \sum_{\ell=1}^n Y_k^{(i)} Y_\ell^{(j)} \implies
\mathbb{E}(X_i X_j) = \sum_{k=1}^n \sum_{\ell=1}^n \mathbb{E}\big(Y_k^{(i)} Y_\ell^{(j)}\big).
$$

We will first treat the case where $i \ne j$. Then, for each term in the sum such that $k = \ell$, we must have $Y_k^{(i)} Y_\ell^{(j)} = Y_k^{(i)} Y_k^{(j)} = 0$, since it is impossible for ball $k$ to be placed both in box $i$ and in box $j$. On the other hand, if $k \ne \ell$, then the events corresponding to $Y_k^{(i)}$ and $Y_\ell^{(j)}$ are independent, since the placements of balls $k$ and $\ell$ are independent, which implies that $Y_k^{(i)}$ and $Y_\ell^{(j)}$ are independent random variables. Therefore, in this case,
$$\mathbb{E}\big(Y_k^{(i)} Y_\ell^{(j)}\big) = \mathbb{E}\big(Y_k^{(i)}\big)\, \mathbb{E}\big(Y_\ell^{(j)}\big) = \frac{1}{r} \cdot \frac{1}{r}.$$
In summary, if $i \ne j$, then
$$\mathbb{E}(X_i X_j) = \sum_{k=1}^n \sum_{\ell=1}^n \delta_{k \ne \ell} \cdot \frac{1}{r^2} = \frac{n(n-1)}{r^2},$$
where $\delta_{k \ne \ell}$ denotes the indicator which is $1$ when $k \ne \ell$ and $0$ when $k = \ell$.

For the case $i = j$, I will leave the similar computation of $\mathbb{E}(X_i^2)$ to you, with just the hint that the difference lies in the value of $\mathbb{E}\big(Y_k^{(i)} Y_\ell^{(j)}\big)$ for the case $k = \ell$.

answered 6 hours ago by VHarisop, edited 53 mins ago by Daniel Schepler
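A minimal Monte Carlo sketch of this setup lets the empirical averages be compared with the formulas above; the values of `n`, `r`, and the number of trials below are arbitrary illustrative choices:

    import random

    def simulate(n, r, trials=200_000, seed=0):
        """Estimate E[X_i], E[X_i^2], and E[X_i X_j] (i != j) by dropping
        n balls independently and uniformly at random into r boxes."""
        rng = random.Random(seed)
        s_xi = s_xi2 = s_xixj = 0.0
        for _ in range(trials):
            counts = [0] * r
            for _ in range(n):
                counts[rng.randrange(r)] += 1   # each ball picks a box uniformly
            s_xi += counts[0]                   # X_1
            s_xi2 += counts[0] ** 2             # X_1^2
            s_xixj += counts[0] * counts[1]     # X_1 * X_2, a pair with i != j
        return s_xi / trials, s_xi2 / trials, s_xixj / trials

    n, r = 5, 3  # arbitrary example sizes (r >= 2 so that an i != j pair exists)
    e_xi, e_xi2, e_xixj = simulate(n, r)
    print(f"E[X_i]     ~ {e_xi:.4f},   n/r              = {n / r:.4f}")
    print(f"E[X_i^2]   ~ {e_xi2:.4f},   n/r + n(n-1)/r^2 = {n / r + n * (n - 1) / r ** 2:.4f}")
    print(f"E[X_i X_j] ~ {e_xixj:.4f},   n(n-1)/r^2       = {n * (n - 1) / r ** 2:.4f}")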






  • I decided to add my solution to part two, using your notation, to your answer to avoid having an answer split between your part and a part I would post separately. Feel free to edit it more to your liking, or even revert the addition if you prefer.
    – Daniel Schepler, 51 mins ago



















Think of placing the ball in box "$i$" as a success and not placing it as a failure.

This situation can be represented using the hypergeometric distribution,
$$
P(X = k) = \frac{\binom{K}{k}\binom{N-K}{n-k}}{\binom{N}{n}},
$$
where

$N$ is the population size (the number of boxes, $r$),

$K$ is the number of success states in the population (just $1$, because success is defined as placing the ball in box "$i$"),

$n$ is the number of draws (the number of balls, $n$), and

$k$ is the number of observed successes (the number of balls in box "$i$").

The expectation of the hypergeometric distribution is $n\frac{K}{N}$, hence the mean of your variable is
$$E[X_i] = n\cdot\frac{1}{r} = \frac{n}{r}.$$

answered 6 hours ago by RScrlli





