### Learning Goals

To gain more experience with linearity of expectation and when it is useful, and to understand how covariance and correlation, two new concepts introduced today, can help measure how one random variable varies with another.


### Concept Check

Q: Should that say “define X such that”?

A1:  That's a good idea :)

Q: The summations would iterate over x and y in the order a nested for loop would, right?

A1:  Absolutely. Thinking of it as a nested loop is perfect (and very helpful)
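
To make the nested-loop picture concrete, here is a minimal Python sketch that computes E[XY] over a small joint PMF. The PMF values are made up purely for illustration:

```python
# Hypothetical joint PMF over small supports of X and Y,
# stored as {(x, y): P(X=x, Y=y)} -- the values are made up.
joint_pmf = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

# The double summation E[XY] = sum over x, sum over y of x*y*P(X=x, Y=y),
# iterated exactly like a nested for loop.
e_xy = 0.0
for x in (0, 1):          # outer loop over values of X
    for y in (0, 1):      # inner loop over values of Y
        e_xy += x * y * joint_pmf[(x, y)]

print(e_xy)  # 0.4, since only the (1, 1) term contributes: 1 * 1 * 0.4
```

The outer sum fixes x while the inner sum runs through every y, visiting each (x, y) pair exactly once.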

Q: Is the question supposed to be how many different kinds of coupons do you expect after buying n boxes? because doesn’t each box come with a coupon?

A1:  That's one question you could ask. Another is: how many boxes do you expect to buy before you have collected them all!

Q: What does “Indicator for Ai” mean?

A1:  It's a Bernoulli random variable that is 1 if the event A_i occurs (and 0 otherwise). Indicators turn events into 1s and 0s
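
A quick sketch of the idea, using coin flips as a stand-in for the events A_i (the events and probabilities here are made up for illustration):

```python
import random

random.seed(1)  # fixed seed so the estimate is reproducible

# Indicator for event A_i: 1 if the i-th flip is heads, else 0.
# E[X_i] = P(A_i) = 0.5, so by linearity E[X_1 + ... + X_n] = n/2.
n = 10_000
flips = [random.random() < 0.5 for _ in range(n)]
indicators = [1 if heads else 0 for heads in flips]

# The average of the indicators estimates P(A_i).
print(sum(indicators) / n)  # close to 0.5
```

The key property is E[indicator of A_i] = P(A_i), which is what makes indicators so useful with linearity of expectation.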

Q: why do we say "success after the (i-1)th success" instead of just "the ith success"?

A1:  To be clear that we are counting trials starting after the (i-1)th success, not from zero. But your point is well taken: understood in context, those are the same

Q: What is the ith success in this problem? I just got kicked off the internet for the last 10 minutes.

A1:  It's the ith time you hashed to a bucket that was previously empty! Welcome back. Sorry about the internet

Q: So i = 1 would correspond to a bucket going from empty to having something?

A1:  i = 1 would be the first time that a bucket went from empty to having something. i=2 would be the second time…

Q: Is Cov(X,Y) = Cov(Y,X)?

A1:  yes!

Q: Correlation and covariance are different right?

A1:  Yes! The relationship between them is analogous to that between variance and standard deviation

Q: So in practice how is a positive covariance different than a positive correlation? Could you draw roughly the same conclusions knowing either is true?

A1:  Positive correlation is simply positive covariance after scaling: correlation is covariance divided by the product of the two standard deviations, which makes it easy to compare across contexts
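
Concretely, correlation is covariance divided by the product of the standard deviations, so the two always share a sign. A minimal sketch with a made-up paired sample:

```python
from statistics import fmean, pstdev

# Small made-up paired sample, just for illustration.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.0, 1.0, 4.0, 3.0, 6.0]

mx, my = fmean(xs), fmean(ys)

# Population-style covariance: mean of (X - E[X]) * (Y - E[Y]).
cov = fmean([(x - mx) * (y - my) for x, y in zip(xs, ys)])

# Correlation rescales covariance by both standard deviations,
# making it unit-free and bounded between -1 and 1.
corr = cov / (pstdev(xs) * pstdev(ys))

print(cov, corr)  # covariance is in (units of X) * (units of Y); corr is unitless
```

Doubling every x would double the covariance but leave the correlation unchanged, which is exactly why correlation is easier to compare across contexts.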

Q: So covariance isn’t bounded but correlation is between 0 and 1 right?

A1:  Almost! Covariance is indeed unbounded, but correlation lies between -1 and 1, not 0 and 1

Q: does this mean that Cov(aX, bY) = ab Cov(X, Y)?

A1:  That is right :). Here is a nice proof: http://www.stat.ucla.edu/~nchristo/introeconometrics/introecon_covariance_correlation.pdf
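
You can also check the identity Cov(aX, bY) = ab Cov(X, Y) numerically. A sketch with a made-up sample and population-style covariance:

```python
from statistics import fmean

def cov(us, vs):
    """Population-style covariance: mean of (U - E[U]) * (V - E[V])."""
    mu, mv = fmean(us), fmean(vs)
    return fmean([(u - mu) * (v - mv) for u, v in zip(us, vs)])

# Made-up sample and scaling constants, just for illustration.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 2.0, 5.0]
a, b = 2.0, -3.0

lhs = cov([a * x for x in xs], [b * y for y in ys])  # Cov(aX, bY)
rhs = a * b * cov(xs, ys)                            # ab * Cov(X, Y)
print(lhs, rhs)  # the two agree
```

Intuitively, scaling X by a scales each deviation from the mean by a, and likewise for Y, so their product (and hence the covariance) picks up a factor of ab.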

Q: is there any intuition as to why covariance = 0 and independence aren't the same?

A1:  Independence does imply covariance = 0, but the converse fails: a covariance of 0 could just be the positive and negative co-occurrences cancelling each other out
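
The standard counterexample makes this cancellation explicit. Take X uniform on {-1, 0, 1} and Y = X², so Y is a deterministic function of X (clearly not independent), yet the covariance is exactly zero:

```python
from statistics import fmean

# X uniform on {-1, 0, 1}, Y = X^2. Y is completely determined by X,
# yet positive and negative co-occurrences cancel exactly.
support = [-1, 0, 1]                        # each value with probability 1/3
e_x  = fmean(support)                       # E[X]  = 0
e_y  = fmean([x * x for x in support])      # E[Y]  = 2/3
e_xy = fmean([x * x * x for x in support])  # E[XY] = E[X^3] = 0

cov = e_xy - e_x * e_y
print(cov)  # 0.0, despite X and Y being dependent
```

Knowing Y = 1 tells you X is ±1, so the variables are dependent; covariance just cannot see this symmetric, nonlinear relationship.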