Feb 8th, 2021
Understand how to compute a basic convolution to find the probability distribution of the sum of two random variables, and understand why the sum of Binomial RVs is also a Binomial (provided the probability of success is the same) and the sum of Poisson RVs is itself a Poisson.
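Both facts can be checked numerically with a short, stdlib-only sketch (the parameters n, m, p, λ1, λ2 below are arbitrary choices for illustration): convolving two same-p Binomial PMFs reproduces Bin(n + m, p), and convolving two Poisson PMFs reproduces Poi(λ1 + λ2).

```python
import math

def binom_pmf(k, n, p):
    # P(X = k) for X ~ Bin(n, p)
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    # P(X = k) for X ~ Poi(lam)
    return math.exp(-lam) * lam**k / math.factorial(k)

def convolve(pmf_x, pmf_y, k, lo, hi):
    # P(X + Y = k) = sum_i P(X = i) * P(Y = k - i), for independent X and Y
    return sum(pmf_x(i) * pmf_y(k - i) for i in range(lo, hi + 1))

# Sum of Bin(n, p) and Bin(m, p) matches Bin(n + m, p)
n, m, p = 5, 7, 0.3
for k in range(n + m + 1):
    conv = convolve(lambda i: binom_pmf(i, n, p),
                    lambda j: binom_pmf(j, m, p),
                    k, max(0, k - m), min(n, k))
    assert abs(conv - binom_pmf(k, n + m, p)) < 1e-12

# Sum of Poi(l1) and Poi(l2) matches Poi(l1 + l2)
l1, l2 = 2.0, 3.0
for k in range(15):
    conv = convolve(lambda i: poisson_pmf(i, l1),
                    lambda j: poisson_pmf(j, l2),
                    k, 0, k)
    assert abs(conv - poisson_pmf(k, l1 + l2)) < 1e-12

print("both closure checks passed")
```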
Q: Were the mean/median quiz scores in Jerry’s email the scores out of 90 or as a percentage?
A1: The median grade of 77.5 was out of 90.
Q: Why do we divide P(H|D) by P(M|D)?
A1: It allows us to figure out who wrote the document without having to compute P(D) or that big multinomial term. If the ratio is > 1, Hamilton wrote it.
Q: Does this mean that if we have priors on who we think wrote it (P(M) ≠ P(H)), then that actually affects the probabilities in the end?
A1: Yes! Prior beliefs can be controversial!
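A minimal sketch of how the log-ratio computation works, and how a prior shifts it. All numbers here (the per-word probabilities, the word counts, the words themselves) are hypothetical, chosen only to illustrate the mechanics:

```python
import math

# Hypothetical per-word probabilities and counts (illustrative numbers only)
p_H = {"upon": 0.012, "whilst": 0.001}   # word probs if Hamilton is the author
p_M = {"upon": 0.002, "whilst": 0.008}   # word probs if Madison is the author
doc = {"upon": 3, "whilst": 5}           # counts of each word in the disputed doc

def log_ratio(prior_H=0.5, prior_M=0.5):
    # log[P(H|D) / P(M|D)]: P(D) and the multinomial coefficient cancel in
    # the ratio, leaving the word-probability terms plus the log prior ratio.
    like = sum(c * (math.log(p_H[w]) - math.log(p_M[w])) for w, c in doc.items())
    return like + math.log(prior_H) - math.log(prior_M)

print(log_ratio())            # uniform priors: negative, so "Madison"
print(log_ratio(0.99, 0.01))  # a strong prior toward Hamilton raises the ratio
```

Positive output means Hamilton, negative means Madison, and a non-uniform prior enters as a simple additive term.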
Q: Why don’t you account for draws in question 1 part c on the quiz? Chris can win 10 to 8 by winning 2 times, drawing 6 times, and losing 1 time.
A1: let’s discuss after lecture :)
Q: What would the answer have looked like if Hamilton had written it? I’m a little confused on how the -582 tells us Madison wrote it
A1: Great question. It would have been a positive number. If Hamilton wrote it, the input to the log would have been a number greater than one (recall that the input was the ratio of the probability that Hamilton wrote it over the probability that Madison wrote it). If the input to a log is greater than 1 (Hamilton is the author), then the output of the log is positive. Here it was very negative, which points strongly to Madison.
Q: What if the p's are different?
A1: You can still compute the PMF of the sum — it's just not the case that the sum is going to be a Binomial.
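A quick sketch of that case, with arbitrarily chosen parameters: convolving two independent Binomials with different p still yields a valid PMF for the sum, but that PMF is not a Binomial.

```python
import math

def binom_pmf(k, n, p):
    # P(X = k) for X ~ Bin(n, p)
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def pmf_sum(k, n, p1, m, p2):
    # Convolution: P(X + Y = k) for independent X ~ Bin(n, p1), Y ~ Bin(m, p2)
    return sum(binom_pmf(i, n, p1) * binom_pmf(k - i, m, p2)
               for i in range(max(0, k - m), min(n, k) + 1))

n, p1, m, p2 = 4, 0.2, 6, 0.7   # different success probabilities
pmf = [pmf_sum(k, n, p1, m, p2) for k in range(n + m + 1)]

assert abs(sum(pmf) - 1.0) < 1e-12   # still a valid PMF
# ...but it does not equal Bin(10, 0.2), for example:
assert any(abs(pmf[k] - binom_pmf(k, 10, 0.2)) > 1e-6 for k in range(11))
print("valid PMF, but not Binomial")
```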
Q: Convolution is only when they are independent?
A1: No, convolution is a fancy way of saying "addition" of random variables. The sum is defined whether they are independent or not (though the simple product-form formula for the sum's PMF does assume independence).
Q: Do they have to have the same support?
A1: Not necessarily.
Q: Is there a threshold on how closely the independence equation has to hold for something to be considered independent?
A1: Mathematically? It has to be exact. But in practice, you can compute your belief that they are independent (and that the discrepancy is a result of noise) or, more commonly, if it's pretty close you assume they are independent. Your probabilistic model will be slightly wrong, but that likely won't have much impact on any final answers it leads you to.
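A tiny sketch of that check, using a made-up joint distribution over two binary variables (the numbers are hypothetical): compute the marginals, then look at the largest gap between P(x, y) and P(x)P(y).

```python
# Hypothetical joint PMF over two binary variables (illustrative numbers only)
joint = {(0, 0): 0.25, (0, 1): 0.15, (1, 0): 0.35, (1, 1): 0.25}

# Marginals P(X = x) and P(Y = y)
p_x = {x: sum(p for (a, b), p in joint.items() if a == x) for x in (0, 1)}
p_y = {y: sum(p for (a, b), p in joint.items() if b == y) for y in (0, 1)}

# Independence requires P(x, y) == P(x) * P(y) exactly; in practice we
# tolerate a small gap and attribute it to noise.
max_gap = max(abs(p - p_x[x] * p_y[y]) for (x, y), p in joint.items())
print(f"largest deviation from independence: {max_gap:.4f}")
```

Here the gap is small (0.01), so treating X and Y as independent would only make the model slightly wrong.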
Q: when, more specifically, is n large enough and p small enough that we can approximate with poisson?
A1: There are lots of details here: https://chrispiech.github.io/probabilityForComputerScientists/en/part2/binomial_approx/
Rule of thumb: the Poisson approximation is good when n > 20 and p < 0.05.
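That rule of thumb can be checked directly. Below, n = 100 and p = 0.02 (arbitrary values satisfying n > 20, p < 0.05), and we measure the largest per-value gap between the exact Binomial PMF and the Poisson(λ = np) approximation:

```python
import math

def binom_pmf(k, n, p):
    # Exact P(X = k) for X ~ Bin(n, p)
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    # P(X = k) for X ~ Poi(lam)
    return math.exp(-lam) * lam**k / math.factorial(k)

n, p = 100, 0.02        # n > 20 and p < 0.05, so the approximation should be good
lam = n * p             # matching rate: lambda = n * p = 2

worst = max(abs(binom_pmf(k, n, p) - poisson_pmf(k, lam)) for k in range(n + 1))
print(f"largest per-value PMF gap: {worst:.5f}")
```

The worst-case gap comes out well under 0.01, so the approximation is quite tight in this regime.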
Q: how could n + m not be equal to N here?
A1: Great question. You are right, that probability is 0
Q: Is it the first factor in the second addend that is 0, or the second factor? Or are they both 0?
A1: It's the second factor, P(N = n + m), which is 0 whenever n + m ≠ N.
Q: why isn’t P(N = n + m) = 1?
A1: live answered
Q: It seems like regrade requests on the quiz are disabled?
A1: coming soon!