Lecture Materials


Learning Goals

By the end of Friday's lecture, you should understand what the Poisson distribution is and why it's often used to model the number of rare event occurrences during a fixed period of time. You should also understand the Geometric and Negative Binomial distributions and how they're used when we're interested in the number of Bernoulli trials needed to achieve a particular number of successes.

Reading

Poisson

Concept Check

https://www.gradescope.com/courses/226051/assignments/976438

Questions & Answers


Q: when does pset 3 come out?

A1:  live answered


Q: I got an email from Gradescope that my PSET 1 was graded, but on gradescope it says it has not been graded. Should I just sit tight?

A1:  live answered


Q: nice haircut!

A1:  live answered


Q: why did we raise it to the power of k?

A1:  live answered


Q: what are the application requirements?

A1:  Here are some helpful links for the CS198 program that Ella presented about:
About: https://cs198.stanford.edu/cs198/ProgramStructure.aspx
How to Apply: https://cs198.stanford.edu/cs198/Apply.aspx


Q: but when k is 2 then it assumes we get 10 requests not 2?

A1:  When k = 2 it assumes that we get 2 requests, and 10 of the slots do not get requests!


Q: so then is it (5/60)^2? I guess why do we keep the 5 requests when we only have two requests?

A1:  5/60 is the probability that any individual interval is assigned a request. Since we historically see 5 requests per minute, that is our expectation. However, we can still ask what the probability is that we see fewer than 5 in a minute, e.g. P(X = 2). That's equivalent to asking, "What's the probability of getting two requests in a minute, given that our lambda parameter has historically been 5?"
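
A minimal Python sketch of that idea, assuming the minute is split into 60 one-second slots to match the 5/60 above (the slot count and numbers here are illustrative, not taken from the slides):

    from math import comb, exp, factorial

    lam = 5        # historical average: 5 requests per minute
    n = 60         # assumed discretization: 60 one-second slots in the minute
    p = lam / n    # probability that any single slot gets a request
    k = 2          # question: what is P(exactly 2 requests this minute)?

    # Binomial view: exactly k of the n slots get a request, the other n - k do not
    p_binomial = comb(n, k) * p**k * (1 - p)**(n - k)

    # Poisson view: the limit of that Binomial as the slots get infinitely fine
    p_poisson = exp(-lam) * lam**k / factorial(k)

    print(f"Binomial(60, 5/60): P(X = 2) = {p_binomial:.4f}")
    print(f"Poisson(5):         P(X = 2) = {p_poisson:.4f}")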


Q: what do you mean by rare events?

A1:  See slide 11 (# earthquakes per year, # server hits per second, # emails per day)


Q: Does this break down when it’s not rare events? So a really large number of average riders per minute

A1:  The most important requirement is that events occur at a constant average rate lambda. Turns out the Poisson is pretty robust :)


Q: Is the screen frozen?

A1:  live answered


Q: since bernoulli / geometric random variables are just special cases of binomial / negative binomial, what is the motivation for introducing them independently?

A1:  Correct, Bernoulli and Geometric are special cases of the Binomial and Negative Binomial, respectively. The former distributions require fewer parameters, and there is also an interesting relationship where combining multiple random variables of the former kinds gives you the latter kinds.

A2:  i.e., the sum of independent Bernoulli random variables is distributed as a Binomial (see the simulation sketch below). This will be an interesting fact when we learn later about combining random variables.

A3:  But bear in mind that if the Bernoulli random variables are NOT independent, then their sum (which is itself a random variable) will not necessarily be distributed as a Binomial.
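
A quick simulation sketch of that fact (the n, p, and trial count below are arbitrary illustrative values): summing n independent Bernoulli(p) variables produces counts that match the Binomial(n, p) PMF.

    import random
    from math import comb

    n, p, trials = 10, 0.3, 100_000   # illustrative parameters

    counts = [0] * (n + 1)
    for _ in range(trials):
        # each Bernoulli(p) is 1 with probability p, 0 otherwise; the sum counts successes
        successes = sum(1 for _ in range(n) if random.random() < p)
        counts[successes] += 1

    for k in range(n + 1):
        simulated = counts[k] / trials
        binomial_pmf = comb(n, k) * p**k * (1 - p)**(n - k)
        print(f"k={k}: simulated {simulated:.4f} vs Binomial PMF {binomial_pmf:.4f}")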


Q: Why would you use the approximation when the poisson is already fairly easy to use?

A1:  The Poisson is a good approximation for the Binomial in certain cases, not the other way around! :) For example, for X ~ Bin(n, p) where n is large and p is really small, we can use the Poisson to approximate it (see the sketch below).
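
As a rough illustration (the n and p below are made-up values, not from lecture), the two PMFs line up closely once n is large and p is small, using lambda = n*p:

    from math import comb, exp, factorial

    n, p = 10_000, 0.0003   # large n, small p
    lam = n * p             # lambda = 3

    for k in range(6):
        binomial_pmf = comb(n, k) * p**k * (1 - p)**(n - k)
        poisson_pmf = exp(-lam) * lam**k / factorial(k)
        print(f"k={k}: Binomial {binomial_pmf:.5f}  vs  Poisson {poisson_pmf:.5f}")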


Q: what does “not entirely independent” mean? that they are dependent?

A1:  live answered

A2:  Yes. Knowing the result of one trial gives you a little more confidence about what the next trial will be. For example, if p = 0.5, knowing the outcome of one trial might bump another trial's probability to 0.51.

A3:  Likewise, it might bump it lower, to 0.49. As long as knowing one trial's outcome shifts the probability of another, we can't say they are strictly independent.


Q: is lambda always equal to np?

A1:  It is always equal if we are trying to model a Binomial with a Poisson.

A2:  We are using n*p as the lambda of the corresponding Poisson distribution.


Q: Is the variance always lambda for poisson or just when we approximate the binomial?

A1:  The variance of a Poisson is always lambda (and so is its mean); see the sketch below.
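
A tiny simulation sketch (the lambda and sample size here are arbitrary) showing that both the sample mean and the sample variance of Poisson draws land near lambda:

    import numpy as np

    lam = 5.0
    samples = np.random.poisson(lam, size=100_000)

    # For a Poisson(lambda), both E[X] and Var(X) equal lambda
    print(f"sample mean:     {samples.mean():.3f}")
    print(f"sample variance: {samples.var():.3f}")
    print(f"lambda:          {lam}")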


Q: how do you see if a probability distribution is poissonian?

A1:  Good question - to restate, how do you predict what the right distribution is just from data points? We'll learn this in the latter part of the course :)