Andrew's Monotone Chain, Conditional & Joint Probability, Bayes' Theorem

Video Link: https://www.youtube.com/watch?v=ojzwM9cvWvs



Duration: 2:19:08


In the first hour of class we covered my favorite convex hull algorithm, Andrew's monotone chain, in which you build a convex hull (a convex polygon is one in which every point inside the polygon can be seen from every other point inside it). The hull is a boundary that tightly wraps around a point field, and once you have it, you can quickly determine whether something is inside or outside that boundary using the powers of linear algebra (the 2D cross product, specifically).
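A minimal Python sketch of the monotone chain, plus the cross-product inside/outside test described above (the function names and sample points are my own, not from the lecture):

```python
def cross(o, a, b):
    """2D cross product of vectors OA and OB.
    Positive: counter-clockwise turn; negative: clockwise; zero: collinear."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def monotone_chain(points):
    """Andrew's monotone chain: hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    lower = []
    for p in pts:  # build the lower hull left to right
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()  # drop points that would make a clockwise turn
        lower.append(p)
    upper = []
    for p in reversed(pts):  # build the upper hull right to left
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    # each half ends where the other begins, so drop the duplicated endpoints
    return lower[:-1] + upper[:-1]

def inside_convex(hull, q):
    """q is inside (or on) a CCW hull iff it is left of (or on) every edge."""
    n = len(hull)
    return all(cross(hull[i], hull[(i + 1) % n], q) >= 0 for i in range(n))
```

For example, `monotone_chain([(0, 0), (1, 0), (1, 1), (0, 1), (0.5, 0.5)])` returns the four corners of the square and discards the interior point, after which `inside_convex` classifies any query point against the hull.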

We then moved on to conditional probability, in which revealing information about a random event changes the probability we assign to it. For example, if I tell you that a card I just drew is between a 2 and an 8, the odds it is a 6 go up to 1/7, and the odds it is an ace drop to 0.
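The card example can be checked by brute-force enumeration (a sketch of my own, not the lecture's code): restrict the deck to the revealed condition and count outcomes within it.

```python
from fractions import Fraction

ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
deck = [(r, s) for r in ranks for s in "SHDC"]  # 52-card deck

# Revealed information: the card's rank is between 2 and 8 inclusive.
between = [c for c in deck if c[0] in {"2", "3", "4", "5", "6", "7", "8"}]

# Conditional probabilities are just counts within the restricted sample space.
p_six = Fraction(sum(1 for c in between if c[0] == "6"), len(between))
p_ace = Fraction(sum(1 for c in between if c[0] == "A"), len(between))
print(p_six)  # 1/7
print(p_ace)  # 0
```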

We then talked about joint probability, which is the probability that two events both happen. For independent events, this is just the product of the two probabilities. For dependent events, we use the conditional probability formula: P(A and B) = P(A) * P(B | A).
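A quick sketch of both cases (the coin and ace examples are my own illustrations): independent events multiply directly, while dependent events multiply the first probability by the conditional probability of the second.

```python
from fractions import Fraction

# Independent: two fair coin flips both come up heads.
p_both_heads = Fraction(1, 2) * Fraction(1, 2)  # 1/4

# Dependent: drawing two aces in a row without replacement.
p_first_ace = Fraction(4, 52)                 # P(A)
p_second_ace_given_first = Fraction(3, 51)    # P(B | A): one ace is gone
p_two_aces = p_first_ace * p_second_ace_given_first
print(p_both_heads)  # 1/4
print(p_two_aces)    # 1/221
```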

We then talked about Bayes' Theorem, and how Bayesian reasoning offers a different way of looking at probability: as a degree of confidence in a hypothesis, rather than the frequentist view of probability as the long-run chance of an outcome.
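As a worked illustration of Bayes' Theorem, P(H | E) = P(E | H) * P(H) / P(E), here is the classic medical-test calculation (the disease rate and test accuracy numbers are made-up assumptions, not from the lecture):

```python
from fractions import Fraction

p_disease = Fraction(1, 1000)             # prior P(H): 1 in 1000 has the disease
p_pos_given_disease = Fraction(99, 100)   # P(E | H): test sensitivity
p_pos_given_healthy = Fraction(5, 100)    # false-positive rate

# P(E) by the law of total probability over the two hypotheses.
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Posterior confidence after observing a positive test.
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(p_disease_given_pos)  # 11/566, roughly 2%
```

The punchline is the Bayesian one: even after a positive result from a 99%-sensitive test, the confidence that you actually have the disease is only about 2%, because the prior was so low.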







Tags:
csci 26
andrew monotone chain
andrew's
convex hull
bridges