-
Thomas Bayes (1701–1761) was an English statistician and philosopher.
-
Bayes is known for formulating a specific case of the theorem that bears his name: Bayes' theorem.
-
In machine learning, Naïve Bayes classifiers are a family of simple "probabilistic classifiers" based on applying Bayes' theorem with strong (naïve) independence assumptions between the features.
Given k possible outcomes (classes) {C1, C2, ..., Ck} and an observed feature vector x, Bayes' theorem gives, for i = 1, 2, ..., k:

            P(Ci) P(x | Ci)
P(Ci | x) = ---------------
                 P(x)
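The formula above can be sketched numerically. This is a minimal illustration, not a full Naïve Bayes implementation: the class names, priors P(Ci), and likelihoods P(x | Ci) below are made-up toy values for a single observation x.

```python
# Hypothetical priors P(Ci) and per-class likelihoods P(x | Ci) for one observation x.
priors = {"C1": 0.5, "C2": 0.3, "C3": 0.2}
likelihoods = {"C1": 0.10, "C2": 0.40, "C3": 0.25}

# Evidence P(x) = sum over i of P(Ci) * P(x | Ci)  (law of total probability)
evidence = sum(priors[c] * likelihoods[c] for c in priors)

# Posterior P(Ci | x) = P(Ci) * P(x | Ci) / P(x)
posteriors = {c: priors[c] * likelihoods[c] / evidence for c in priors}

# The classifier picks the class with the highest posterior probability.
best = max(posteriors, key=posteriors.get)
print(posteriors, best)  # best is "C2" for these toy numbers
```

Note that the posteriors always sum to 1, since P(x) is exactly the normalizing constant.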
-
P(A | B)
stands for "the conditional probability of A given B", or "the probability of A under the condition B", i.e. the probability of some event A under the assumption that event B took place.
-
P(A | B) = P (A and B) / P(B)
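This definition can be checked on a small worked example. The events below (a fair six-sided die, A = "roll is even", B = "roll is greater than 3") are an illustration chosen here, not taken from the notes.

```python
from fractions import Fraction

# Sample space of a fair six-sided die.
outcomes = range(1, 7)
A = {n for n in outcomes if n % 2 == 0}   # A = "roll is even"  -> {2, 4, 6}
B = {n for n in outcomes if n > 3}        # B = "roll > 3"      -> {4, 5, 6}

p_B = Fraction(len(B), 6)                 # P(B) = 3/6 = 1/2
p_A_and_B = Fraction(len(A & B), 6)       # A and B = {4, 6} -> P = 2/6 = 1/3

# P(A | B) = P(A and B) / P(B)
p_A_given_B = p_A_and_B / p_B
print(p_A_given_B)                        # 2/3
```

Knowing the roll exceeded 3 raises the chance of an even roll from 1/2 to 2/3, which is exactly the "extra information" effect described below.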
-
P(B) cannot be zero: the conditional probability P(A | B) is only defined when B has nonzero probability, i.e. when B can actually occur.
-
Conditional probability answers the question: how does the probability of an event change when we have extra information?
-
Independent Events: Events A and B are independent whenever
P(A | B) = P(A)
-
Equivalently, events A and B are independent whenever
P(B | A) = P(B)
-
When two events A and B are independent, we can use the multiplication rule for independent events:
P(A and B) = P(A) x P(B)
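The multiplication rule can be verified on the same fair-die sample space. The pair of events here (A = "roll is even", C = "roll is at most 2") is a chosen illustration of two events that happen to be independent.

```python
from fractions import Fraction

# Sample space of a fair six-sided die.
outcomes = range(1, 7)
A = {n for n in outcomes if n % 2 == 0}   # A = "roll is even"     -> {2, 4, 6}
C = {n for n in outcomes if n <= 2}       # C = "roll is at most 2" -> {1, 2}

p_A = Fraction(len(A), 6)                 # P(A) = 1/2
p_C = Fraction(len(C), 6)                 # P(C) = 1/3
p_A_and_C = Fraction(len(A & C), 6)       # A and C = {2} -> P = 1/6

# Independence: P(A and C) equals P(A) * P(C)
print(p_A_and_C == p_A * p_C)             # True
```

Equivalently, P(A | C) = (1/6) / (1/3) = 1/2 = P(A): learning that the roll was at most 2 does not change the probability that it was even.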
-
Lesson slides: Understand conditional probability using scenarios -- 8 slides
-
Conditional Probability, Independence, Bayes’ Theorem -- MIT slides
-
An Introduction to Conditional Probability: 12 minutes video
-
Naive Bayes Classifier - Stanford University Course: 10 minutes
-
Introduction to Naive Bayes Theorem, ML Classification: 10 minutes
-
Multi-Class Text Classification with PySpark