
What is “Naive” in a Naive Bayes classifier?

Naive Bayes is ‘naive’ because it assumes that the features (independent variables) of an experiment are independent of each other, which is rarely true in real-world situations. Despite this, the Naive Bayes classifier works extremely well in many real-world problems.
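To see what that independence assumption buys us in practice, here is a minimal sketch of a categorical Naive Bayes classifier in Python on a tiny made-up weather dataset (the data and feature names are purely illustrative). Because the features are treated as independent given the class, the per-feature likelihoods can simply be multiplied together:

```python
from collections import Counter, defaultdict

# Tiny made-up training set: each row is ((outlook, windy), play?).
data = [
    (("sunny", "no"), "yes"), (("sunny", "yes"), "no"),
    (("rainy", "yes"), "no"), (("overcast", "no"), "yes"),
    (("rainy", "no"), "yes"), (("sunny", "no"), "yes"),
]

# Count classes, per-feature value counts within each class, and feature vocabularies.
class_counts = Counter(label for _, label in data)
feature_counts = defaultdict(Counter)   # (feature_index, class) -> value counts
vocab = defaultdict(set)                # feature_index -> observed values
for features, label in data:
    for i, value in enumerate(features):
        feature_counts[(i, label)][value] += 1
        vocab[i].add(value)

def predict(features):
    """Pick the class with the highest (unnormalised) posterior."""
    scores = {}
    for label, n_label in class_counts.items():
        score = n_label / len(data)  # prior P(class)
        # The 'naive' step: multiply P(feature_i = value | class) for each feature,
        # as if the features were independent; add-one smoothing avoids zeros.
        for i, value in enumerate(features):
            count = feature_counts[(i, label)][value]
            score *= (count + 1) / (n_label + len(vocab[i]))
        scores[label] = score
    return max(scores, key=scores.get)

print(predict(("sunny", "no")))  # -> 'yes' on this toy data
```

Production implementations such as scikit-learn's naive_bayes module follow the same idea but sum log-probabilities instead of multiplying probabilities, to avoid numerical underflow.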

Bayes’ Theorem:

Bayes’ theorem describes conditional probability: the probability of an event, based on prior knowledge of conditions that might be related to that event.

For example, if diabetes is related to age, then, using Bayes’ theorem, a person’s age can be used to more accurately assess the probability that they have diabetes, compared to the assessment of the probability of diabetes made without knowledge of the person’s age.

“Bayes theorem states that posterior probability equals prior probability times the likelihood ratio”

 P(A|B) = \frac{P(B|A) P(A)}{P(B)}

where A and B are events.
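
Rearranged, this is exactly the statement quoted above: the posterior equals the prior P(A) multiplied by the likelihood ratio P(B|A)/P(B).

 P(A|B) = P(A) \times \frac{P(B|A)}{P(B)}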

  • We are trying to find the probability of event A, given that event B is true. Event B is also termed the evidence.
  • P(A|B) is the posterior probability, i.e. the probability of event A after the evidence is seen.
  • P(B|A) is the likelihood, i.e. the probability of observing the evidence B given that event A is true.
  • P(A) is the prior probability, i.e. the probability of event A before the evidence is seen.
  • P(B) is the probability of the evidence itself, regardless of A.
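
To make these terms concrete, here is a small worked version of the diabetes-and-age example from earlier, written as a short Python snippet. The numbers are purely hypothetical and only serve to illustrate how the formula is applied.

```python
# Hypothetical numbers for the diabetes/age example (not real statistics).
p_diabetes = 0.08                 # P(A): prior probability of having diabetes
p_over_60 = 0.20                  # P(B): probability of being over 60 (the evidence)
p_over_60_given_diabetes = 0.50   # P(B|A): likelihood of being over 60 given diabetes

# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
p_diabetes_given_over_60 = p_over_60_given_diabetes * p_diabetes / p_over_60
print(f"P(diabetes | over 60) = {p_diabetes_given_over_60:.2f}")  # prints 0.20
```

With these made-up numbers, knowing that the person is over 60 raises the estimated probability of diabetes from the 8% prior to a 20% posterior, which is exactly the “more accurate assessment” described in the example above.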