Bayesian classification
Probabilistic classification is a practical way to draw inferences from data, using statistical inference to find the most likely class for a given value. Given the probability distribution over the classes, we simply select the class with the highest probability. Bayes' theorem is the basic rule for drawing such inferences: it allows us to update the probability of an event in light of new data or observations. In other words, it lets us update the prior probability P(A) to the posterior probability P(A|B). The prior probability is the probability assigned to the event before the data is evaluated, and the posterior probability is the one assigned after the data is taken into account. The following expression represents Bayes' theorem:

P(A|B) = P(B|A) P(A) / P(B)

where P(B|A) is the likelihood of the observation B given A, and P(B) is the overall probability of observing B.
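As a quick illustration of the update from prior to posterior, the short sketch below plugs illustrative numbers (not taken from the text) into the expression above:

import numpy as np  # not strictly needed here, but handy for larger examples

# Illustrative, assumed values: P(A) is the prior, P(B|A) the likelihood of the
# observation given A, and P(B) the overall probability of the observation.
p_a = 0.01          # prior P(A)
p_b_given_a = 0.90  # likelihood P(B|A)
p_b = 0.05          # evidence P(B)

# Posterior P(A|B) = P(B|A) * P(A) / P(B)
p_a_given_b = p_b_given_a * p_a / p_b
print(p_a_given_b)  # 0.18: the prior of 0.01 is revised upward by the observation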
Naïve Bayes algorithm
Naïve Bayes is the simplest classification algorithm among the Bayesian classification methods. In this algorithm, we only need to learn the per-attribute probabilities, under the assumption that the attributes are conditionally independent of one another given the class; this "naïve" assumption is what gives the algorithm its name, and it lets us compute the class posterior as the prior multiplied by the product of the individual attribute likelihoods.
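The following is a minimal sketch of a Gaussian Naïve Bayes classifier, assuming numeric attributes; the class and variable names are illustrative and not taken from the text. It shows how the independence assumption reduces the joint likelihood to a product (a sum in log space) of per-attribute densities:

import numpy as np

class SimpleGaussianNB:
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.priors_ = {}   # P(class)
        self.means_ = {}    # per-class attribute means
        self.vars_ = {}     # per-class attribute variances
        for c in self.classes_:
            Xc = X[y == c]
            self.priors_[c] = len(Xc) / len(X)
            self.means_[c] = Xc.mean(axis=0)
            self.vars_[c] = Xc.var(axis=0) + 1e-9  # avoid division by zero
        return self

    def predict(self, X):
        preds = []
        for x in X:
            log_posteriors = {}
            for c in self.classes_:
                # Naive assumption: attributes are independent given the class,
                # so the joint likelihood is the product of per-attribute
                # Gaussian densities (a sum of log densities).
                log_lik = -0.5 * np.sum(
                    np.log(2 * np.pi * self.vars_[c])
                    + (x - self.means_[c]) ** 2 / self.vars_[c]
                )
                log_posteriors[c] = np.log(self.priors_[c]) + log_lik
            # Select the class with the highest (unnormalized) posterior.
            preds.append(max(log_posteriors, key=log_posteriors.get))
        return np.array(preds)

# Toy usage on two clearly separated clusters.
X = np.array([[1.0, 2.0], [1.2, 1.8], [6.0, 9.0], [5.8, 9.2]])
y = np.array([0, 0, 1, 1])
model = SimpleGaussianNB().fit(X, y)
print(model.predict(np.array([[1.1, 2.1], [6.1, 9.1]])))  # expected: [0 1]

In practice, library implementations (for example, scikit-learn's naive_bayes module) follow the same idea, differing mainly in the distribution assumed for each attribute.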