Probabilistic graphical models
Naïve Bayes qualifies as a very simple probabilistic graphical model, commonly visualized as a directed graph in which each vertex represents a prior or posterior probability and each edge represents a conditional probability.
Given two events or observations X and Y, the joint probability of X and Y is defined as p(X,Y) = p(X∩Y). If the observations X and Y are not related, an assumption known as independence, then p(X,Y) = p(X)p(Y). The conditional probability of event Y given X is defined as p(Y|X) = p(X,Y)/p(X).
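As a quick illustration, the following sketch (hypothetical data and names, not taken from the text) estimates these quantities from a small set of paired binary observations; it computes p(Y|X) = p(X,Y)/p(X) and compares p(X,Y) against p(X)p(Y) as a check for independence:

```scala
// Minimal sketch with made-up observations: estimate marginal, joint, and
// conditional probabilities by counting.
object ProbabilityExample extends App {
  // Hypothetical paired observations (x, y) of two events
  val observations = Seq(
    (true, true), (true, false), (true, true), (false, false),
    (false, true), (true, true), (false, false), (true, false)
  )
  val n = observations.size.toDouble

  // Marginal probabilities p(X) and p(Y)
  val pX = observations.count(_._1) / n
  val pY = observations.count(_._2) / n

  // Joint probability p(X,Y) = p(X ∩ Y)
  val pXY = observations.count { case (x, y) => x && y } / n

  // Conditional probability p(Y|X) = p(X,Y)/p(X)
  val pYGivenX = pXY / pX

  println(f"p(X)=$pX%.3f  p(Y)=$pY%.3f  p(X,Y)=$pXY%.3f  p(Y|X)=$pYGivenX%.3f")
  // If X and Y were independent, p(X,Y) would equal p(X)*p(Y)
  println(f"p(X)*p(Y)=${pX * pY}%.3f")
}
```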
It is obvious that conditional or joint probabilities involving a large number of variables (for instance, p(X,Y,U,V,W | A,B)) can be difficult to interpret. As a picture is worth a thousand words, researchers introduced graphical models to describe the probabilistic relationships between random variables using graphs [5:1].
There are two categories of graphs and therefore graphical models:
- Directed graphs such as Bayesian networks (see the sketch after this list)
- Undirected graphs such as conditional random fields
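The directed case can be made concrete with the Naïve Bayes graph described earlier: the class vertex carries the prior p(C), and each directed edge from the class to a feature carries the conditional probability p(feature|C). The sketch below uses hypothetical class labels, features, and probability values for illustration only; it is not the library's or the author's implementation:

```scala
// Minimal sketch of a directed graphical model (Naïve Bayes structure):
// the vertex holds the prior p(C) and each directed edge holds p(feature|C).
object DirectedGraphicalModel extends App {
  // p(C): prior over the class vertex (hypothetical values)
  val prior: Map[String, Double] = Map("spam" -> 0.3, "ham" -> 0.7)

  // One directed edge per feature, labeled with p(feature | C)
  // edges(feature)(classValue) = conditional probability (hypothetical values)
  val edges: Map[String, Map[String, Double]] = Map(
    "containsLink" -> Map("spam" -> 0.8, "ham" -> 0.1),
    "allCaps"      -> Map("spam" -> 0.6, "ham" -> 0.05)
  )

  // Posterior up to a normalization constant:
  // p(C | features) ∝ p(C) * product of p(feature | C) over observed features
  def unnormalizedPosterior(observed: Seq[String]): Map[String, Double] =
    prior.map { case (c, p) =>
      c -> observed.foldLeft(p)((acc, feature) => acc * edges(feature)(c))
    }

  println(unnormalizedPosterior(Seq("containsLink", "allCaps")))
}
```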