Conditional independence
We know from statistics that two random variables, A and B, are statistically independent if their joint probability is simply the product of their (marginal) probabilities, that is, P(A, B) = P(A)P(B). Sometimes, two variables are not statistically independent of each other to begin with, but observing a third variable, C, makes them independent of each other. In this case, we say that A and B are conditionally independent given C, and we can express this as:
P(A, B | C) = P(A | C)P(B | C)
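To make this concrete, here is a minimal sketch in NumPy (the joint distribution below is a made-up toy example): it constructs a joint over binary A, B, and C in which A and B share the common cause C, and checks numerically that P(A, B | C) = P(A | C)P(B | C) holds even though P(A, B) = P(A)P(B) does not:

```python
import numpy as np

# Toy joint distribution over binary A, B, C, constructed so that A and B
# are conditionally independent given C (all numbers are made up):
# P(a, b, c) = P(c) * P(a | c) * P(b | c)
p_c = np.array([0.3, 0.7])
p_a_given_c = np.array([[0.9, 0.1],   # rows index c, columns index a
                        [0.2, 0.8]])
p_b_given_c = np.array([[0.6, 0.4],   # rows index c, columns index b
                        [0.3, 0.7]])
joint = np.einsum('c,ca,cb->abc', p_c, p_a_given_c, p_b_given_c)

# A and B are *not* marginally independent: P(A, B) != P(A)P(B)
p_ab = joint.sum(axis=2)
print(np.allclose(p_ab, p_ab.sum(axis=1, keepdims=True) *
                        p_ab.sum(axis=0, keepdims=True)))    # False

# ...but they are conditionally independent: P(A, B | C) = P(A | C)P(B | C)
p_ab_given_c = joint / joint.sum(axis=(0, 1), keepdims=True)
print(np.allclose(p_ab_given_c,
                  p_ab_given_c.sum(axis=1, keepdims=True) *
                  p_ab_given_c.sum(axis=0, keepdims=True)))  # True
```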
For example, suppose that J represents whether a person is given a job offer at a particular company and G represents whether they are accepted into graduate school at a particular university. Both of these might depend on a variable U, the person's performance in their undergraduate degree. This can be summarized in a graph in which U is a parent of both J and G:

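As a minimal sketch of this structure (assuming NumPy; the conditional probability tables below are made-up, illustrative numbers), we can build the joint distribution implied by the graph and verify that G is informative about J only while U is unobserved:

```python
import numpy as np

# Made-up conditional probability tables for the graph above (U -> J, U -> G).
# U: undergraduate performance, J: job offer, G: graduate school acceptance.
p_u = np.array([0.6, 0.4])                 # P(U)
p_j_given_u = np.array([[0.8, 0.2],        # P(J | U=0)
                        [0.3, 0.7]])       # P(J | U=1)
p_g_given_u = np.array([[0.9, 0.1],        # P(G | U=0)
                        [0.4, 0.6]])       # P(G | U=1)

# Joint distribution P(u, j, g) implied by the graph
joint = np.einsum('u,uj,ug->ujg', p_u, p_j_given_u, p_g_given_u)

# With U unobserved, G carries information about J: P(J=1 | G=1) > P(J=1)
p_j = joint.sum(axis=(0, 2))
p_j_given_g1 = joint[:, :, 1].sum(axis=0) / joint[:, :, 1].sum()
print(p_j[1], p_j_given_g1[1])             # 0.40 versus 0.60

# Once U is observed, G adds nothing: P(J | G=1, U=u) equals P(J | U=u)
for u in (0, 1):
    p_j_given_g1_u = joint[u, :, 1] / joint[u, :, 1].sum()
    print(np.allclose(p_j_given_g1_u, p_j_given_u[u]))   # True for both values of u
```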
When we don't know U, a person's performance in their undergraduate degree, knowing that they were accepted into graduate school might increase our belief...