Shannon entropy, or more generally information entropy, is a central concept in information theory, the field concerned with quantifying the information carried in communication. In thermodynamics and other fields, entropy refers to the disorder or uncertainty within a system. Claude Shannon introduced information entropy in the late 1940s, and its mathematical form closely mirrors the Gibbs entropy of statistical thermodynamics, differing essentially by a constant factor.
Today, the concept of information entropy has a wide range of uses spanning security and encryption as well as machine learning and artificial intelligence. The mathematical definition of information entropy is as follows:

$$H(X) = -\sum_{i=1}^{n} P(x_i) \log_b P(x_i)$$
Here, X is a discrete random variable with possible outcomes x_1, x_2, ..., x_n, and P(x_i) is the probability of the outcome x_i. The base b in this formula can be one of several values, but is most commonly 2, in which case the entropy is measured in bits; base e (nats) and base 10 (hartleys) are also used.
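As a minimal sketch of the definition above (assuming a plain Python list of probabilities and an illustrative helper name, shannon_entropy, that does not come from the original text), the entropy can be computed directly from the formula:

```python
import math

def shannon_entropy(probabilities, base=2):
    """Shannon entropy of a discrete distribution given as a list of probabilities.

    Illustrative sketch: zero-probability terms are skipped, following the
    usual convention that 0 * log(0) is taken to be 0.
    """
    return -sum(p * math.log(p, base) for p in probabilities if p > 0)

# A fair coin is maximally uncertain for two outcomes: 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```

With base 2 (the default here), the result is expressed in bits; passing base=math.e or base=10 would give nats or hartleys instead.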