Summary
This chapter introduced information theory. Although you are less likely to use the calculations demonstrated here directly than the material in other chapters, the concepts behind information theory can be invaluable. Information-theoretic concepts give us a different way to think about probability, distributions, and what is conveyed when we observe a piece of data. Those concepts are as follows:
- Information theory concerns itself with the communication of signals and the efficiency of encoding those signals.
- The smaller the probability of an event or outcome occurring, the higher the information associated with that event or outcome.
- We measure information on a logarithmic scale.
- The expected information tells us the average amount of information we get from an observation of a random variable. The expected information is more commonly known as entropy (the sketch after this list computes it for a simple distribution).
- Entropy increases with the variance of a distribution, so it quantifies the uncertainty of that distribution: the more spread out the possible outcomes, the less predictable any single observation.
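To make these definitions concrete, here is a minimal Python sketch. The function names `information` and `entropy` are illustrative, and it assumes base-2 logarithms so that information is measured in bits; the chapter's own examples may use a different base.

```python
import math

def information(p):
    """Self-information of an outcome with probability p, in bits: -log2(p)."""
    return -math.log2(p)

def entropy(probs):
    """Entropy of a discrete distribution: the expected self-information.
    Outcomes with zero probability contribute nothing, so they are skipped."""
    return sum(p * information(p) for p in probs if p > 0)

# A fair coin: each outcome has probability 0.5 and carries 1 bit.
print(information(0.5))      # 1.0
print(entropy([0.5, 0.5]))   # 1.0 bit, the maximum for two outcomes

# A biased coin is more predictable, so its entropy is lower.
print(entropy([0.9, 0.1]))   # about 0.469 bits
```

The biased-coin example illustrates the last point above: because one outcome dominates, observations are less surprising on average, and the entropy falls well below the 1 bit of a fair coin.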