Entropy as expected information
For our “Guess Who” game, there are several attributes that a character can have. Here is the complete set of attributes we’ll consider for this purpose:
- Has a hat
- Has a stethoscope
- Has black hair
- Has a computer
- Has a chemical flask
From what we’ve learned about information theory so far, if I tell you my chosen character has a computer, you can narrow the possibilities down to a single character. This might suggest that asking whether my chosen character has a computer is the most efficient question you can ask me. This isn’t quite true: there is no guarantee that my chosen character actually has a computer. To identify the best question to ask, we should instead look at the expected information you’ll get by asking a question.
How do we calculate the expected amount of information you get from asking a single question? We’ll use a character having a stethoscope...
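To make the idea concrete, here is a minimal sketch of that calculation. The expected information from a yes/no question is a weighted average: each answer contributes its information content, log2(1/p), weighted by the probability p of getting that answer (this is the binary entropy of the question). The board size and attribute counts below are illustrative assumptions, not taken from the game described above.

```python
import math

def expected_information(p_yes: float) -> float:
    """Expected information (in bits) from a yes/no question that is
    answered 'yes' with probability p_yes. Each answer contributes
    p * log2(1/p); impossible answers (p = 0) contribute nothing."""
    bits = 0.0
    for p in (p_yes, 1.0 - p_yes):
        if p > 0:
            bits += p * math.log2(1.0 / p)
    return bits

# Hypothetical board: suppose 1 of 24 characters has a computer,
# while 12 of 24 have black hair.
print(expected_information(1 / 24))   # rare attribute: low expected information
print(expected_information(12 / 24))  # even split: 1 bit, the maximum possible
```

Note the asymmetry this exposes: a “yes” to the computer question is very informative, but it is also very unlikely, so the question’s expected information is small. A question that splits the remaining characters evenly maximizes the expectation.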