So you think one bit is the smallest possible portion of information?
I've read about this in an old book about Bridge (the card game). The example below is a modified version of the one used in that book.
There are 3 people (including you) sitting at a table, and each of them gets a card. There are 2 black cards and one red card, and for some reason you want to know who has the red card. The cards are lying on the table, face down. So, how much information do you need to find out who has the red card?
By definition, the amount of information (in bits) is log2 of the number of equally likely possibilities. For example, a byte has 256 possible values, which means it contains log2(256) = 8 bits of information. In this case, you have 3 possibilities (exactly one of the 3 people has the red card), which means you need log2(3) ≈ 1.585 bits of information.
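To make the definition concrete, here is a small sketch in Python using the standard `math` module (the function name `bits_needed` is my own, not from the book):

```python
import math

def bits_needed(possibilities):
    """Information (in bits) needed to pick out one of N equally likely possibilities."""
    return math.log2(possibilities)

print(bits_needed(256))  # a byte: 8.0 bits
print(bits_needed(3))    # who has the red card: ~1.585 bits
```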
Now, you reveal your card and see that it's black. How much information did you gain? About the card itself, you gained one bit: it had two possible colors, and log2(2) = 1. But for the question "who has the red card?" you started with 3 possibilities and are now down to 2 (one of the two other people has the red card), which means you now need only log2(2) = 1 bit of information. So the information gained is 1.585 − 1 ≈ 0.585 bits. Getting a smaller amount of information than one bit is indeed possible!
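The subtraction above can be checked in a couple of lines (again just a sketch of the arithmetic, with variable names of my own choosing):

```python
import math

before = math.log2(3)  # 3 possible holders of the red card
after = math.log2(2)   # your card is black, so 2 possibilities remain
gained = before - after

print(round(gained, 3))  # 0.585 — less than one bit
```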
Edited by Silly Druid