
So you think one bit is the smallest possible portion of information?


Silly Druid


I've read about this in an old book about Bridge (the card game). The example below is a modified version of the one used in that book.

There are 3 people (including you) sitting at a table, and each of them gets a card. There are 2 black cards and one red card, and for some reason you want to know who has the red card. The cards are lying on the table, face down. So, how much information do you need to find out who has the red card?

By definition, the amount of information (in bits) is log2 of the number of equally likely possibilities. For example, a byte has 256 possible values, which means it contains log2(256) = 8 bits of information. In this case, you have 3 possibilities (you know one of the 3 people has the red card), which means you need log2(3) ≈ 1.585 bits of information.
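The definition above is easy to check with a few lines of Python (a minimal sketch; the function name is mine):

```python
import math

def information_bits(possibilities: int) -> float:
    """Amount of information (in bits) needed to single out one
    option among `possibilities` equally likely ones."""
    return math.log2(possibilities)

print(information_bits(256))  # a byte: 8.0 bits
print(information_bits(3))    # the three cards: ~1.585 bits
```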

Now, you reveal your card and see that it's black. How much information did you gain? Well, about the card itself you gained one bit, because it could be black or red. But about the question "who has the red card" you had 3 possibilities and now you have 2 (one of the two other people has the red card), which means now you need log2(2) = 1 bit of information. So the information gained is 1.585 - 1 = 0.585 bits, which means that getting less than one bit of information is indeed possible!
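The same subtraction, done numerically (a quick sketch using Python's math module):

```python
import math

before = math.log2(3)  # three candidates for the red card
after = math.log2(2)   # your card was black: two candidates left
gained = before - after
print(f"information gained: {gained:.3f} bits")  # ~0.585 bits
```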



6 Comments


Recommended Comments

Certainly one bit is the smallest unit of information that can be stored or shared. However, photons are the smallest possible packets of electromagnetic energy, and I don't get why "photons" and "packets" are plural, so I would say the smallest unit of information is a bit that can contain packets of information.

I have written compression algorithms before, and I know that the presence or absence of a bit can itself be used as information: if one bit doesn't exist, it can mean that more bits exist; otherwise, that bit is the only bit. So one bit controls more than one bit of information.

 

EDIT: Just to avoid confusion: here a bit is the presence of electricity, while a non-bit, a '0' in this case, is "nothing". So "nothing" in this case becomes information precisely because it was "nothing".

In the very end, I would say that one bit position is the information, and not the bit itself.
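The "one bit signals whether more bits follow" idea described above resembles the continuation bit used in variable-length integer encodings; here is a minimal sketch of that technique (my own illustration, not the commenter's code):

```python
def encode_varint(n: int) -> bytes:
    """Encode a non-negative integer 7 bits at a time; the high
    bit of each byte says whether another byte follows."""
    out = bytearray()
    while True:
        chunk = n & 0x7F
        n >>= 7
        if n:
            out.append(chunk | 0x80)  # continuation bit set: more follows
        else:
            out.append(chunk)         # continuation bit clear: last byte
            return bytes(out)

print(encode_varint(300).hex())  # 300 needs two bytes: "ac02"
print(encode_varint(1).hex())    # 1 fits in one byte: "01"
```

A single bit per byte decides whether the encoding continues, so that one bit "controls" the meaning of all the bits that follow.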


@Splashee While I can agree that it's hard to imagine (if not impossible) a physical object that contains less than one bit of information, when you put that information into the context of reducing the uncertainty about a specific question, then you can say the information gained is less than one bit, as in my example above.


Different ways to store the same information in binary, using a fixed number of digits:

001 = black, black, red
010 = black, red, black
100 = red, black, black

Another:

01 = black, black, red
10 = black, red, black
11 = red, black, black

Yet another:

00 = black, black, red
01 = black, red, black
10 = red, black, black

Information that amounts to less than a whole bit rounds up to a whole bit unless an exact number of bits can be represented, so here you need ceil(log2(3)) = 2 bits.
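The rounding in the encodings above can be checked directly (a quick sketch):

```python
import math

states = 3                 # who holds the red card
exact = math.log2(states)  # ~1.585 bits of information
stored = math.ceil(exact)  # whole bits round up to 2
print(exact, stored)
```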


@Splashee But if, instead of bits, you use something that can have 3 values, then you don't need to round it in this case. It's not used in practice, but theoretically you can build circuits with three-valued logic.
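With a three-valued digit (a "trit"), no rounding is needed, since log base 3 of 3 is exactly 1 (a quick check):

```python
import math

states = 3
trits = math.log(states, 3)  # exactly 1 trit
bits = math.log2(states)     # ~1.585 bits
print(trits, bits)
```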


In the analog world, like old TVs and telephones, information was broadcast or sent as an electronic waveform: usually not recordable, subject to loss of information, but with room for almost infinite information (like with real numbers).
In the digital domain, loss of information is not an option. The moment you choose to lose bits, you are emulating analog data using digital bits, and there is information loss. JPEG is a typical proof of this: a lot of information is lost in favor of saving storage space, but because the frequency domain is easy for us humans to reconstruct in our minds, the missing information is acceptable here, so it is the winning method of storing data. That means our human brains are keeping the necessary lost information, so it doesn't have to be stored as bits. Does the 3 cards example work without the human brain? Try it with some animal that doesn't understand cards or colors the same way humans do.

I have a 300 MB hard drive that can store this and that much information, but I also have a brain that remembers new information constructed from the data on that hard drive. Our brain has the buffer to compensate for the loss of information, which would otherwise require additional bits to store. We can reduce information by using already calculated information.

The example with the cards needs to be completely separated from the human factor of remembering information, like a buffer or RAM cache in a computer.

I do have a fair collection of old telephones where there are 3 toggle buttons for special functions. They need to be toggled to work, since the toggle action is the state that needs to be remembered. Newer telephones had no mechanical toggle in their buttons, and the information is lost right after a press, unless the toggled state is stored somewhere else (in this case, in a relay or crossbar switch, or a modern computer's RAM).

The information of the lost bit is stored somewhere else.

