Comments by “Lawrence D’Oliveiro” (@lawrencedoliveiro9104) on the “Harnessing The Power Of Information | Order and Disorder | Spark” video.
46:32 Perpetuating the all-too-common myth that a bit is somehow an indivisible “atom” of information. There is no such thing: it is perfectly possible to have fractional bits of information. If you don’t believe this, then just think about how data compression works. How do you pack a data file of N bits down into M < N bits? It’s possible because the informational content of the original file was in fact no more than M bits. Was this done by finding N − M bits in the original file that held no information and simply removing them? No, because the information content is typically spread across all N bits. Which means each of those original (uncompressed) bits was holding less than 1 full bit of real information. QED.
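To see this concretely, here is a minimal sketch (assuming Python’s zlib, i.e. DEFLATE, an arbitrary 90/10 bit bias and an arbitrary length of 100 000 bits): the biased bit stream compresses losslessly into fewer bits than it started with, so on average each original bit was carrying less than one bit of information.

import math
import random
import zlib

# An arbitrary biased bit stream: roughly 90% zeros, 10% ones.
random.seed(1)
n_bits = 100_000
bits = [1 if random.random() < 0.1 else 0 for _ in range(n_bits)]

# Pack the bits into bytes so zlib sees the raw stream, then compress losslessly.
packed = bytes(
    sum(b << (7 - i) for i, b in enumerate(bits[j:j + 8]))
    for j in range(0, n_bits, 8)
)
m_bits = len(zlib.compress(packed, 9)) * 8

print(f"original: {n_bits} bits, compressed: {m_bits} bits")
print(f"information per original bit: {m_bits / n_bits:.3f} bits")

# Shannon entropy gives the theoretical floor for this bias.
p = 0.1
print(f"entropy per bit: {-(p * math.log2(p) + (1 - p) * math.log2(1 - p)):.3f} bits")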
@mralistair737 That’s a simplistic interpretation of how compression works. Real-world compression algorithms are able to take advantage of more subtle statistical properties of the data than that.
@mralistair737 No, I’m not talking about lossy compression. Here, let me name some lossless compression algorithms, so you can go and study how they work: LZW, FLATE, FLAC.
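A quick illustration of what “lossless” means here (a sketch using Python’s zlib, which implements DEFLATE; the repeated sentence is just an arbitrary input):

import zlib

original = b"the quick brown fox jumps over the lazy dog " * 100
compressed = zlib.compress(original)

assert zlib.decompress(compressed) == original   # every byte comes back exactly
print(len(original), "bytes ->", len(compressed), "bytes")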
@mralistair737 Yes, you most certainly can have fractional bits of information. Remember, the amount of information conveyed by an event is the logarithm of the reciprocal of the probability of the event: the rarer the event, the more bits it carries.
@mralistair737 You can’t think except with YouTube videos, can you? Here’s the formula: information content in bits I = - log₂ P, where P is the probability of the event. If P = 0.5, then I = 1. A whole bit corresponds to an event with 50% probability. If you want I = 0.5, then just solve for P accordingly.
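Working that last step out (a small Python sketch; the name info_bits is just for illustration):

import math

def info_bits(p):
    # Information content in bits of an event with probability p.
    return -math.log2(p)

print(info_bits(0.5))                       # 1.0: a 50% event carries exactly one bit
p_half_bit = 2 ** -0.5                      # solving 0.5 = -log2(P) gives P = 2**-0.5
print(p_half_bit, info_bits(p_half_bit))    # about 0.707, carrying 0.5 bits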
@hans-joachimbierwirth4727 Remember, probabilities are never greater than 1. Work out the numbers for yourself.
@hans-joachimbierwirth4727 Don’t forget you have to multiply the amount of information from each event by the probability of that event occurring, to get the mean information amount. That mean is at its maximum when the events are equally likely, and is less when they are not.
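The mean described here is Shannon’s entropy, H = -Σ pᵢ log₂ pᵢ. A short sketch (the 0.9/0.1 split is an arbitrary example):

import math

def entropy(probs):
    # Mean information: each event's information weighted by its probability.
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit: the maximum for two outcomes
print(entropy([0.9, 0.1]))   # about 0.469 bits: less, because the outcomes are unequal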