Claude Shannon's Information Entropy | Information Theory Part 12

Subscribers: 175,000
Video Link: https://www.youtube.com/watch?v=R4OlXb9aTvQ
Category: Guide
Duration: 7:05
Views: 136,900
Likes: 2,105

Entropy is a measure of the uncertainty in a random variable (a message source). Claude Shannon defined the "bit" as the unit of entropy: the uncertainty of a single fair coin flip. In this video, information entropy is introduced intuitively using bounce machines and yes/no questions.
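Not from the video itself, but as a quick sketch: Shannon's formula H = -sum(p * log2(p)) takes only a few lines of Python. The distributions below are illustrative, chosen to show that a fair coin is worth exactly one bit (one yes/no question):

import math

def entropy(probs):
    # Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability outcomes.
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit, one yes/no question
print(entropy([0.9, 0.1]))   # biased coin: ~0.469 bits, less uncertain
print(entropy([0.25] * 4))   # four equal outcomes: 2.0 bits, two yes/no questions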

Note: This analogy extends to higher-order approximations: we simply create a machine for each state and average the entropy over all machines (see the sketch below).
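Continuing the sketch above with a hypothetical two-state source (the states, symbols, and probabilities are made up for illustration, not taken from the video), the per-state entropies are weighted by how often each state occurs:

import math

def entropy(probs):
    # Shannon entropy in bits of one machine's output distribution.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# One "machine" per state, each with its own distribution over symbols A and B.
state_probs = {"s0": 0.7, "s1": 0.3}   # how often each state occurs
machines = {
    "s0": [0.9, 0.1],   # s0 emits A with p=0.9, B with p=0.1
    "s1": [0.5, 0.5],   # s1 is a fair coin between A and B
}

# Average over all machines, weighted by state frequency.
H = sum(state_probs[s] * entropy(machines[s]) for s in machines)
print(H)   # 0.7 * 0.469 + 0.3 * 1.0 ≈ 0.628 bits per symbol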

Tags:
entropy
information
information entropy
bit
information theory
claude shannon
measure
language of coins
art of the problem
languageofcoins
math