Claude Shannon's Information Entropy | Information Theory Part 12
Channel: Art of the Problem
Subscribers: 175,000
Video Link: https://www.youtube.com/watch?v=R4OlXb9aTvQ
Entropy is a measure of the uncertainty in a random variable (a message source). Claude Shannon defined the "bit" as the unit of entropy: one bit is the uncertainty of a single fair coin flip. In this video, information entropy is introduced intuitively using bounce machines and yes/no questions.
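As a rough sketch of the idea (not code from the video), Shannon's formula H = -Σ p·log2(p) measures this uncertainty in bits; the `entropy` helper and the example distributions below are purely illustrative:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits.

    It equals the average number of optimal yes/no questions
    needed to identify one outcome drawn from `probs`.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin flip carries exactly 1 bit of uncertainty.
print(entropy([0.5, 0.5]))    # 1.0
# A biased machine is more predictable, hence less uncertain.
print(entropy([0.25, 0.75]))  # ~0.811
```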
Note: this analogy also extends to higher-order approximations; we simply create a machine for each state and average over all the machines.
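A minimal sketch of that averaging, assuming a hypothetical two-state source; the state frequencies and per-state distributions below are made up for illustration:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical source with two states; each state is its own "machine"
# with its own outcome distribution.
machines = {
    "A": [0.9, 0.1],  # outcome probabilities while in state A
    "B": [0.5, 0.5],  # outcome probabilities while in state B
}
state_freq = {"A": 0.8, "B": 0.2}  # long-run frequency of each state

# Entropy of the whole source: per-machine entropies averaged,
# weighted by how often each machine is used.
H = sum(state_freq[s] * entropy(machines[s]) for s in machines)
print(f"{H:.3f} bits per symbol")  # ~0.575
```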
Tags: entropy, information, information entropy, bit, information theory, claude shannon, measure, language of coins, art of the problem, languageofcoins, math