A Simple Explanation of Shannon Entropy
Hello, friends and strangers on the internet. Happy Thanksgiving!!
I'm grateful for the wealth of information out on the internet that's enabled people around the world to learn anything their curiosity leads them to.
In 1948, Claude Shannon, the father of information theory, formalized a mathematical framework for quantifying the amount of information needed to accurately transmit and receive messages, determined by the degree of "uncertainty" a message could contain.
In his paper, he introduced what is now known as "entropy": a concept that permeates statistical learning and data science, and that underpins how machine learning models are built and optimized today.
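As a quick illustration (mine, not from the linked article): for a discrete distribution, Shannon's entropy is H = -Σ p(x) · log2 p(x), measured in bits. A few lines of Python show the idea, assuming the probabilities are given as a simple list; the function name shannon_entropy is just for this sketch.

    import math

    def shannon_entropy(probs):
        """Entropy in bits of a discrete distribution given as a list of probabilities."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A fair coin is maximally uncertain: 1 bit per flip.
    print(shannon_entropy([0.5, 0.5]))   # 1.0
    # A heavily biased coin carries less uncertainty per flip.
    print(shannon_entropy([0.9, 0.1]))   # ~0.469

The fair coin needs a full bit per flip to describe, while the biased coin, being more predictable, needs less; that gap is exactly what entropy measures.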
Here is my interpretation and explanation of the concept. Cheers.
https://gradiently.io/a-simple-explanation-of-shannon-entrop...