A Simple Explanation of Shannon Entropy

gradiently.io

4 points by atreeleaf 3 years ago · 1 comment

atreeleaf (OP) · 3 years ago

Hello, friends and strangers on the internet. Happy Thanksgiving!!

I'm grateful for the wealth of information out on the internet that's enabled people around the world to learn anything their curiosity leads them to.

In 1948, Claude Shannon, the father of information theory, formalized a mathematical framework for quantifying the amount of information needed to accurately send and receive messages, determined by the degree of "uncertainty" a message could contain.

In his paper, he introduced the concept now known as "entropy," which permeates statistical learning and data science today, particularly in building and optimizing machine learning models.
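As a quick sketch before you read the post (this is just the standard textbook definition, not a quote from the article): for a discrete distribution with probabilities p_1, ..., p_n, entropy is H = -sum(p_i * log2(p_i)), measured in bits. A few lines of Python make the intuition concrete:

    # Standard Shannon entropy in bits: H = -sum(p_i * log2(p_i)).
    # Minimal sketch for illustration, not code from the linked article.
    from math import log2

    def shannon_entropy(probs):
        # Zero-probability outcomes contribute nothing (0 * log 0 -> 0).
        return -sum(p * log2(p) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))  # fair coin -> 1.0 bit (maximum uncertainty)
    print(shannon_entropy([0.9, 0.1]))  # biased coin -> ~0.47 bits
    print(shannon_entropy([1.0]))       # certain outcome -> 0.0 bits

The more uncertain the outcome, the more bits you need on average to communicate it, which is the core idea the post walks through.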

Here is my interpretation and explanation of the concept. Cheers.

https://gradiently.io/a-simple-explanation-of-shannon-entrop...
