Show HN: Visualize token entropy in a tiny in-browser LLM

tonkotsu-ai.github.io

1 point by derekcheng08 a month ago · 1 comment


derekcheng08 (OP) a month ago

I built Prism this morning. It runs a tiny 500M-parameter LLM in your browser on a given piece of text and visualizes the entropy of the probability distribution computed for each token -- effectively, how confident the model is in predicting each token.

I've been wanting to build this for a while. I had a crude version of it when I first started working with LLMs, and it really helped build my intuition for how the model worked. When you run it on a block of code, you'll see that the model is unsure when it needs to pick an identifier or start a new line. It's a fascinating glimpse into how models operate -- what is "easy" for them and what is uncertain.
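For reference, the per-token quantity a tool like this visualizes can be sketched as follows: take the model's logits over the vocabulary at each position, softmax them into a probability distribution, and compute the Shannon entropy. This is a minimal illustration, not Prism's actual code; the function name and the toy logit vectors are hypothetical.

```python
import numpy as np

def token_entropy(logits: np.ndarray) -> float:
    """Shannon entropy (in bits) of the softmax distribution over the vocabulary."""
    # Subtract the max logit for numerical stability before exponentiating.
    z = logits - logits.max()
    p = np.exp(z) / np.exp(z).sum()
    # Small epsilon guards against log2(0) for near-zero probabilities.
    return float(-(p * np.log2(p + 1e-12)).sum())

# A confident prediction concentrates mass on one token -> entropy near 0 bits.
confident = np.array([10.0, 0.0, 0.0, 0.0])
# A uniform distribution over V tokens has the maximum entropy, log2(V) bits.
uniform = np.zeros(4)
```

Low-entropy tokens (e.g. the closing brace after a block) render as "easy", while high-entropy tokens (a fresh identifier name) render as uncertain.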
