The Structure of Neural Embeddings

seanpedersen.github.io

74 points by sean_pedersen a year ago · 4 comments

jmward01 a year ago

Current embeddings are badly trained and are massively holding back networks. A core issue is something I call 'token drag': low-frequency tokens, when they finally come up, drag the model back towards an earlier state, causing a lot of lost training. This leads to the first few layers of a model effectively being dedicated to buffering the bad embeddings that feed it. Luckily, fixing this is actually really easy: creating a sacrificial two-layer network to predict embeddings during training (and then just calculating the embeddings once for production inference) gives a massive boost to training. To see this in action, check out the unified embeddings in this project: https://github.com/jmward01/lmplay
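
A minimal sketch of what this could look like (in PyTorch; the class name SacrificialEmbedding, the bake() step, and the hidden width are illustrative assumptions, not lmplay's actual implementation): during training, a small two-layer MLP sits between the raw embedding table and the model; for inference, its outputs are precomputed once into a plain lookup table.

    import torch
    import torch.nn as nn

    class SacrificialEmbedding(nn.Module):
        # Hypothetical sketch: a small two-layer MLP re-predicts each
        # token's embedding during training; for inference its outputs
        # are baked into a plain lookup table and the MLP is discarded.
        def __init__(self, vocab_size: int, dim: int, hidden: int = 1024):
            super().__init__()
            self.base = nn.Embedding(vocab_size, dim)  # raw per-token vectors
            self.mlp = nn.Sequential(                  # sacrificial two-layer net
                nn.Linear(dim, hidden),
                nn.GELU(),
                nn.Linear(hidden, dim),
            )
            self.register_buffer("baked", None)        # filled in by bake()

        def bake(self) -> None:
            # Run every base vector through the MLP once; production
            # inference then only needs a table lookup.
            with torch.no_grad():
                self.baked = self.mlp(self.base.weight).detach()

        def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
            if self.training:
                return self.mlp(self.base(token_ids))  # MLP in the training loop
            if self.baked is None:
                self.bake()
            return self.baked[token_ids]               # frozen lookup at inference

    emb = SacrificialEmbedding(vocab_size=50_257, dim=512)
    ids = torch.randint(0, 50_257, (2, 16))
    train_out = emb(ids)  # training path: gradients flow through the MLP
    emb.eval()
    infer_out = emb(ids)  # inference path: plain embedding lookup

If the mechanism works as described above, rare tokens would benefit because their updates also flow through the MLP weights shared with frequent tokens, so their embeddings keep pace with the rest of the model instead of dragging it back towards an earlier state.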

  • tomrod a year ago

    Do you have a peer-reviewed source you could link on this approach, or is it something you thought of and are experimenting with yourself? I couldn't tell from the LMPlay repo in my skim, and the idea is intriguing.

    • jmward01 a year ago

      All my own ideas in there. I was thinking of writing it up more formally, but I am more of a 'think -> build -> next thing' kind of person.

tomrod a year ago

Oh wow, great set of reads. Thanks to @sean_pedersen for posting; looking forward to reviewing this in my closeout this year.
