Experiments with Bitnet 1.5 (Ngmi)

huggingface.co

11 points by loser777 2 years ago · 2 comments

xiphias2 2 years ago

NVIDIA was confident enough to come out with FP4 cards for training, so 16-bit floats are not the best that can be done.

As most hardware speedups over the years have come from decreasing precision, I'm sure NVIDIA will try everything to make FP4 work (and then get rid of FP8 multiplies if it can get away with it).
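The submission's subject, BitNet b1.58, takes this precision-reduction trend further than FP4: weights are quantized to the ternary set {-1, 0, +1} with a single floating-point scale per tensor, using an absmean scheme. A minimal sketch of that idea (the function name and the toy weights are my own, not from the paper or thread):

```python
def absmean_ternary_quantize(weights, eps=1e-6):
    """Quantize a flat list of weights to {-1, 0, +1} plus one fp scale.

    The scale is the mean absolute value of the weights; each weight is
    divided by it, rounded to the nearest integer, and clipped to [-1, 1].
    """
    scale = sum(abs(w) for w in weights) / len(weights) + eps
    quantized = [max(-1, min(1, round(w / scale))) for w in weights]
    return quantized, scale

# Toy example: six weights, scale = mean(|w|) = 0.475
w = [0.9, -0.05, -1.2, 0.3, 0.0, -0.4]
q, scale = absmean_ternary_quantize(w)
# q == [1, 0, -1, 1, 0, -1]
```

With weights restricted to {-1, 0, +1}, matrix multiplies reduce to additions and subtractions, which is why this line of work promises hardware wins beyond what FP4 multipliers alone can deliver.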
