Mistral AI announce 7B v0.2 base model release

twitter.com

35 points by nefitty 2 years ago · 2 comments

jerpint 2 years ago

Any news on how this model will compare to Mixtral? It's interesting that they aren't releasing an MoE model this time, given the success Mixtral had.

  • Reubend 2 years ago

    Not yet, but I'm sure they will release some benchmarks soon. As for it not being an MoE model, there's still a ton of value in having a small non-MoE model for many use cases, and improvements discovered while training the small model can potentially carry over to the next version of the MoE model down the line.
