Mistral AI announces 7B v0.2 base model release
twitter.com

Any news on how this model will compare to Mixtral? Interesting that they aren't releasing an MoE model this time, given the success Mixtral had.
Not yet, but I'm sure they will release some benchmarks soon. As for it not being an MoE model, there's still a ton of value in a small non-MoE model for many use cases, and improvements discovered while training the small model can potentially feed into the next version of the MoE model down the line.