llmsreview
Mixtral 8x22B
Mistral AI • open-source
Large sparse mixture-of-experts (MoE) model with 141B total parameters, of which 39B are active per token. Strong multilingual performance.
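The gap between total and active parameters follows from sparse routing: every token passes through the shared weights, but only a subset of experts. A minimal sketch of that accounting, assuming Mixtral-style top-2 routing over 8 experts (the per-expert and shared sizes below are illustrative figures chosen to reproduce the 141B/39B split, not the published layer breakdown):

```python
# Illustrative parameter accounting for a sparse MoE model.
# The 8-expert / top-2 routing matches Mixtral; the shared and
# per-expert sizes are assumptions picked to match 141B / 39B.

def moe_param_counts(shared_b: float, expert_b: float,
                     num_experts: int, experts_per_token: int):
    """Return (total, active) parameter counts in billions.

    total  = shared weights + all experts
    active = shared weights + only the routed experts
    """
    total = shared_b + num_experts * expert_b
    active = shared_b + experts_per_token * expert_b
    return total, active

# Assumed sizes in billions: ~5B shared (attention, embeddings,
# router) and ~17B per expert.
total, active = moe_param_counts(shared_b=5.0, expert_b=17.0,
                                 num_experts=8, experts_per_token=2)
print(f"total = {total:.0f}B, active = {active:.0f}B")  # total = 141B, active = 39B
```

This is why inference cost tracks the 39B active figure while memory footprint tracks the full 141B: all experts must be resident, but only two run per token.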