
Mixtral 8x22B

Mistral AI · open-source

A large mixture-of-experts (MoE) model with 141B total and 39B active parameters. Strong multilingual performance.
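The gap between total (141B) and active (39B) parameters comes from MoE routing: only a subset of experts runs for each token. A back-of-the-envelope decomposition, assuming top-2-of-8 expert routing; the per-expert and shared counts below are solved from the published totals, not official figures:

```python
# Illustrative arithmetic only. Assumes:
#   total  = shared + NUM_EXPERTS * per_expert
#   active = shared + TOP_K * per_expert
# The derived per-expert and shared sizes are estimates, not official numbers.

TOTAL_B = 141.0    # total parameters (billions), from the listing
ACTIVE_B = 39.0    # active parameters per token (billions)
NUM_EXPERTS = 8    # assumed experts per MoE layer
TOP_K = 2          # assumed experts routed per token

per_expert_b = (TOTAL_B - ACTIVE_B) / (NUM_EXPERTS - TOP_K)
shared_b = TOTAL_B - NUM_EXPERTS * per_expert_b

print(f"per-expert ~ {per_expert_b:.0f}B, shared ~ {shared_b:.0f}B")
# -> per-expert ~ 17B, shared ~ 5B
```

Under these assumptions each expert holds roughly 17B parameters and about 5B parameters (attention, embeddings) are shared across all tokens.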


Model Info

Provider: Mistral AI
Category: open-source
Total Reviews: 0
Avg. Rating: 0.0 / 5.0

Rating Guidelines

★★★★★ Exceptional
★★★★ Great
★★★ Good
★★ Fair
★ Poor