
Mixtral 8x22B

Large sparse mixture-of-experts (MoE) model with 8 experts, of which 2 are active per token. Delivers strong performance while keeping the active parameter count manageable (39B active out of 141B total).
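The top-2 routing described above can be sketched as follows. This is a minimal illustration of the general technique (a gating network scores all experts, only the two highest-scoring ones run, and their outputs are mixed by renormalized gate weights), not Mixtral's actual implementation; all names and the toy linear "experts" are illustrative assumptions.

```python
import numpy as np

def top2_route(x, gate_w, experts):
    """Route token vector x to the top-2 experts and mix their outputs
    by softmax gate scores renormalized over the chosen pair."""
    logits = gate_w @ x                      # one gating score per expert
    top2 = np.argsort(logits)[-2:]           # indices of the 2 best experts
    scores = np.exp(logits[top2] - logits[top2].max())
    scores /= scores.sum()                   # renormalize over the pair
    # only the 2 selected experts are evaluated -- the other 6 stay idle
    return sum(s * experts[i](x) for s, i in zip(scores, top2))

rng = np.random.default_rng(0)
d, n_experts = 4, 8
gate_w = rng.normal(size=(n_experts, d))
# toy "experts": plain linear maps standing in for expert FFN blocks
expert_ws = [rng.normal(size=(d, d)) for _ in range(n_experts)]
experts = [(lambda w: (lambda x: w @ x))(w) for w in expert_ws]

y = top2_route(rng.normal(size=d), gate_w, experts)
print(y.shape)
```

Because only 2 of the 8 experts run per token, compute per token scales with the active parameters rather than the full model size.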
