Mistral AI · Released April 17, 2024
Mixtral 8x22B
Open Source · Arena Rank #16 · 176B (39B active) parameters
Context: 64K tokens · Input: $0.90 per 1M tokens
Key Specifications
- Arena Rank: #16
- Context Window: 64K tokens
- Input Price: $0.90 per 1M tokens
- Output Price: $2.70 per 1M tokens
- Parameters: 176B total (39B active per token)
- Open Source: Yes
Best For
Efficient reasoning · Multilingual · Coding
About Mixtral 8x22B
Mixtral 8x22B is Mistral AI's large mixture-of-experts model that uses a sparse architecture to achieve strong performance while activating only a fraction of its total parameters per token. With 176 billion total parameters but only 39 billion active per forward pass, it delivers efficiency that makes it practical to deploy despite its size. It features a 64K context window and excels at multilingual tasks, coding, and mathematical reasoning.
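To make the "sparse activation" idea concrete, here is a minimal sketch of top-2 mixture-of-experts routing in Python/PyTorch. The class name, layer sizes, and structure are simplified assumptions for illustration only, not Mistral's actual implementation; the point is that each token runs through just 2 of 8 experts, so most parameters sit idle on any given forward pass.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Top2MoELayer(nn.Module):
    """Toy sparse MoE layer: each token is routed to 2 of 8 experts.
    Dimensions are illustrative, not Mixtral 8x22B's real sizes."""

    def __init__(self, d_model=512, d_ff=2048, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts)  # gating network
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(),
                          nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):  # x: (tokens, d_model)
        gate_logits = self.router(x)                       # (tokens, n_experts)
        weights, idx = gate_logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)               # normalize over the chosen experts
        out = torch.zeros_like(x)
        # Only the selected experts run for each token; the rest stay idle,
        # which is why far fewer parameters are "active" per forward pass.
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e
                if mask.any():
                    out[mask] += weights[mask, k:k+1] * expert(x[mask])
        return out

layer = Top2MoELayer()
y = layer(torch.randn(16, 512))  # 16 tokens, each touching only 2 experts
```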
Built by Mistral AI
Pricing per 1M tokens
- Input tokens: $0.90
- Output tokens: $2.70
Frequently Asked Questions
What is Mixtral 8x22B?
Mixtral 8x22B is Mistral AI's large sparse mixture-of-experts model: 176 billion total parameters with only 39 billion active per forward pass, a 64K context window, and strong performance on multilingual tasks, coding, and mathematical reasoning.
How much does Mixtral 8x22B cost?
Mixtral 8x22B costs $0.90 per 1 million input tokens and $2.70 per 1 million output tokens. Pricing is based on token usage, making it cost-effective for both small and large-scale applications.
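For concreteness, a minimal sketch of the per-request arithmetic at these list prices; the token counts are made-up example values:

```python
INPUT_PRICE_PER_M = 0.90   # USD per 1M input tokens
OUTPUT_PRICE_PER_M = 2.70  # USD per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate the cost of one request at Mixtral 8x22B's list prices."""
    return (input_tokens / 1_000_000) * INPUT_PRICE_PER_M \
         + (output_tokens / 1_000_000) * OUTPUT_PRICE_PER_M

# Hypothetical example: a 10,000-token prompt producing a 1,000-token reply
print(f"${request_cost(10_000, 1_000):.4f}")  # ≈ $0.0117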
What is Mixtral 8x22B's context window?
Mixtral 8x22B has a context window of 64K tokens. This determines how much text the model can process in a single request — larger context windows allow the model to handle longer documents, maintain more conversation history, and reason over bigger codebases.
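As a rough sanity check before sending a long document, you can estimate whether it fits in the 64K window. The 4-characters-per-token heuristic below is a common approximation for English text, not Mixtral's actual tokenizer:

```python
CONTEXT_WINDOW = 64_000  # tokens

def fits_in_context(text: str, reserved_for_output: int = 2_000) -> bool:
    """Crude fit check using the ~4 chars/token heuristic for English text."""
    estimated_tokens = len(text) // 4
    return estimated_tokens + reserved_for_output <= CONTEXT_WINDOW

document = "..." * 1000  # stand-in for a real document
print(fits_in_context(document))  # True: ~750 tokens plus output budget
```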
Is Mixtral 8x22B open source?
Yes, Mixtral 8x22B is open source. This means the model weights are publicly available, allowing developers and organizations to download, fine-tune, and self-host the model on their own infrastructure. Open-source models offer greater flexibility and data privacy control.
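A minimal self-hosting sketch using the Hugging Face transformers library follows. The model ID is an assumption; check Mistral AI's official release for the exact repository name and license terms, and note that a model of this size needs substantial GPU memory (or quantization) to run:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x22B-Instruct-v0.1"  # assumed HF repo name

tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" (requires the accelerate package) spreads the weights
# across available GPUs; running 8x22B locally demands hundreds of GB.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

inputs = tokenizer("Explain sparse mixture-of-experts in one sentence.",
                   return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```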
What is Mixtral 8x22B best for?
Mixtral 8x22B is best suited for efficient reasoning, multilingual tasks, and coding. These use cases play to the model's particular balance of capability, speed, and cost within Mistral AI's model lineup.