Mistral AI · Released July 18, 2024

Mistral Nemo

Open Source · #27 Arena Rank · 12B parameters

Context: 128K

Input: $0.30

Key Specifications

🏆 Arena Rank: #27
📐 Context Window: 128K
📥 Input Price: $0.30 per 1M tokens
📤 Output Price: $0.30 per 1M tokens
🧠 Parameters: 12B
🔓 Open Source: Yes

Best For

Lightweight tasks · drop-in replacement

About Mistral Nemo

Mistral Nemo is a compact 12B parameter model co-developed by Mistral AI and Nvidia, designed as a high-performance drop-in replacement for smaller models. Despite its size, it delivers performance significantly above its weight class on coding, reasoning, and multilingual tasks. As an open-source model, it can be self-hosted on a single GPU, making it ideal for organizations with limited compute resources or strict data privacy requirements. Its small size enables fast inference and low-cost deployment while maintaining the quality standards of the Mistral model family.

Pricing per 1M tokens

Input Tokens: $0.30
Output Tokens: $0.30

Frequently Asked Questions

What is Mistral Nemo?
Mistral Nemo is a compact 12B parameter model co-developed by Mistral AI and Nvidia, designed as a high-performance drop-in replacement for smaller models. Despite its size, it delivers performance significantly above its weight class on coding, reasoning, and multilingual tasks. As an open-source model, it can be self-hosted on a single GPU, making it ideal for organizations with limited compute resources or strict data privacy requirements. Its small size enables fast inference and low-cost deployment while maintaining the quality standards of the Mistral model family.
How much does Mistral Nemo cost?
Mistral Nemo costs $0.30 per 1 million input tokens and $0.30 per 1 million output tokens. Pricing is based on token usage, making it cost-effective for both small and large-scale applications.
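Since billing is purely per token, a request's cost is straightforward to estimate from its token counts. The sketch below uses the published rates above ($0.30 per 1M tokens for both input and output); the function name and the example token counts are illustrative, not part of any official SDK.

```python
# Illustrative cost estimator based on Mistral Nemo's published rates:
# $0.30 per 1M input tokens and $0.30 per 1M output tokens.
INPUT_PRICE_PER_M = 0.30
OUTPUT_PRICE_PER_M = 0.30


def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated cost in US dollars for one request."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000


# Example: a 2,000-token prompt with a 500-token completion
# costs (2000 + 500) * $0.30 / 1M = $0.00075.
```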
What is Mistral Nemo's context window?
Mistral Nemo has a context window of 128K tokens. This determines how much text the model can process in a single request — larger context windows allow the model to handle longer documents, maintain more conversation history, and reason over bigger codebases.
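In practice, the prompt and the space reserved for the model's reply must together fit inside that 128K-token window. A minimal sketch of such a check, assuming you already have a token count for the prompt (the function name is hypothetical):

```python
# Mistral Nemo's context window, in tokens (per the specs above).
CONTEXT_WINDOW = 128_000


def fits_in_context(prompt_tokens: int, max_output_tokens: int) -> bool:
    """Check whether the prompt plus the reserved completion budget
    fits within the model's context window."""
    return prompt_tokens + max_output_tokens <= CONTEXT_WINDOW
```

A 120,000-token document with 4,000 tokens reserved for the answer fits; a 127,000-token one with the same reservation does not.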
Is Mistral Nemo open source?
Yes, Mistral Nemo is open source. This means the model weights are publicly available, allowing developers and organizations to download, fine-tune, and self-host the model on their own infrastructure. Open-source models offer greater flexibility and data privacy control.
What is Mistral Nemo best for?
Mistral Nemo is best suited for lightweight tasks and as a drop-in replacement for smaller models. These use cases leverage the model's specific strengths in terms of capability, speed, and cost-effectiveness within Mistral AI's model lineup.