Mistral AI
Released January 15, 2025

Mistral Medium

#16 Arena Rank
Undisclosed parameters
Context: 128K
Input: $0.40 per 1M tokens

Key Specifications

🏆 Arena Rank: #16
📐 Context Window: 128K
📥 Input Price: $0.40 per 1M tokens
📤 Output Price: $2.00 per 1M tokens
🧠 Parameters: Undisclosed
🔒 Open Source: No

Best For

Enterprise tasks
European languages

About Mistral Medium

Mistral Medium is Mistral AI's mid-tier model offering a balanced combination of performance and cost-efficiency. Built in Europe with strong multilingual support, it handles enterprise tasks, code generation, and structured data extraction competently. With a 128K context window and competitive pricing, it serves as a practical choice for production applications that need reliable performance without the cost of Mistral Large. The model is particularly strong in European languages, making it popular among EU-based organizations prioritizing data sovereignty.

Pricing per 1M tokens

Input Tokens: $0.40
Output Tokens: $2.00

Frequently Asked Questions

What is Mistral Medium?
Mistral Medium is Mistral AI's mid-tier model offering a balanced combination of performance and cost-efficiency. Built in Europe with strong multilingual support, it handles enterprise tasks, code generation, and structured data extraction competently. With a 128K context window and competitive pricing, it serves as a practical choice for production applications that need reliable performance without the cost of Mistral Large. The model is particularly strong in European languages, making it popular among EU-based organizations prioritizing data sovereignty.
How much does Mistral Medium cost?
Mistral Medium costs $0.40 per 1 million input tokens and $2.00 per 1 million output tokens. Pricing is based on token usage, making it cost-effective for both small and large-scale applications.
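Using the listed rates, the cost of a request can be estimated with simple arithmetic; the snippet below is an illustrative sketch based only on the prices stated above (the example token counts are hypothetical).

```python
# Estimated cost of a single Mistral Medium request,
# from the per-1M-token rates listed on this page.
INPUT_RATE = 0.40 / 1_000_000   # dollars per input token
OUTPUT_RATE = 2.00 / 1_000_000  # dollars per output token

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated dollar cost for one request."""
    return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE

# Example: a 10,000-token prompt with a 1,000-token completion.
print(f"${estimate_cost(10_000, 1_000):.4f}")  # $0.0060
```

At these rates, output tokens cost five times as much as input tokens, so long completions dominate the bill even for modest prompts.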
What is Mistral Medium's context window?
Mistral Medium has a context window of 128K tokens. This determines how much text the model can process in a single request — larger context windows allow the model to handle longer documents, maintain more conversation history, and reason over bigger codebases.
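Before sending a long document, it is common to check it against the context limit. The sketch below uses a rough characters-per-token heuristic (an assumption; exact counts depend on the model's tokenizer) against the 128K window stated above.

```python
CONTEXT_WINDOW = 128_000  # Mistral Medium's context window, in tokens

def fits_in_context(text: str, chars_per_token: float = 4.0) -> bool:
    """Rough check that a text fits in the context window.

    Assumes ~4 characters per token, a common heuristic for
    English text; the model's actual tokenizer may differ.
    """
    estimated_tokens = len(text) / chars_per_token
    return estimated_tokens <= CONTEXT_WINDOW
```

For production use, counting tokens with the model's own tokenizer is more reliable than a character heuristic.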
Is Mistral Medium open source?
No, Mistral Medium is a proprietary model available through Mistral AI's API. Proprietary models are typically accessible via API endpoints and offer managed infrastructure, support, and regular updates from the provider.
What is Mistral Medium best for?
Mistral Medium is best suited for enterprise tasks and work in European languages. These use cases play to the model's strengths within Mistral AI's lineup: strong multilingual support combined with mid-tier pricing.