Microsoft · Released April 23, 2024

Phi-3 Mini

Open Source · 3.8B parameters

Phi-3 Mini is Microsoft's compact entry in a crowded small-model field. Context window: 128K tokens.

Context: 128K

Input price: Free (open)

Key Specifications

🏆

Arena Rank

Not ranked

📐

Context Window

128K

📥

Input Price

per 1M tokens

Free (open)

📤

Output Price

per 1M tokens

Free (open)

🧠

Parameters

3.8B

🔓

Open Source

Yes

Best For

Edge deployment · mobile · on-device AI

About Phi-3 Mini

Phi-3 Mini, developed by Microsoft, is a compact open-source model with 3.8 billion parameters and a 128K token context window. The model demonstrates that high-quality training data can compensate for small parameter counts, achieving performance comparable to models several times its size on reasoning and coding benchmarks. Its minimal footprint enables deployment on mobile devices, edge hardware, and laptops without GPU acceleration. Phi-3 Mini is designed for on-device AI applications where network connectivity, latency, or data privacy requirements prevent cloud-based processing. Free and open-source, it supports fine-tuning and commercial use. The model has been influential in validating Microsoft's research thesis that data quality and training methodology matter more than raw scale, contributing to the broader industry trend toward efficient, compact models.
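Because the weights are open, the model runs locally with standard tooling (for example Hugging Face transformers or llama.cpp). As a minimal sketch of what an on-device prompt looks like, the helper below renders a conversation into the chat format published with the Phi-3 model cards; the function name is illustrative, and in practice the tokenizer's own chat template would handle this.

```python
# Minimal sketch: hand-building a Phi-3-style chat prompt.
# The special tokens (<|system|>, <|user|>, <|assistant|>, <|end|>) follow the
# chat format described in the Phi-3 model cards; in real use you would call
# the tokenizer's apply_chat_template() instead.

def format_phi3_prompt(messages):
    """Render a list of {role, content} dicts into Phi-3's chat format."""
    parts = []
    for msg in messages:
        parts.append(f"<|{msg['role']}|>\n{msg['content']}<|end|>\n")
    parts.append("<|assistant|>\n")  # cue the model to generate its reply
    return "".join(parts)

prompt = format_phi3_prompt([
    {"role": "system", "content": "You are a concise assistant."},
    {"role": "user", "content": "Summarize this note in one sentence."},
])
print(prompt)
```

The resulting string is what a local runtime would feed to the model before sampling the assistant's reply.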

Pricing per 1M tokens

Input Tokens

Free (open)

Output Tokens

Free (open)

Frequently Asked Questions

What is Phi-3 Mini?
Phi-3 Mini is a compact open-source model from Microsoft with 3.8 billion parameters and a 128K token context window. Its small footprint makes it suitable for on-device AI on mobile devices, edge hardware, and laptops without GPU acceleration, and it supports fine-tuning and commercial use. See the About section above for full details.
How much does Phi-3 Mini cost?
Phi-3 Mini is free to use: the weights are openly released, so there are no per-token charges for either input or output when you self-host. Your only cost is your own compute. Hosted API providers that serve the model may charge their own inference rates.
What is Phi-3 Mini's context window?
The context window for Phi-3 Mini is 128K tokens. That's the maximum amount of text you can feed into a single prompt, including system instructions, conversation history, and the actual query.
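As a rough sketch of how that budget gets divided in practice, the arithmetic below treats 128K as 131,072 tokens and uses placeholder counts; real counts must come from the model's tokenizer, and the function name is illustrative.

```python
# Illustrative context-window budgeting for a 128K-token model.
# Token counts here are placeholders; obtain real counts from the tokenizer.

CONTEXT_WINDOW = 128 * 1024  # 128K tokens = 131,072

def remaining_budget(system_tokens, history_tokens, query_tokens,
                     reserve_for_output=1024):
    """Tokens left (e.g. for retrieved documents) after the fixed prompt
    parts and a reserve for the model's own output."""
    used = system_tokens + history_tokens + query_tokens + reserve_for_output
    return CONTEXT_WINDOW - used

left = remaining_budget(system_tokens=300, history_tokens=8_000, query_tokens=200)
print(left)  # 131072 - (300 + 8000 + 200 + 1024) = 121548
```

Anything beyond that remainder must be truncated or summarized before it reaches the model.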
Is Phi-3 Mini open source?
Phi-3 Mini is fully open source. You can grab the weights, run it on your own hardware, and fine-tune it for specific tasks. That flexibility is a big deal for teams with strict data requirements.
What is Phi-3 Mini best for?
The sweet spot for Phi-3 Mini is: Edge deployment, mobile, on-device AI. If your workload fits one of these categories, it's worth benchmarking against alternatives.