AI21 Labs · Released March 28, 2024

Jamba 1.5 Mini (SSM)

Open Source52B (12B active) parameters

Jamba 1.5 Mini (SSM) is AI21 Labs' entry in a crowded field of efficient open-weight models. Context window: 256K tokens.

Context

256K

Input

$0.20

Key Specifications

🏆

Arena Rank

Not disclosed

📐

Context Window

256K

📥

Input Price

per 1M tokens

$0.20

📤

Output Price

per 1M tokens

$0.40

🧠

Parameters

52B (12B active)

🔓

Open Source

Yes

Best For

Efficient long-context processing, high throughput

About Jamba 1.5 Mini (SSM)

Jamba 1.5 Mini SSM, developed by AI21 Labs, is a variant of the Jamba 1.5 Mini model with 52 billion total parameters (12 billion active) and a 256K token context window. The model emphasizes the state-space model components of AI21 Labs' hybrid architecture, optimizing for throughput on long-context workloads. It processes lengthy documents, transcripts, and data files efficiently with linear-time complexity rather than the quadratic scaling of standard Transformer attention. It is priced at $0.20 per million input tokens and $0.40 per million output tokens. As an open-source model, it supports self-hosted deployment for organizations requiring maximum control over their inference infrastructure. The SSM-focused design makes it particularly efficient for batch processing of long documents where throughput optimization provides measurable cost savings.
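The linear-vs-quadratic contrast above can be made concrete with a toy operation count. This is an illustrative sketch, not a benchmark: the two functions below are hypothetical simplifications (pure big-O proxies) rather than a model of the actual Jamba kernels.

```python
# Toy scaling comparison: standard attention grows with n^2 pairwise
# token interactions, while an SSM-style scan grows linearly with n.
def attention_ops(n: int) -> int:
    """Proxy for attention cost: O(n^2) token-pair interactions."""
    return n * n

def ssm_ops(n: int) -> int:
    """Proxy for SSM cost: O(n) sequential state updates."""
    return n

# At the full 256K context, the gap is four to five orders of magnitude.
for n in (4_000, 32_000, 256_000):
    ratio = attention_ops(n) / ssm_ops(n)
    print(f"n={n:>7,}: attention/SSM op ratio = {ratio:,.0f}x")
```

The ratio equals the sequence length itself, which is why the efficiency advantage of the SSM path widens precisely on the long-context workloads this model targets.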

Built by AI21 Labs

Pricing per 1M tokens

Input Tokens

$0.20

Output Tokens

$0.40
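A back-of-envelope cost estimate follows directly from the listed rates ($0.20 per 1M input tokens, $0.40 per 1M output tokens). A minimal sketch; the function name and the example token counts are illustrative, not part of any official SDK:

```python
# Listed Jamba 1.5 Mini (SSM) rates, in USD per 1M tokens.
INPUT_PRICE_PER_M = 0.20
OUTPUT_PRICE_PER_M = 0.40

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """USD cost of a single request at the listed per-token rates."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# Example: a maxed-out 256K-token input with a 2K-token completion.
print(f"${request_cost(256_000, 2_000):.4f}")  # → $0.0520
```

Even a request that fills the entire 256K context costs about a nickel, which is where the "batch processing of long documents" positioning comes from.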

Frequently Asked Questions

What is Jamba 1.5 Mini (SSM)?
Jamba 1.5 Mini SSM is an AI21 Labs variant of the Jamba 1.5 Mini model with 52 billion total parameters (12 billion active) and a 256K token context window. It emphasizes the state-space model components of AI21 Labs' hybrid architecture, trading the quadratic scaling of standard Transformer attention for linear-time processing of long inputs, and it is released as open source for self-hosted deployment.
How much does Jamba 1.5 Mini (SSM) cost?
AI21 Labs charges $0.20 per 1M input tokens for Jamba 1.5 Mini (SSM), with output at $0.40 per 1M tokens. That is competitive with other models in its tier.
What is Jamba 1.5 Mini (SSM)'s context window?
Jamba 1.5 Mini (SSM) supports up to 256K tokens per request. A larger context window allows the model to reason over longer inputs, which matters for document analysis, code review, and multi-turn conversations.
Is Jamba 1.5 Mini (SSM) open source?
Yes — AI21 Labs released Jamba 1.5 Mini (SSM) as open source. That means you're free to deploy it however you want: cloud, on-prem, edge. No API lock-in.
What is Jamba 1.5 Mini (SSM) best for?
AI21 Labs positions Jamba 1.5 Mini (SSM) for efficient long-context processing and high-throughput workloads. Real-world performance will depend on your specific prompts and data, but these are the intended strengths.