AI21 Labs · Released August 22, 2024

Jamba 1.5 Large

Open source · 398B (94B active) parameters

Jamba 1.5 Large is AI21 Labs' flagship hybrid SSM-Transformer model, built for long-context work. Context window: 256K tokens.

Context

256K

Input

$2.00

Key Specifications

🏆

Arena Rank

Not disclosed

📐

Context Window

256K

📥

Input Price

per 1M tokens

$2.00

📤

Output Price

per 1M tokens

$8.00

🧠

Parameters

398B (94B active)

🔓

Open Source

Yes

Best For

Long documents, enterprise RAG, analysis

About Jamba 1.5 Large

Jamba 1.5 Large, developed by AI21 Labs, is a hybrid model combining the Mamba state-space architecture with traditional Transformer layers, featuring 398 billion total parameters (94 billion active) and a 256K token context window. The novel SSM-Transformer design enables efficient processing of very long sequences while maintaining strong performance on reasoning and generation tasks. The architecture offers better throughput than pure Transformer models at equivalent quality, reducing inference costs for long-context workloads. It is priced at $2.00 per million input tokens and $8.00 per million output tokens. As an open-source model, it can be self-hosted for enterprise deployments. Jamba 1.5 Large demonstrates that architectural diversity beyond the dominant Transformer paradigm can yield practical advantages, particularly for applications requiring processing of lengthy legal, scientific, or financial documents.

Built byAI21 Labs

Pricing per 1M tokens

Input Tokens

$2.00

Output Tokens

$8.00

Frequently Asked Questions

What is Jamba 1.5 Large?
Jamba 1.5 Large is AI21 Labs' hybrid SSM-Transformer model: Mamba state-space layers combined with traditional Transformer layers, with 398 billion total parameters (94 billion active) and a 256K token context window. The hybrid design delivers better long-context throughput than pure Transformer models at equivalent quality, and as an open-source model it can be self-hosted for enterprise deployments.
How much does Jamba 1.5 Large cost?
Jamba 1.5 Large costs $2.00 per 1M input tokens and $8.00 per 1M output tokens. You pay only for what you use, which keeps costs predictable.
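As a sketch of how this per-token pricing works out in practice, the helper below estimates the cost of a single request at Jamba 1.5 Large's listed rates (the function name and example token counts are illustrative):

```python
# Listed pricing for Jamba 1.5 Large:
# $2.00 per 1M input tokens, $8.00 per 1M output tokens.
INPUT_PRICE_PER_M = 2.00
OUTPUT_PRICE_PER_M = 8.00

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated request cost in USD."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# Example: a 200K-token document summarized into a 2K-token answer.
print(f"${estimate_cost(200_000, 2_000):.3f}")  # → $0.416
```

Note the 4x input/output price asymmetry: long-document summarization (large input, small output) stays cheap, while generation-heavy workloads are dominated by the output rate.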
What is Jamba 1.5 Large's context window?
Jamba 1.5 Large has a context window of 256K tokens. This determines how much text the model can process in a single request — bigger windows mean longer documents and richer conversation history.
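To get a feel for what fits in 256K tokens, a common rule of thumb is roughly 4 characters of English text per token. The check below uses that heuristic (not a real tokenizer, so treat the result as an estimate) to decide whether a document plus a reserved output budget fits in the window:

```python
CONTEXT_WINDOW = 256_000   # Jamba 1.5 Large's context window, in tokens
CHARS_PER_TOKEN = 4        # rough heuristic for English prose

def fits_in_context(text: str, reserve_for_output: int = 4_000) -> bool:
    """Heuristically check whether `text` plus an output budget fits the window."""
    approx_tokens = len(text) / CHARS_PER_TOKEN
    return approx_tokens + reserve_for_output <= CONTEXT_WINDOW

# ~1M characters ≈ 250K tokens: still fits with a 4K-token output reserve.
print(fits_in_context("x" * 1_000_000))   # → True
# ~1.1M characters ≈ 275K tokens: over budget.
print(fits_in_context("x" * 1_100_000))   # → False
```

For production use you would swap the character heuristic for the model's actual tokenizer, since token counts vary by language and content type.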
Is Jamba 1.5 Large open source?
Yes, Jamba 1.5 Large is open source. The model weights are publicly available, so developers can download, fine-tune, and self-host it. Open-source models give teams more control over data privacy and deployment.
What is Jamba 1.5 Large best for?
Jamba 1.5 Large is best suited for long documents, enterprise RAG, and analysis. These use cases play directly to its strengths: a 256K context window, efficient long-sequence throughput from the hybrid architecture, and self-hostable open weights.