Jamba 1.5 Large
Jamba 1.5 Large is AI21 Labs' hybrid SSM-Transformer model built for long-context work. Context window: 256K tokens.
Context: 256K tokens
Input: $2.00 per 1M tokens
Key Specifications
Arena Rank: Not disclosed
Context Window: 256K tokens
Input Price: $2.00 per 1M tokens
Output Price: $8.00 per 1M tokens
Parameters: 398B total (94B active)
Open Source: Yes
Best For: Long-context document processing (legal, scientific, financial)
About Jamba 1.5 Large
Jamba 1.5 Large, developed by AI21 Labs, is a hybrid model combining the Mamba state-space architecture with traditional Transformer layers, featuring 398 billion total parameters (94 billion active) and a 256K token context window. The novel SSM-Transformer design enables efficient processing of very long sequences while maintaining strong performance on reasoning and generation tasks. The architecture offers better throughput than pure Transformer models at equivalent quality, reducing inference costs for long-context workloads. It is priced at $2.00 per million input tokens and $8.00 per million output tokens. As an open-source model, it can be self-hosted for enterprise deployments. Jamba 1.5 Large demonstrates that architectural diversity beyond the dominant Transformer paradigm can yield practical advantages, particularly for applications requiring processing of lengthy legal, scientific, or financial documents.
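Because the weights can be self-hosted, a common deployment pattern is to serve the model behind an OpenAI-compatible endpoint and query it with a standard client. The sketch below assumes a local vLLM-style server at http://localhost:8000/v1 and the ai21labs/AI21-Jamba-1.5-Large identifier; both are illustrative assumptions rather than documented values.

```python
# Minimal sketch, not an official example: querying a self-hosted
# Jamba 1.5 Large instance through an OpenAI-compatible endpoint
# (e.g., one served by vLLM). The base URL and model identifier are
# assumptions for illustration.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # assumed local inference server
    api_key="EMPTY",                      # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="ai21labs/AI21-Jamba-1.5-Large",  # assumed model id
    messages=[
        {"role": "system", "content": "You summarize long documents."},
        {"role": "user", "content": "Summarize the key obligations in this contract: ..."},
    ],
    max_tokens=1024,
    temperature=0.2,
)
print(response.choices[0].message.content)
```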
Pricing per 1M tokens
Input Tokens: $2.00
Output Tokens: $8.00
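To make these rates concrete, the short sketch below estimates per-request cost from the prices listed above; the token counts are illustrative.

```python
# Worked cost example using the listed rates: $2.00 per 1M input tokens
# and $8.00 per 1M output tokens. Token counts are illustrative.
INPUT_USD_PER_1M = 2.00
OUTPUT_USD_PER_1M = 8.00

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost of a single request at the listed rates."""
    return (input_tokens / 1_000_000) * INPUT_USD_PER_1M + \
           (output_tokens / 1_000_000) * OUTPUT_USD_PER_1M

# A long-context request: a 200K-token document plus a 2K-token summary.
print(f"${request_cost(200_000, 2_000):.3f}")  # 0.400 + 0.016 = $0.416
```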
Compare Jamba 1.5 Large
See how Jamba 1.5 Large stacks up against other leading AI models