Aya 23 35B
Aya 23 35B is Cohere's open-source multilingual entry in a crowded field, with an 8K-token context window.
Key Specifications
Arena Rank: Not disclosed
Context Window: 8K tokens
Input Price (per 1M tokens): Free (open)
Output Price (per 1M tokens): Free (open)
Parameters: 35B
Open Source: Yes
Best For: Multilingual and low-resource language applications
About Aya 23 35B
Aya 23 35B, developed by Cohere through the Cohere For AI research initiative, is an open-source multilingual model with 35 billion parameters and an 8K token context window supporting 23 languages. The model was developed with contributions from researchers worldwide, focusing on extending quality AI capabilities to lower-resource languages that mainstream models underserve. Aya 23 35B performs well on multilingual benchmarks, particularly for languages in Africa, South Asia, and Southeast Asia where few commercial alternatives exist. Free and open-source, it can be fine-tuned and deployed for language-specific applications without cost. The model represents Cohere's commitment to democratizing AI access globally, providing a foundation for researchers and developers working in languages outside the English-Chinese-European focus of most commercial models.
Pricing per 1M tokens
Input Tokens: Free (open)
Output Tokens: Free (open)