AI21: Jamba 1.5 Mini
- 256K Context
- $0.20/M Input Tokens
- $0.40/M Output Tokens
- AI21
- Text-to-text
- 23 Aug, 2024
Model Unavailable
Jamba 1.5 Mini is the world’s first production-grade Mamba-based model, combining SSM and Transformer architectures for a 256K context window and high efficiency.
It supports nine languages and matches or outperforms comparably sized small models on a range of writing and analysis tasks.
Its hybrid architecture uses less memory and processes long inputs faster than pure Transformer designs.
Read their announcement to learn more.
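The per-token prices listed above make request costs easy to estimate. A minimal sketch (the helper function and example token counts are illustrative, not part of any official API):

```python
# Sketch: estimating request cost from the per-token prices listed above
# ($0.20 per million input tokens, $0.40 per million output tokens).

INPUT_PRICE_PER_M = 0.20   # USD per 1M input tokens
OUTPUT_PRICE_PER_M = 0.40  # USD per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of a single request."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# e.g. a 200K-token prompt (near the context limit) with a 1K-token reply:
print(f"${request_cost(200_000, 1_000):.4f}")  # → $0.0404
```

Even a near-context-limit prompt costs only a few cents, which is the practical upside of the low per-token pricing.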