AI21: Jamba 1.5 Large
- 256K Context
- $2/M Input Tokens
- $8/M Output Tokens
- AI21
- Text to text
- 02 Dec, 2024
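
To make the listed pricing concrete, here is a minimal cost sketch in Python. The per-token prices come from the listing above; the token counts in the example are illustrative assumptions, not figures from the source.

```python
# Rough per-request cost estimate from the listed pricing:
# $2 per million input tokens, $8 per million output tokens.
INPUT_PRICE_PER_M = 2.00   # USD per 1M input tokens (from the listing)
OUTPUT_PRICE_PER_M = 8.00  # USD per 1M output tokens (from the listing)

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost of a single request."""
    return (input_tokens / 1_000_000) * INPUT_PRICE_PER_M + \
           (output_tokens / 1_000_000) * OUTPUT_PRICE_PER_M

# Illustrative example: a ~200K-token document condensed into a ~1K-token summary.
print(f"${request_cost(200_000, 1_000):.4f}")  # -> $0.4080
```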
Jamba 1.5 Large is part of AI21’s new family of open models, offering superior speed, efficiency, and quality.
It features a 256K effective context window, the longest among open models, enabling improved performance on tasks like document summarization and analysis.
Built on a novel SSM-Transformer architecture, it outperforms models such as Llama 3.1 70B on benchmarks while maintaining resource efficiency.
Read their announcement to learn more.
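
Below is a minimal sketch of what the long-document summarization use case might look like through an OpenAI-compatible chat completions client. The base URL, API key variable, model identifier, and file name are assumptions for illustration, not details from this listing; substitute whatever your provider documents.

```python
import os
from openai import OpenAI  # any OpenAI-compatible client works for this sketch

# Assumed values -- replace with the endpoint and model ID your provider documents.
client = OpenAI(
    base_url="https://api.example.com/v1",   # hypothetical OpenAI-compatible endpoint
    api_key=os.environ["EXAMPLE_API_KEY"],   # hypothetical credential variable
)

# A long document that fits within the model's 256K-token context window.
with open("report.txt", "r", encoding="utf-8") as f:
    document = f.read()

response = client.chat.completions.create(
    model="jamba-1.5-large",  # assumed identifier; check your provider's model catalog
    messages=[
        {"role": "system", "content": "You summarize long documents accurately."},
        {"role": "user", "content": f"Summarize the key findings:\n\n{document}"},
    ],
    max_tokens=1024,
)

print(response.choices[0].message.content)
```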