AI21: Jamba 1.5 Large

  • 250K Context
  • $2/M Input Tokens
  • $8/M Output Tokens
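The listed rates are per million tokens, billed separately for input and output. As a rough illustration of how a request's cost works out under those rates, here is a small sketch; the helper name and the example token counts are made up for illustration:

```python
def estimate_cost_usd(input_tokens: int, output_tokens: int,
                      input_rate: float = 2.0, output_rate: float = 8.0) -> float:
    """Estimate request cost in USD.

    Rates are USD per 1M tokens, defaulting to the listed
    $2/M input and $8/M output for Jamba 1.5 Large.
    """
    return (input_tokens * input_rate + output_tokens * output_rate) / 1_000_000

# e.g. a long-context request: 100K input tokens, 10K output tokens
print(estimate_cost_usd(100_000, 10_000))  # 0.28
```

Output tokens cost 4x as much as input tokens here, so for summarization-style workloads (large input, small output) the input side usually dominates the bill.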
Model Unavailable

Jamba 1.5 Large is part of AI21's new family of open models, offering strong speed, efficiency, and output quality.

It features a 256K effective context window, the longest among open models, enabling improved performance on tasks like document summarization and analysis.

Built on a novel SSM-Transformer architecture, it outperforms larger models like Llama 3.1 70B on benchmarks while maintaining resource efficiency.

Read their announcement to learn more.

Related Posts

Jamba 1.5 Mini is the world's first production-grade Mamba-based model, combining SSM and Transformer architectures for a 256K context window and high efficiency. It works with 9 langu ...

AI21: Jamba 1.5 Mini
AI21
  • 250K Context
  • $0.2/M Input Tokens
  • $0.4/M Output Tokens