AI21: Jamba 1.5 Mini

  • 250K context
  • $0.2/M input tokens
  • $0.4/M output tokens

Jamba 1.5 Mini is the world’s first production-grade Mamba-based model, combining SSM and Transformer architectures for a 256K context window and high efficiency.

It supports nine languages and handles a range of writing and analysis tasks as well as or better than comparably sized small models.

Compared with earlier Transformer-only designs, it has a lower memory footprint and processes long inputs faster.

Read their announcement to learn more.
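As a rough illustration of the per-token pricing listed above, the cost of a single request can be estimated from the per-million-token rates. This is a minimal sketch: the helper name and default rates are illustrative (taken from the Jamba 1.5 Mini listing), not part of any official API.

```python
def estimate_cost_usd(input_tokens: int, output_tokens: int,
                      input_rate: float = 0.2, output_rate: float = 0.4) -> float:
    """Estimate request cost in USD from per-million-token rates.

    Defaults match the listed Jamba 1.5 Mini pricing:
    $0.2/M input tokens, $0.4/M output tokens.
    """
    return (input_tokens * input_rate + output_tokens * output_rate) / 1_000_000

# e.g. a 200K-token prompt with a 1K-token reply on Jamba 1.5 Mini:
cost = estimate_cost_usd(200_000, 1_000)  # → 0.0404 USD
```

Swapping in the rates shown for Jamba 1.5 Large or Jamba Instruct below gives the corresponding estimates for those models.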

Related Posts

Jamba 1.5 Large is part of AI21's new family of open models, offering superior speed, efficiency, and quality. It features a 256K effective context window, the longest among open models, enabling im ...

AI21: Jamba 1.5 Large
AI21
250K context $2/M input tokens $8/M output tokens

The Jamba-Instruct model, introduced by AI21 Labs, is an instruction-tuned variant of their hybrid SSM-Transformer Jamba model, specifically optimized for enterprise applications. 256K Context Win...

AI21: Jamba Instruct
AI21
250K context $0.5/M input tokens $0.7/M output tokens