AI21: Jamba Instruct

  • 250K Context
  • $0.5/M Input Tokens
  • $0.7/M Output Tokens
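
To get a rough sense of what these rates mean in practice, the sketch below estimates the cost of a single request at the listed prices ($0.5 per million input tokens, $0.7 per million output tokens). The token counts are hypothetical placeholders, not measured values.

```python
# Rough per-request cost estimate at the listed Jamba Instruct rates.
INPUT_PRICE_PER_M = 0.5   # USD per 1M input tokens
OUTPUT_PRICE_PER_M = 0.7  # USD per 1M output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost for one request."""
    return (input_tokens / 1_000_000) * INPUT_PRICE_PER_M + \
           (output_tokens / 1_000_000) * OUTPUT_PRICE_PER_M

# Example: a long document that nearly fills the context window,
# with a short generated summary (illustrative numbers only).
print(f"${estimate_cost(200_000, 2_000):.4f}")  # ≈ $0.1014
```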

The Jamba-Instruct model, introduced by AI21 Labs, is an instruction-tuned variant of their hybrid SSM-Transformer Jamba model, specifically optimized for enterprise applications.

  • 256K Context Window: It can process extensive input, roughly equivalent to a 400-page novel, which is beneficial for working with long financial reports or legal documents.
  • Safety and Accuracy: Jamba-Instruct is designed with enhanced safety features to ensure secure deployment in enterprise environments, reducing the risk and cost of implementation.

Read their announcement to learn more.

Jamba has a knowledge cutoff of February 2024.
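
As a minimal usage sketch, the snippet below sends a chat request to an OpenAI-compatible chat-completions endpoint. The base URL, model identifier (`ai21/jamba-instruct`), and API-key environment variable are assumptions and may differ depending on where and how the model is served.

```python
import os
import requests

# Minimal sketch of a chat request to Jamba Instruct through an
# OpenAI-compatible chat-completions endpoint. The base URL, model ID,
# and API-key variable below are assumptions, not confirmed by this page.
API_KEY = os.environ["OPENROUTER_API_KEY"]  # hypothetical key variable
BASE_URL = "https://openrouter.ai/api/v1"   # assumed endpoint

response = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "ai21/jamba-instruct",  # assumed model slug
        "messages": [
            {"role": "user", "content": "Summarize the attached quarterly report."}
        ],
        "max_tokens": 512,
    },
    timeout=60,
)
# Response shape assumed to follow the OpenAI chat-completions format.
print(response.json()["choices"][0]["message"]["content"])
```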

Related Posts

AI21: Jamba 1.5 Large
AI21
250K context · $2/M input tokens · $8/M output tokens

Jamba 1.5 Large is part of AI21's new family of open models, offering superior speed, efficiency, and quality. It features a 256K effective context window, the longest among open models, enabling im ...

AI21: Jamba 1.5 Mini
AI21
250K context · $0.2/M input tokens · $0.4/M output tokens

Jamba 1.5 Mini is the world's first production-grade Mamba-based model, combining SSM and Transformer architectures for a 256K context window and high efficiency. It works with 9 languages and can h ...