Ministral 3B
- 128K Context
- $0.04/M Input Tokens
- $0.04/M Output Tokens
- Mistral AI
- Text-to-text
- 17 Oct, 2024
Ministral 3B is a 3B parameter model optimized for on-device and edge computing. It excels in knowledge, commonsense reasoning, and function-calling, outperforming larger models like Mistral 7B on most benchmarks. Supporting up to 128K context length, it's ideal for orchestrating agentic workflows and specialist tasks with efficient inference.
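Because the model supports function-calling, the sketch below shows one way it might be invoked for a tool-use task through an OpenAI-compatible chat completions endpoint. This is a minimal illustration, not a confirmed integration: the base URL, the `OPENROUTER_API_KEY` environment variable, the model id `mistralai/ministral-3b`, and the `get_weather` tool are all assumptions made for the example.

```python
# Minimal sketch: asking Ministral 3B to call a tool via an OpenAI-compatible
# chat completions API. Endpoint, env var, model id, and tool are assumptions.
import json
import os

from openai import OpenAI  # pip install openai

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",      # assumed OpenAI-compatible endpoint
    api_key=os.environ["OPENROUTER_API_KEY"],     # hypothetical API key variable
)

# A single tool definition the model may choose to call.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Look up the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="mistralai/ministral-3b",  # assumed model id
    messages=[{"role": "user", "content": "What's the weather in Paris right now?"}],
    tools=tools,
)

# If the model decided to call the tool, print the name and arguments it produced.
message = response.choices[0].message
if message.tool_calls:
    call = message.tool_calls[0]
    print(call.function.name, json.loads(call.function.arguments))
else:
    print(message.content)
```

In an agentic workflow, the tool-call arguments returned here would be executed by the caller and the result appended as a `tool` message before requesting the model's final answer.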