Pretrained experts

A pretrained generative Sparse Mixture of Experts model by Mistral AI. It incorporates 8 experts (feed-forward networks) for a total of 47B parameters. Base model (not fine-tuned for instruction following) - see [Mix ...

Mixtral 8x7B (base)
MistralAI
32K context · $0.54/M input tokens · $0.54/M output tokens
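The listed rates can be turned into a per-request cost estimate. The sketch below is illustrative, not part of any provider SDK: the `request_cost` helper and the token counts are assumptions; only the $0.54/M rates and the 32K context size come from the listing.

```python
# Hedged sketch: estimate the USD cost of one request to Mixtral 8x7B (base)
# from the listed rates. Token counts in the example are hypothetical.

INPUT_RATE_PER_M = 0.54   # USD per 1M input tokens (from the listing)
OUTPUT_RATE_PER_M = 0.54  # USD per 1M output tokens (from the listing)

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated cost in USD for a single request."""
    return (input_tokens * INPUT_RATE_PER_M
            + output_tokens * OUTPUT_RATE_PER_M) / 1_000_000

# Example: a prompt near the 32K context limit plus a 1K-token completion.
print(round(request_cost(32_000, 1_000), 5))
```

At these symmetric rates, a request that fills the full 32K context costs under two cents, so input length dominates cost for long-prompt workloads.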