WizardLM-2 8x22B

  • 64K Context
  • $0.5/M Input Tokens
  • $0.5/M Output Tokens
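Given the listed per-million-token rates, the cost of a single request is straightforward to estimate. A minimal sketch (the function name and example token counts are illustrative, not part of the listing):

```python
# Rates from the listing: $0.5 per 1M input tokens, $0.5 per 1M output tokens.
INPUT_PRICE_PER_M = 0.5
OUTPUT_PRICE_PER_M = 0.5

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of one request at the listed rates."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# e.g. a 10,000-token prompt with a 2,000-token completion:
cost = request_cost(10_000, 2_000)  # 0.006 USD
```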
Model Unavailable

WizardLM-2 8x22B is Microsoft AI’s most advanced Wizard model. It demonstrates highly competitive performance against leading proprietary models and consistently outperforms existing state-of-the-art open-source models.

It is an instruction-following fine-tune of Mixtral 8x22B.
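As an instruction-following model, it is typically driven through a chat-style request. A minimal sketch of building such a payload for an OpenAI-compatible chat-completions API; the model slug `microsoft/wizardlm-2-8x22b` and the parameter choices are assumptions, not confirmed by this listing:

```python
# Sketch: a chat-completion request body for an OpenAI-compatible API.
# The model slug below is an assumption; check the provider's catalog.
payload = {
    "model": "microsoft/wizardlm-2-8x22b",
    "messages": [
        {"role": "user",
         "content": "Explain mixture-of-experts routing in one sentence."},
    ],
    "max_tokens": 256,   # keep the completion well inside the 64K context
}
```

The same body works with any client that speaks the chat-completions wire format; only the base URL and API key differ per provider.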


#moe

Related Posts

Phi-3 128K Medium is a powerful 14-billion parameter model designed for advanced language understanding, reasoning, and instruction following. Optimized through supervised fine-tuning a ...

Phi-3 Medium 128K Instruct
Microsoft
125K context $1/M input tokens $1/M output tokens

Phi-3 Mini is a powerful 3.8B parameter model designed for advanced language understanding, reasoning, and instruction following. Optimized through supervised fine-tuning and preference ...

Phi-3 Mini 128K Instruct
Microsoft
125K context $0.1/M input tokens $0.1/M output tokens

Phi-3.5 models are lightweight, state-of-the-art open models. These models were trained with Phi-3 datasets that include both synthetic data and the filtered, publicly available website ...

Phi-3.5 Mini 128K Instruct
Microsoft
125K context $0.1/M input tokens $0.1/M output tokens

WizardLM-2 7B is the smaller variant of Microsoft AI's latest Wizard model. It is the fastest, and it achieves performance comparable to leading open-source models 10x its size. It ...

WizardLM-2 7B
Microsoft
31.25K context $0.055/M input tokens $0.055/M output tokens