Liquid


Liquid: LFM 40B MoE
Liquid
32K context · $1/M input tokens · $2/M output tokens

Liquid's 40.3B Mixture of Experts (MoE) model. Liquid Foundation Models (LFMs) are large neural networks built with computational units rooted in dynamic systems. LFMs are general-purpose AI models ...
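
Since the listing above describes a hosted model with a 32K context window and per-token pricing, a minimal sketch of querying it through an OpenAI-compatible chat-completions endpoint may be useful. The model slug "liquid/lfm-40b" and the OPENROUTER_API_KEY environment variable are assumptions for illustration, not confirmed by the listing itself.

```python
# Minimal sketch: send one chat request to an OpenAI-compatible endpoint.
# The model slug and API key variable name below are assumptions.
import os
import requests

API_URL = "https://openrouter.ai/api/v1/chat/completions"
API_KEY = os.environ["OPENROUTER_API_KEY"]  # assumed to hold a valid key

payload = {
    "model": "liquid/lfm-40b",  # assumed slug for Liquid: LFM 40B MoE
    "messages": [
        {"role": "user", "content": "Summarize what a Mixture of Experts model is."}
    ],
    "max_tokens": 256,
}

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```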

Liquid: LFM 40B MoE (free)
Liquid
8K context · $0/M input tokens · $0/M output tokens
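
For the paid variant, the listed rates ($1 per million input tokens, $2 per million output tokens) make per-request cost easy to estimate; the free variant above prices both at $0. A small back-of-the-envelope helper, with illustrative token counts:

```python
# Cost estimate based on the listed pricing of the paid LFM 40B MoE variant:
# $1 per million input tokens, $2 per million output tokens.
def request_cost_usd(input_tokens: int, output_tokens: int,
                     input_price_per_m: float = 1.0,
                     output_price_per_m: float = 2.0) -> float:
    return (input_tokens / 1_000_000) * input_price_per_m + \
           (output_tokens / 1_000_000) * output_price_per_m

# Example: a 20K-token prompt with a 1K-token completion.
print(f"${request_cost_usd(20_000, 1_000):.4f}")  # $0.0220
```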