Liquid: LFM 40B MoE (free)

  • 8K context window
  • $0 input tokens
  • $0 output tokens
Note: this model is currently marked unavailable.

Liquid’s 40.3B-parameter Mixture of Experts (MoE) model. Liquid Foundation Models (LFMs) are large neural networks built with computational units rooted in dynamical systems.

LFMs are general-purpose AI models that can be used to model any kind of sequential data, including video, audio, text, time series, and signals.

See the launch announcement for benchmarks and more info.

These are free, rate-limited endpoints for LFM 40B MoE. Outputs may be cached. See the rate limit documentation for details.
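As a minimal sketch of calling such an endpoint, the snippet below builds an OpenAI-style chat completion request and, if an API key is present in the environment, sends it. The model slug `liquid/lfm-40b:free` and the `OPENROUTER_API_KEY` variable name are assumptions; check the provider's model list and auth docs for the exact values.

```python
import json
import os
import urllib.request

# Assumed model slug for the free LFM 40B MoE endpoint -- verify before use.
MODEL = "liquid/lfm-40b:free"
API_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(prompt: str, model: str = MODEL) -> dict:
    """Build an OpenAI-compatible chat completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_request("Summarize what a Mixture of Experts model is.")
print(json.dumps(payload, indent=2))

# Only send the request when an API key is configured (name is an assumption).
api_key = os.environ.get("OPENROUTER_API_KEY")
if api_key:
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the endpoint is rate-limited, production callers would typically add retry logic with backoff on HTTP 429 responses.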
