by Meta
Llama 4 Maverick is Meta's 400B-parameter Mixture of Experts (MoE) model with 128 experts. Its weights are openly released, and it offers strong multilingual, long-context, and instruction-following capabilities.
| Use Case | Quality |
|---|---|
| General chat | Very Good |
| Code generation | Very Good |
| Multilingual | Very Good |
| Cost-sensitive applications | Excellent |
| Fine-tuning for specific domains | Excellent |
| Privacy-sensitive (local) | Excellent |
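Since the table rates local, privacy-sensitive deployment highly, here is a minimal sketch of preparing a request for a locally hosted Llama 4 Maverick behind an OpenAI-compatible endpoint (as served by stacks such as vLLM). The model identifier and the temperature value are assumptions; substitute whatever your serving stack reports.

```python
import json

def build_chat_request(prompt: str,
                       model: str = "meta-llama/Llama-4-Maverick") -> dict:
    """Build an OpenAI-compatible chat-completions payload.

    The model id above is an assumption; use the id your local
    server (e.g. vLLM) actually exposes.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,  # assumed default; tune per use case
    }

payload = build_chat_request("Summarize Mixture of Experts routing in one sentence.")
print(json.dumps(payload, indent=2))
```

The payload would then be POSTed to your local server's `/v1/chat/completions` route; keeping request construction separate from transport makes it easy to swap serving backends.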