Llama 4
Llama 4 is Meta AI’s newest open-weight model series.
It introduces Mixture-of-Experts (MoE) routing for efficient inference, accepts both text and images natively, and stretches context windows to record-breaking lengths: 10 M tokens in the Scout variant. Meta positions Llama 4 as a research-friendly, production-ready alternative to proprietary frontier models, while keeping the code and weights downloadable from its GitHub repos and the official llama.com portal.
| General | Details |
|---|---|
| Release date | 5 Apr 2025 |
| Developer | Meta AI |
| Type | Open-weight multimodal LLM |
| License | Llama 4 Community License |
| GitHub | meta-llama/llama-models |
Core Features
- Mixture-of-Experts architecture – Each query activates a subset of specialised “experts,” yielding higher throughput per FLOP while scaling to trillions of total parameters (TechCrunch); see the routing sketch after this list.
- Native multimodality – Models ingest both text and images without external adapters (The Verge).
- Extended context windows – Scout handles up to 10 M tokens; Maverick supports 1 M tokens (llm-stats).
- Multilingual training – Optimised across 200+ languages for global deployments (Data Scientist Guide).
- Fine-tunable & agent-ready – Models ship with recipes for supervised fine-tuning, LoRA, and RAG inside the Llama Cookbook.
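To make the MoE routing idea concrete, here is a minimal sketch of top-k expert routing in PyTorch. The layer sizes, expert count, and `top_k` value are illustrative assumptions, not Meta's actual Llama 4 configuration.

```python
# Minimal sketch of Mixture-of-Experts routing (illustrative only, not Meta's code).
# Each token is routed to a small subset of experts, so only a fraction of the
# total parameter count is active per forward pass.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, d_model=512, d_ff=2048, num_experts=16, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, num_experts)   # scores every expert for each token
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x):                               # x: (tokens, d_model)
        scores = self.router(x)                         # (tokens, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)  # keep only the top_k experts per token
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for slot in range(self.top_k):                  # only the selected experts run
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

tokens = torch.randn(4, 512)                            # 4 token embeddings
print(MoELayer()(tokens).shape)                         # torch.Size([4, 512])
```

Because each token only touches `top_k` experts, compute per forward pass tracks the active parameter count rather than the total, which is how Scout and Maverick keep 17 B active parameters despite much larger totals.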
Model Variants
| Variant | Active Params | Experts | Total Params | Context Window | Best for |
|---|---|---|---|---|---|
| Scout | 17 B | 16 | 109 B | 10 M tokens | Long-context RAG, document analysis (llm-stats) |
| Maverick | 17 B | 128 | 400 B | 1 M tokens | Coding & reasoning tasks, general chat (Oracle Docs) |
| Behemoth | 288 B | 16 | ~2 T | TBA | High-end STEM, under training (not yet released) |
Tools & Resources
- Weights & License – Download from the official portal.
- Inference & training code – meta-llama/llama-models (a quick inference sketch follows this list).
- Quick-start recipes – Llama Cookbook.
- Stats & benchmarks – Interactive dashboards at llm-stats.com.
- Cloud endpoints – Ready-to-use deployments in Oracle OCI Generative AI.
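As a starting point, the snippet below sketches text-only inference through the Hugging Face transformers library. The repository ID and chat-style invocation are assumptions; check the official model card and accept the Llama 4 Community License before downloading weights.

```python
# Sketch: running Llama 4 Scout locally via Hugging Face transformers.
# The model ID and chat format below are assumptions; confirm them on the
# meta-llama model card and make sure your transformers version supports Llama 4.
from transformers import pipeline

chat = pipeline(
    "text-generation",
    model="meta-llama/Llama-4-Scout-17B-16E-Instruct",  # assumed repository name
    device_map="auto",
    torch_dtype="auto",
)

messages = [{"role": "user", "content": "Summarise the Llama 4 release in two sentences."}]
out = chat(messages, max_new_tokens=128)
print(out[0]["generated_text"][-1]["content"])           # assistant reply
```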
Ecosystem & Integrations
- Meta AI assistant now runs Llama 4 across WhatsApp, Messenger, Instagram, and web chat (The Verge).
- OCI Generative AI offers managed Scout & Maverick endpoints for enterprise workloads (Oracle Docs).
- Community hosting – Providers such as DeepInfra, Groq, and Together price Llama 4 as low as $0.08 / 1 M input tokens (llm-stats); see the endpoint example after this list.
- Research & open-source – Thousands of fine-tuned checkpoints already live on Hugging Face; Meta’s LlamaCon (29 Apr 2025) spotlights academic collaborations (TechCrunch).
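Most of these hosts expose OpenAI-compatible chat-completion endpoints, so trying Llama 4 is often a one-line change. The base URL and model slug below are placeholders; substitute your provider's actual values.

```python
# Sketch: calling Llama 4 on a community host through an OpenAI-compatible API.
# The base_url and model slug are placeholders, not endorsements of a specific provider.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.together.xyz/v1",                  # example provider endpoint
    api_key="YOUR_PROVIDER_API_KEY",
)

response = client.chat.completions.create(
    model="meta-llama/Llama-4-Maverick-17B-128E-Instruct",   # assumed model slug
    messages=[{"role": "user", "content": "Write a haiku about open weights."}],
    max_tokens=64,
)
print(response.choices[0].message.content)
```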
Llama 4 pushes open-weight LLMs into frontier-model territory, combining trillion-scale total capacity with a community license that permits broad commercial use. Start experimenting by cloning the GitHub repo, reading the Llama Cookbook, or provisioning a managed endpoint on Oracle OCI.
Meta Llama 4 AI technology Hackathon projects
Discover innovative solutions crafted with Meta Llama 4 AI technology, developed by our community members during our engaging hackathons.





