Tag: moe
All the articles with the tag "moe".
- Mistral's Mixtral-8x22B Is Free, Open Source, and Beats Llama 3.1 - Download Now
  • 1 min read. Mistral just open-sourced Mixtral-8x22B under Apache 2.0 - 22B params, runs on a single RTX 4090, and crushes proprietary models at 1/10th t
- Moonshot AI Just Dropped the World's Most Advanced Open-Source LLM - And It's Built for Agents
  • 1 min read. This new open-source beast from Moonshot crushes reasoning benchmarks while sipping hardware - time to ditch your bloated closed models?
- NVIDIA Drops Nemotron 3 Nano: 1M Context MoE That Flies on Your Rig
  • 1 min read. Open weights, 4x faster inference, million-token context - NVIDIA's tiny beast is built for agentic workflows you can run locally.