
Mistral Large 3 is a powerful open-weight LLM with 41B active parameters, supporting text and image inputs and low-precision NVFP4 inference.
The newly announced Mistral Large 3 is an open-weight large language model with 41 billion active parameters out of 675 billion total, accepting multimodal inputs spanning text, images, and multilingual content. It ranks #2 among open-weight non-reasoning models on the LMArena leaderboard, a strong showing in head-to-head conversational evaluation. The model also supports low-precision NVFP4 inference, a 4-bit floating-point format, and can be deployed on NVIDIA data-center GPUs such as the A100, H100, and the Blackwell family, which adds native hardware support for NVFP4. That efficiency matters as the field moves toward ever-larger multimodal models that must remain affordable to deploy and broadly accessible.

Beyond raw scale, the release is accompanied by a 14-billion-parameter reasoning variant that scored 85% on the AIME 2025 challenge, indicating competence in complex multi-step problem solving. Such capabilities are key for applications that require deep understanding and reasoning. The open-weight release continues the recent trend of publishing competitive LLMs with transparent weights, fostering research and innovation across the academic and developer communities, and contrasts with the closed commercial models favored by the largest vendors, highlighting a growing ecosystem of powerful, accessible AI tools.
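To make the NVFP4 idea concrete, here is a minimal fake-quantization sketch in NumPy. It maps a small block of weights onto the eight representable E2M1 (FP4) magnitudes with one shared per-block scale. This is an illustration of the general blocked-FP4 scheme only, not NVIDIA's implementation: real NVFP4 uses 16-element blocks, an FP8 (E4M3) scale factor per block, and hardware decode in Blackwell tensor cores.

```python
import numpy as np

# Representable magnitudes of the E2M1 (FP4) number format.
FP4_GRID = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0])

def quantize_fp4_block(block: np.ndarray) -> tuple[np.ndarray, float]:
    """Fake-quantize one weight block to FP4 codes plus a shared scale.

    Illustrative sketch: the scale maps the block's largest magnitude
    onto FP4's maximum value (6.0), then each entry snaps to the
    nearest representable signed FP4 value.
    """
    scale = np.abs(block).max() / FP4_GRID[-1]
    if scale == 0.0:
        return np.zeros_like(block), 1.0
    scaled = block / scale
    # For each element, pick the nearest signed FP4 grid point.
    candidates = np.sign(scaled)[:, None] * FP4_GRID
    idx = np.abs(scaled[:, None] - candidates).argmin(axis=1)
    codes = np.take_along_axis(candidates, idx[:, None], axis=1).ravel()
    return codes, scale

def dequantize(codes: np.ndarray, scale: float) -> np.ndarray:
    return codes * scale

weights = np.array([0.12, -0.03, 0.5, -0.49, 0.02, 0.31])
codes, scale = quantize_fp4_block(weights)
approx = dequantize(codes, scale)
```

The per-block scale is what keeps 4-bit storage usable: each stored code costs 4 bits, and the shared scale amortizes the dynamic range across the block, which is why large MoE checkpoints shrink so dramatically under this kind of scheme.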
Source: Radical Data Science