French startup Mistral has released a four‑model "frontier AI family" under the permissive Apache 2.0 open‑weight license, headlined by Mistral Large 3, a 675‑billion‑parameter sparse Mixture‑of‑Experts model. Trained on thousands of Nvidia H200 GPUs, the models are pitched as state‑of‑the‑art multimodal and multilingual systems that enterprises can run and fine‑tune locally, giving privacy‑sensitive European customers an alternative to Chinese lab DeepSeek and to closed‑weight US offerings from OpenAI and Google.
This article aggregates reporting from two news sources. The TL;DR is AI‑generated from the original reporting, and Race to AGI's analysis provides editorial context on the implications for AGI development.
