Mistral AI Mixtral

Mixtral is a family of sparse Mixture-of-Experts (MoE) large language models developed by the French company Mistral AI. Each MoE layer routes every token to a small subset of expert feed-forward blocks, so only a fraction of the total parameters is active during inference; Mixtral 8x7B, for example, uses roughly 13B of its 47B parameters per token. This efficiency allows it to match or exceed the performance of much larger dense models.
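
To make the "fraction of parameters per token" idea concrete, here is a minimal PyTorch sketch of top-2 expert routing, the general mechanism sparse MoE layers use. All names and sizes here (SparseMoELayer, dim=512, 8 experts) are illustrative assumptions, not Mistral AI's actual implementation.

```python
# A minimal sketch of sparse top-2 expert routing, the mechanism behind
# MoE layers like Mixtral's. Sizes and names are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    def __init__(self, dim: int = 512, n_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Each expert is an independent feed-forward block.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(n_experts)
        )
        # The router scores every expert for every token.
        self.router = nn.Linear(dim, n_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, dim). Pick the top-k experts per token.
        logits = self.router(x)                           # (tokens, n_experts)
        weights, indices = torch.topk(logits, self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)              # renormalize over the chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[:, slot] == e
                if mask.any():
                    # Only the selected experts' parameters touch each token.
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

if __name__ == "__main__":
    layer = SparseMoELayer()
    tokens = torch.randn(16, 512)
    print(layer(tokens).shape)  # torch.Size([16, 512])
```

Because only the top-k experts run per token, compute cost scales with the active parameters rather than the total parameter count, which is the efficiency the description above refers to.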

Brand Authority Index (BAI): 76/100

Archetype: Challenger

Category: Artificial Intelligence

Part of: Mistral AI

https://optimly.ai/brand/mistral-ai-mixtral

Last analyzed: April 11, 2026

Verified from the Mistral AI website

Founded: 2023

Headquarters: Paris, France
