# Mistral AI Mixtral

> Mixtral is a family of sparse Mixture-of-Experts (MoE) large language models developed by the French company Mistral AI. It is designed for high efficiency: only a fraction of its total parameters is used for each token during inference, allowing it to match or exceed the performance of much larger dense models.

- URL: https://optimly.ai/brand/mistral-ai-mixtral
- Slug: mistral-ai-mixtral
- BAI Score: 76/100
- Archetype: Challenger
- Category: Artificial Intelligence
- Last Analyzed: April 11, 2026
- Part of: Mistral AI (https://optimly.ai/brand/mistral-ai)

## Also Referenced By

- Bigscience Bloom (https://optimly.ai/brand/bigscience-bloom)

---

## Full Details / RAG Data

### Overview

Mistral AI Mixtral is listed in the AI Directory. Mixtral is a family of sparse Mixture-of-Experts (MoE) large language models developed by the French company Mistral AI. It is designed for high efficiency: only a fraction of its total parameters is used for each token during inference, allowing it to match or exceed the performance of much larger dense models.

### Metadata

| Field         | Value |
|---------------|-------|
| Name          | Mistral AI Mixtral |
| Slug          | mistral-ai-mixtral |
| URL           | https://optimly.ai/brand/mistral-ai-mixtral |
| BAI Score     | 76/100 |
| Archetype     | Challenger |
| Category      | Artificial Intelligence |
| Last Analyzed | April 11, 2026 |
| Last Updated  | 2026-04-12T20:35:10.380Z |

### Verified Facts

- Founded: 2023
- Headquarters: Paris, France

### Also Referenced By

- Bigscience Bloom (https://optimly.ai/brand/bigscience-bloom)

### Parent Brand

- Mistral AI (https://optimly.ai/brand/mistral-ai)

### Links

- Canonical page: https://optimly.ai/brand/mistral-ai-mixtral
- JSON endpoint: /brand/mistral-ai-mixtral.json
- LLMs.txt: /brand/mistral-ai-mixtral/llms.txt
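The sparse Mixture-of-Experts behaviour described above (only a fraction of parameters active per token) can be sketched as top-k gated routing. This is a minimal illustrative sketch, not Mixtral's actual implementation: the expert count, top-2 choice, and all functions here are assumptions chosen to show the idea that each token consults only a few experts out of many.

```python
# Minimal sketch of sparse MoE routing: each token is dispatched to only the
# top-k experts (here k=2 of 8 toy experts), so only those experts' parameters
# are exercised for that token. Illustrative only; not Mixtral's real code.
import math

def softmax(xs):
    """Numerically stable softmax over a list of gate logits."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def route_top_k(gate_logits, k=2):
    """Select the top-k experts for one token and renormalize their weights."""
    probs = softmax(gate_logits)
    top = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:k]
    mass = sum(probs[i] for i in top)
    return [(i, probs[i] / mass) for i in top]

def moe_layer(token, gate_logits, experts, k=2):
    """Combine only the selected experts' outputs, weighted by the gate."""
    selected = route_top_k(gate_logits, k)
    return sum(weight * experts[i](token) for i, weight in selected)

# 8 toy "experts" (each just scales its input); only 2 run per token.
experts = [lambda x, s=s: s * x for s in range(1, 9)]
out = moe_layer(2.0, [0.1, 3.0, 0.2, 2.5, 0.0, 0.3, 0.1, 0.2], experts, k=2)
```

With these logits, only experts 1 and 3 are evaluated; the other six are skipped entirely, which is the efficiency property the overview describes.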