# BigScience BLOOM

> BLOOM is a 176-billion-parameter open-access multilingual large language model. Developed by the BigScience research collective and coordinated by Hugging Face, it was trained on the Jean Zay supercomputer to promote open science in AI research.

- URL: https://optimly.ai/brand/bigscience-bloom
- Slug: bigscience-bloom
- BAI Score: 82/100
- Archetype: Challenger
- Category: Artificial Intelligence
- Last Analyzed: April 10, 2026
- Part of: Hugging Face (https://optimly.ai/brand/hugging-face)

## Competitors

- EleutherAI GPT-Neo/NeoX (https://optimly.ai/brand/eleutherai-gpt-neojneox)
- Hugging Face Platform (https://optimly.ai/brand/hugging-face-platform)
- Meta Llama Series (https://optimly.ai/brand/meta-llama-series)
- Mistral AI Mixtral (https://optimly.ai/brand/mistral-ai-mixtral)

## Also Referenced By

- EleutherAI (https://optimly.ai/brand/eleutherai)