Arcee AI, a U.S. startup, has unveiled Trinity Mini and Trinity Nano Preview, the first models in its new ‘Trinity’ family of open-source Mixture-of-Experts (MoE) models. Released under the Apache 2.0 license, they mark a notable shift in an open-source AI domain that has been dominated by Chinese labs such as Alibaba and Baidu.
Trinity Mini, with 26 billion parameters, and Trinity Nano Preview, with 6 billion, showcase Arcee’s Attention-First MoE architecture, which prioritizes stability and training efficiency. Trinity Mini’s scores on benchmarks such as SimpleQA and BFCL V3 are competitive with those of larger models.
Both Trinity models are available for free download on Hugging Face, and developers can modify and fine-tune them to suit their own requirements. Arcee’s strategic focus on model sovereignty and end-to-end training reflects a commitment to reshaping the U.S. open-source AI landscape and challenging the dominance of Chinese models.
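For developers who want to try the models, the usual Hugging Face workflow should apply. The snippet below is a minimal sketch only: the repository ID arcee-ai/Trinity-Mini and the generation settings are assumptions, not details confirmed by the article, so check the official model cards for the exact names and recommended usage.

```python
# Minimal sketch of loading and prompting a Trinity model via the
# Hugging Face transformers library.
# NOTE: the repo ID below is an assumption; confirm it on the model card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "arcee-ai/Trinity-Mini"  # hypothetical repository ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",      # use the dtype stored in the checkpoint
    device_map="auto",       # spread MoE weights across available devices
)

messages = [{"role": "user", "content": "Explain what a Mixture-of-Experts model is."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Because the weights are Apache 2.0 licensed, the same checkpoint can also serve as a base for fine-tuning with standard open-source tooling.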
With Trinity Large, a 420 billion parameter model slated for launch in January 2026, Arcee aims to further establish itself as a key player in frontier-scale open-source AI.
Source: VentureBeat