Scalac - Software Development Company
Last month in AI - January 2026
On January 28th, a 30-person U.S. startup called Arcee AI released Trinity Large, a 400-billion-parameter sparse Mixture of Experts model that challenges Meta's Llama 4 Maverick and Chinese models such as Z.ai's GLM-4.5. Trained in just six months for $20 million on 2,048 NVIDIA Blackwell B300 GPUs, Trinity represents a significant step toward democratizing frontier-grade model development. What sets Trinity apart is its Apache 2.0 license: a truly open-source alternative to Meta's restrictive Llama license.