Hugging Face Introduces mmBERT, a Multilingual Encoder for 1,800+ Languages (InfoQ)
Hugging Face has released mmBERT, a new multilingual encoder trained on more than 3 trillion tokens across 1,833 languages. The model builds on the ModernBERT architecture and is the first encoder to significantly improve upon XLM-R, a long-standing baseline for multilingual understanding tasks. Rather than training on all languages at once, mmBERT follows a progressive training schedule: it starts with 60 high-resource languages, expands to 110, and finally includes all 1,833 languages.
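The progressive schedule can be pictured as a short phased configuration. The sketch below is purely illustrative: the phase boundaries (60 → 110 → 1,833 languages) come from the article, but the phase names, token budgets, language ordering, and uniform sampling are assumptions, not the published training recipe.

```python
# Illustrative sketch of a progressive language-inclusion schedule like the one
# described for mmBERT. Only the language counts (60 -> 110 -> 1,833) are from
# the article; everything else here is an assumption for demonstration.
from dataclasses import dataclass
import random

@dataclass
class Phase:
    name: str
    num_languages: int   # languages eligible for sampling in this phase
    token_budget: int    # hypothetical budget, not the published split

# Placeholder language list, assumed to be ordered from high- to low-resource.
ALL_LANGUAGES = [f"lang_{i:04d}" for i in range(1833)]

SCHEDULE = [
    Phase("early pretraining", num_languages=60,   token_budget=2_000_000_000_000),
    Phase("expansion",         num_languages=110,  token_budget=800_000_000_000),
    Phase("final phase",       num_languages=1833, token_budget=200_000_000_000),
]

def sample_language(phase: Phase) -> str:
    """Pick a language from those active in the phase (uniformly, for simplicity;
    a real recipe would likely weight by corpus size or use temperature sampling)."""
    active = ALL_LANGUAGES[:phase.num_languages]
    return random.choice(active)

for phase in SCHEDULE:
    example = sample_language(phase)
    print(f"{phase.name}: {phase.num_languages} languages active, e.g. {example}")
```

The point of such a schedule is that the model first builds strong representations from data-rich languages, then gradually exposes them to lower-resource languages once the shared encoder is already well trained.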