Aleph Alpha has introduced a groundbreaking LLM architecture that eliminates tokenizers, allowing text to be processed as whole words or single bytes. This innovation addresses inefficiencies in training multilingual models, particularly for underrepresented languages. Backing from AMD and other industry players strengthens its European character. Founder Jonas Andrulis envisions sovereign AI models applicable across cultures and industries, highlighting the significance of AI that transcends language limitations. The method, known as Hierarchical Autoregressive Transformers (HAT), aims to create efficient, sustainable AI solutions.
Founder and CEO of Aleph Alpha, Jonas Andrulis, emphasizes, "I founded Aleph Alpha with the mission to empower the sovereignty of countries and industries through advanced AI models."
The new architecture, Hierarchical Autoregressive Transformers, enables processing at the level of whole words or single bytes, enhancing efficiency in multilingual LLM applications.
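To illustrate the tokenizer-free idea, here is a minimal conceptual sketch (not Aleph Alpha's actual implementation): text is split into word-level groups whose members are raw UTF-8 bytes, so no learned vocabulary is required and any language or script can be represented. The function name `to_word_byte_chunks` is hypothetical, chosen for illustration.

```python
def to_word_byte_chunks(text: str) -> list[list[int]]:
    # Split on whitespace to form word-level groups, then represent
    # each word as its raw UTF-8 bytes -- no tokenizer vocabulary needed.
    return [list(word.encode("utf-8")) for word in text.split()]

# Any script maps directly to bytes, so there are no out-of-vocabulary
# words -- one reason byte-level input helps underrepresented languages.
print(to_word_byte_chunks("Hallo Welt"))
```

In a hierarchical model along these lines, a byte-level module would compress each per-word byte group into a single embedding before a backbone transformer operates at the word level, which is where the claimed efficiency gain comes from.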