Eagle 7B is an open-source large language model trained on 1.1 trillion tokens of text spanning more than 100 languages, and it outperforms other 7B-parameter LLMs on multilingual benchmarks.
The RWKV architecture behind Eagle 7B combines the parallelizable training of Transformers with the constant-memory, recurrent inference of RNNs, giving it an energy-efficient design and no hard maximum input context length.
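To see why recurrent inference removes the hard context cap, here is a minimal NumPy sketch of an RWKV-style time-mixing recurrence. It is a simplification, not the actual RWKV kernel: the per-token bonus term and the numerical-stability tricks of the real implementation are omitted, and the function name `wkv_recurrence` is invented for illustration. The point it demonstrates is that the state is a fixed-size pair of vectors, so memory per token stays constant however long the input grows.

```python
import numpy as np

def wkv_recurrence(keys, values, w):
    """Simplified RWKV-style WKV recurrence (a sketch, not the full
    RWKV formulation). The running state (a, b) has a fixed size
    regardless of sequence length, which is what lets an RNN-mode
    model process inputs with no hard context-length cap."""
    a = np.zeros_like(values[0])   # decayed weighted sum of past values
    b = np.zeros(keys.shape[1])    # decayed sum of past key weights
    decay = np.exp(-w)             # per-channel exponential decay
    outputs = []
    for k, v in zip(keys, values):
        a = decay * a + np.exp(k) * v
        b = decay * b + np.exp(k)
        outputs.append(a / (b + 1e-8))  # normalized attention-like readout
    return np.stack(outputs)

# Toy usage: 1000 tokens, 8 channels -- the state stays shape (8,) throughout.
T, C = 1000, 8
rng = np.random.default_rng(0)
out = wkv_recurrence(rng.normal(size=(T, C)), rng.normal(size=(T, C)),
                     w=np.ones(C) * 0.5)
print(out.shape)  # (1000, 8)
```

During training, the same computation can be unrolled in parallel across the sequence, which is the Transformer-like half of the trade-off the architecture is built around.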