Ori Goshen, CEO of AI21 Labs, argues that models with large context windows need not be compute-intensive, a claim the company backs with the release of Jamba.
Jamba's backbone combines Transformer layers with state space model (SSM) layers, a hybrid design that lets it handle large context windows efficiently.
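To illustrate the intuition behind such a hybrid stack, here is a minimal toy sketch in NumPy. It is not Jamba's actual architecture: the layer functions, the 3:1 interleaving pattern, and the `decay` parameter are all illustrative assumptions. The point it shows is structural: SSM-style layers process the sequence with a linear-time recurrence, while full self-attention costs quadratic time in sequence length, so interleaving mostly-SSM layers with occasional attention layers keeps long-context compute down.

```python
import numpy as np

def ssm_layer(x, decay=0.9):
    # Toy diagonal state-space scan: one pass over the sequence,
    # so cost grows linearly with sequence length.
    h = np.zeros_like(x[0])
    out = np.empty_like(x)
    for t, xt in enumerate(x):
        h = decay * h + xt  # simple linear recurrence (illustrative)
        out[t] = h
    return out

def attention_layer(x):
    # Full softmax self-attention: pairwise scores make the cost
    # quadratic in sequence length.
    scores = x @ x.T / np.sqrt(x.shape[1])
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ x

def hybrid_stack(x, pattern=("ssm", "ssm", "ssm", "attn")):
    # Interleave cheap SSM layers with occasional attention layers.
    # The 3:1 pattern here is an assumption for illustration only.
    for kind in pattern:
        x = ssm_layer(x) if kind == "ssm" else attention_layer(x)
    return x

x = np.random.default_rng(0).normal(size=(8, 4))  # (seq_len, dim)
y = hybrid_stack(x)
print(y.shape)  # (8, 4): sequence shape is preserved through the stack
```

Because only a fraction of the layers pay the quadratic attention cost, the overall compute scales closer to linearly with context length, which is the efficiency argument the hybrid design rests on.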