AI21 Labs' new AI model can handle more context than most | TechCrunch
Briefly

Ori Goshen, CEO of AI21 Labs, argues that models with large context windows do not have to be compute-intensive, pointing to the release of Jamba as evidence.
Jamba is built on a combination of transformers and state space models (SSMs), which lets it handle large context windows while performing tasks efficiently.
Read at TechCrunch