Stability AI Releases 1.6 Billion Parameter Language Model Stable LM 2
Briefly

By releasing one of the most powerful small language models to date and providing complete transparency on its training details, Stability AI aims to empower developers and model creators to experiment and iterate quickly.
OpenAI's research showed that language model capability scales with the number of model parameters, leading to the development of large language models (LLMs) with trillions of parameters. However, the cost and complexity of training and hosting these models have driven a trend toward "small language models."
Read at InfoQ