Microsoft unveils a new small language model | MarTech
Microsoft introduces a family of cost-effective small language models, starting with Phi-3-Mini, a 3.8-billion-parameter model trained on 3.3 trillion tokens. [ more ]
Stability AI Releases 1.6 Billion Parameter Language Model Stable LM 2
Stability AI has released pre-trained weights for Stable LM 2, a 1.6-billion-parameter language model trained on 2 trillion tokens of text data spanning seven languages.
The model is available in two versions: the base model and an instruction-tuned version called Stable LM 2 Zephyr. [ more ]