Small language models are growing in popularity - but they have a "hidden fallacy" that enterprises must come to terms with
Smaller language models like GPT-4o mini are more cost-efficient but won't solve all problems in production.
Podcast: Small Language Models with Luca Antiga
Explore small language models (SLMs) and their significance in the AI industry through an interview with Luca Antiga, CTO of Lightning AI, on ODSC's Ai X Podcast.
Microsoft unveils a new small language model
Microsoft introduces a family of cost-effective small language models, starting with Phi-3-mini, a 3.8-billion-parameter model trained on 3.3 trillion tokens.
Stability AI Releases 1.6 Billion Parameter Language Model Stable LM 2
Stability AI has released pre-trained weights for Stable LM 2, a 1.6B-parameter language model trained on 2 trillion tokens of text in seven languages.
The model is available in two versions: the base model and an instruction-tuned version called Stable LM 2 Zephyr.
6 Small Language Models to Get the Job Done With Ease
Small language models are scaled-down AI models designed to run with less computational power and training data, making AI tools more accessible to smaller enterprises and individual developers.
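That accessibility is concrete: many of these models fit on a single consumer GPU or even a CPU. As a minimal sketch, a model such as Phi-3-mini can be run locally with Hugging Face's transformers library; the model ID, prompt, and generation settings below are illustrative rather than prescriptive, and any similarly sized SLM works the same way.

```python
# Minimal sketch: running a small language model locally with the
# Hugging Face transformers library. The model ID and settings are
# illustrative assumptions, not an endorsement of one specific SLM.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3-mini-4k-instruct"  # any small causal LM works here
tokenizer = AutoTokenizer.from_pretrained(model_id)
# torch_dtype="auto" lets transformers pick the checkpoint's native precision,
# which keeps the memory footprint small enough for a single consumer GPU.
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

prompt = "Summarize why small language models matter for enterprises:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=80)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```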