Meta's Llama models get 350 million downloads
Briefly

The recent 20 million downloads can be attributed in part to the Llama 3.1 update, which introduced a 405-billion-parameter model and other variants that performed strongly on benchmark tests.
Hosted Llama usage by token volume across Meta's major cloud service provider partners more than doubled from May through July 2024 following the Llama 3.1 release, indicating significant adoption.
Meta is actively expanding its network of partners for the Llama models, collaborating with top cloud services like AWS, Azure, and Google Cloud to enhance distribution.
The 405-billion-parameter variant in particular is gaining notable traction, reflecting its performance advantages across various benchmarks and growing user engagement.
Read at Computerworld