Meta's new Llama models are open-source and come in two parameter sizes, 8B and 70B. Trained on clusters of 24,000 GPUs, they outperform rival models, and Meta says they will keep improving over time.
Meta's open-source approach contrasts with that of competitors such as OpenAI, fueling debate over the speed and safety of AI development, and concerns linger over how rapidly the technology is advancing.