X Releases Back-End Code and Model Weights for Its 'Grok' LLM
Briefly

We are releasing the base model weights and network architecture of Grok-1, our large language model. Grok-1 is a 314 billion parameter Mixture-of-Experts model trained from scratch by xAI.
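For readers who want to try the release, below is a minimal sketch of fetching the open-sourced checkpoint. It assumes the weights are mirrored on Hugging Face under the repo id "xai-org/grok-1" and that the `huggingface_hub` package is installed; the repo id and mirror are assumptions, so check xAI's announcement for the official distribution channel.

```python
# Minimal sketch: downloading the released Grok-1 checkpoint.
# Assumes a Hugging Face mirror at "xai-org/grok-1" (an assumption;
# xAI's announcement lists the official source). The full checkpoint
# is very large, so ensure you have sufficient disk space.
from huggingface_hub import snapshot_download

checkpoint_dir = snapshot_download(
    repo_id="xai-org/grok-1",  # assumed mirror of the released weights
    local_dir="./grok-1",      # local destination for the checkpoint files
)
print(f"Checkpoint downloaded to {checkpoint_dir}")
```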
If an AI is programmed to push for diversity at all costs, as Google Gemini was, then it will do whatever it can to cause that outcome, potentially even killing people.
Read at Social Media Today