DeepSeek released DeepSeek V3.1, a 685-billion-parameter open-source AI model made available on Hugging Face with minimal publicity. Early benchmark results reportedly indicate performance comparable to proprietary offerings from OpenAI and Anthropic. The launch could broaden access to frontier AI capabilities and prompt enterprise IT leaders to reconsider procurement strategies that historically favored US vendors. The release coincided with OpenAI publishing open-weight models, a move reportedly influenced by rising competition from Chinese open-source models. Sam Altman warned that the US may be underestimating the pace and scale of China's AI advances and that export restrictions alone are unlikely to ensure long-term protection. An analyst noted the commoditization of raw AI capability and the need for differentiation through trust, governance, and enterprise ecosystems.
Chinese startup DeepSeek has released its largest AI model to date, a 685-billion-parameter model that industry observers say could intensify competition with US players. The model, called DeepSeek V3.1, was made available on the open-source platform Hugging Face this week with little publicity. Despite the quiet rollout, early benchmark results reportedly suggest the model performs on par with proprietary offerings from OpenAI and Anthropic.
The release coincided with OpenAI publishing open-weight models of its own. OpenAI Chief Executive Sam Altman told CNBC that rising competition from Chinese open-source models, including those from DeepSeek, influenced that move. Altman also cautioned that the US may be underestimating the pace and scale of China's advances in AI, adding that export restrictions by themselves are unlikely to provide a lasting safeguard. DeepSeek models have long attracted attention from developers and global enterprises for their large parameter counts and wide context windows.