'Awesome for the community': DeepSeek open sourced its code repositories, and experts think it could give competitors a scare
Briefly

Challenger AI startup DeepSeek has taken a significant step in the AI landscape by open-sourcing five of its code repositories. The move aims to improve model transparency and foster community collaboration. By sharing foundational code components, DeepSeek positions itself as a competitive alternative to established firms, offering advanced models at lower cost. Industry experts note that this contrasts sharply with the proprietary models of larger companies, and while DeepSeek's openness is notable, it does not extend to all of its resources. Nevertheless, the initiative is being celebrated as a positive development for the AI community.
"These humble building blocks in our online service have been documented, deployed, and battle-tested in production," DeepSeek wrote.
"To be clear, Llama has open weights, not open code - you can't see the training code or the actual training datasets. DeepSeek has gone a step further by open sourcing a lot of the code they use, which is awesome for the community," Alistair Pullen, co-founder and CEO of Cosine, told ITPro.
Read at ITPro