China Telecom's AI Research Institute has trained TeleChat2-115B, a 115-billion-parameter large language model, entirely on domestically produced computing power, demonstrating resilience against Western tech sanctions.
TeleChat2-115B was trained on a 10-trillion-token corpus of high-quality Chinese and English text, demonstrating significant progress in building AI systems with domestic resources.
The use of Huawei's Ascend Atlas 800T A2 training servers, built around Kunpeng processors, reflects China Telecom's strategic engineering of large-scale AI training under export restrictions on advanced Western chips.
Although TeleChat2's 115 billion parameters pale in comparison to frontier models exceeding 400 billion parameters, the effort showcases China's capacity to innovate within existing hardware constraints.