OpenAI is reportedly pivoting away from plans to build its own network of chip-manufacturing foundries and is instead focusing on designing in-house AI chips in partnership with Broadcom and TSMC.
The upcoming chip, co-developed with Broadcom, is aimed at running models (inference) rather than training them and could arrive by 2026, marking a significant shift in OpenAI's hardware strategy.
Facing chip shortages and high training costs, OpenAI is also adding AMD chips, accessed through Microsoft's Azure cloud platform, to reduce its heavy reliance on Nvidia.
The move to AMD chips for AI training underscores OpenAI's effort to diversify its hardware supply and improve the efficiency of model training.