#inference-hardware

Artificial intelligence
from Techzine Global
11 hours ago

OpenAI seeks faster alternatives to Nvidia chips

OpenAI is seeking alternative inference chips with larger on-chip SRAM to improve response speed for coding and AI-to-AI communication, targeting roughly 10% of its future inference capacity.