AI chip startup Groq has secured $640 million in Series D funding to expand its inference cloud services, shifting its business from hardware sales to an AI infrastructure-as-a-service model.
Groq's chips promise faster token generation at lower energy cost than GPU-based systems, a result the company attributes to its Language Processing Unit (LPU), which does not require high-bandwidth memory.
By connecting hundreds of LPUs, each with on-die SRAM, over fiber optics, Groq claims significantly faster token generation than GPU systems while consuming far less power.