Lightmatter's $400M round has AI hyperscalers hyped for photonic datacenters | TechCrunch
Briefly

The interconnect layer is what turns racks of CPUs and GPUs into a single machine, so faster interconnects translate directly into faster datacenter performance. Lightmatter’s optical interconnect layer helps GPUs stay tightly synchronized, which is essential for efficient AI model training.
Lightmatter CEO Nick Harris emphasizes the limits of traditional networking for scaling performance: “Once you leave the rack, you go from high-density interconnect to basically a cup on a string,” highlighting the challenges of conventional infrastructure.
Harris notes that while NVLink is a state-of-the-art platform for interconnecting GPUs, the demands of modern computing outpace its reach: scaling beyond it requires multiple layers of switches, which impose substantial latency.
He explains further: “For a million GPUs, you need multiple layers of switches, and that adds a huge latency burden.” This is the gap in datacenter networking that Lightmatter aims to fill.
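Harris's point can be made concrete with a rough back-of-the-envelope sketch. Assuming a conventional fat-tree (Clos) topology, the number of switch tiers needed grows with the number of endpoints, and every extra tier adds switch hops to the worst-case path. The switch radix (64) and per-hop latency (~500 ns) below are illustrative assumptions, not figures from the article or from Lightmatter:

```python
# Illustrative model of switch-tier scaling in a fat-tree network.
# Radix and per-hop latency are assumed values for illustration only.

def fat_tree_levels(num_gpus: int, radix: int) -> int:
    """Smallest tier count L such that an L-level fat-tree of
    radix-`radix` switches can attach `num_gpus` endpoints
    (an L-level fat-tree supports 2 * (radix/2)**L hosts)."""
    levels = 1
    while 2 * (radix // 2) ** levels < num_gpus:
        levels += 1
    return levels

def worst_case_latency_ns(num_gpus: int, radix: int, per_hop_ns: int):
    """Worst-case one-way path crosses 2L - 1 switches:
    up through L tiers and back down through L - 1."""
    levels = fat_tree_levels(num_gpus, radix)
    hops = 2 * levels - 1
    return hops * per_hop_ns, levels, hops

# With assumed radix-64 switches and ~500 ns per switch hop,
# a million GPUs needs 4 tiers and up to 7 switch hops one way:
latency, levels, hops = worst_case_latency_ns(1_000_000, 64, 500)
print(levels, hops, latency)  # 4 7 3500
```

Under these assumptions, every cross-datacenter message pays several microseconds in switch traversal alone, which is the "huge latency burden" Harris describes relative to the nanosecond-scale links inside a rack.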
Read at TechCrunch