
"AI labs are racing to build data centers as large as Manhattan, each costing billions of dollars and consuming as much energy as a small city. The effort is driven by a deep belief in "scaling" - the idea that adding more computing power to existing AI training methods will eventually yield superintelligent systems capable of performing all kinds of tasks."
"Adaption Labs is building AI systems that can continuously adapt and learn from their real-world experiences, and do so extremely efficiently. She declined to share details about the methods behind this approach or whether the company relies on LLMs or another architecture. "There is a turning point now where it's very clear that the formula of just scaling these models - scaling-pilled approaches, which are attractive but extremely boring - hasn't produced intelligence that is able to navigate or interact with the world," said Hooker."
AI labs are building enormous, Manhattan-sized data centers that cost billions and consume energy comparable to small cities. The dominant industry belief in scaling — adding compute to existing training methods — aims to produce broadly capable, superintelligent systems, but many researchers warn of diminishing returns for large language models. Sara Hooker, a former Cohere VP and Google Brain alumna, co-founded Adaption Labs to pursue AI that continuously adapts and learns from real-world experience with high efficiency. The company has not disclosed technical details and is recruiting across engineering, operations, and design roles.
Read at TechCrunch