Flapping Airplanes and the promise of research-driven AI | TechCrunch
Briefly

"The scaling paradigm argues for dedicating a huge amount of society's resources, as much as the economy can muster, toward scaling up today's LLMs, in the hopes that this will lead to AGI. The research paradigm argues that we are 2-3 research breakthroughs away from an "AGI" intelligence, and as a result, we should dedicate resources to long-running research, especially projects that may take 5-10 years to come to fruition."
"A compute-first approach would prioritize cluster scale above all else, and would heavily favor short-term wins (on the order of 1-2 years) over long-term bets (on the order of 5-10 years). A research-first approach would spread bets temporally, and should be willing to make lots of bets that have a low absolute probability of working, but that collectively expand the search space for what is possible."
Flapping Airplanes launched with $180 million in seed funding from Google Ventures, Sequoia, and Index, along with a notable founding team. The lab's primary goal is to find less data-hungry ways to train large models; the project was assessed as Level Two on a commercial-intent scale. The effort marks a move away from the dominant compute-first scaling strategy toward a research-first approach that tolerates many long-term, low-probability bets, prioritizing long-running projects over short-term wins in order to expand the search space for advanced intelligence.
Read at TechCrunch