Learning the Bitter Lesson
Briefly

The essay argues that leveraging computation is the most important factor in AI progress. In domains like Chess and computer vision, the structure researchers impose on models limits their ability to exploit growing computation. Historical approaches, such as hand-crafted features in vision, show how these restrictions hindered progress; deep networks that learn features directly from data proved superior as compute and data scaled. The practical takeaway is to add only the structure needed for the compute and data available today, then remove it later, because such shortcuts bottleneck further improvement. In this way the Bitter Lesson informs both AI research and engineering practice.
"The Bitter Lesson has been learned over and over across many domains of AI research, such as Chess, Go, speech, vision."
"Leveraging computation turns out to be the most important thing and the 'structure' we impose on models often limits their ability to leverage growing computation."
"As computation and data scaled, deep networks that learned features directly from pixels outperformed hand-crafted methods."
"Add structures needed for the given level of compute and data available. Remove them later, because these shortcuts will bottleneck further improvement."
Read at Rlancemartin