
"There's a theory in complexity theory that whenever you find a complex system working in nature, it's usually the output of a very simple system or thing that was iterated over and over. We're seeing this lately in AI research-you're just taking very simple algorithms and dumping more and more data into them. They keep getting smarter. What doesn't work as well is the reverse."
"When you design a very complex system and then you try to make a functioning large system out of that, it just falls apart. There's too much complexity in it. So a lot of product design is iterating on your own designs until you find the simple thing that works. And often you've added stuff around it that you don't need, and then you have to go back and extract the simplicity back out of the noise."
Complex, functioning systems in nature often arise from simple systems iterated repeatedly. In engineering and product design, starting with many adjustable parts tends to produce complexity that must later be stripped away as the effective design emerges; simplifying by questioning requirements and eliminating unnecessary components improves scalability and reliability. Examples include the successive Raptor rocket engine iterations and the contrast in user-facing complexity between macOS and iOS. In AI, simple algorithms fed vast data sets have yielded substantial capability gains, whereas designing highly complex systems from the outset typically fails because the accumulated complexity makes them fragile.
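To make the "simple rules, iterated" idea concrete, here is a minimal sketch (not from the original text) of Rule 110, an elementary cellular automaton: a trivial three-cell update rule that, applied over and over, produces famously complex, even Turing-complete, behavior. The grid size, iteration count, and names here are illustrative assumptions.

```python
# Rule 110: each cell's next state depends only on itself and its two
# neighbors, via an 8-bit lookup table. Iterating this trivial rule
# from a single live cell yields intricate, non-repeating structure.
RULE = 110  # the 8-bit table encoding the update rule

def step(cells: list[int]) -> list[int]:
    """Apply the rule once; each cell looks only at its 3-cell neighborhood."""
    n = len(cells)
    return [
        # Pack (left, center, right) into a 3-bit index, then look it up in RULE.
        (RULE >> (cells[(i - 1) % n] << 2 | cells[i] << 1 | cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Start from a single live cell and iterate the same simple rule.
cells = [0] * 63 + [1] + [0] * 63
for _ in range(40):
    print("".join("█" if c else " " for c in cells))
    cells = step(cells)
```

Each printed row is one iteration; the complexity in the output comes entirely from repetition of the one-line rule, not from the rule itself.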