
""The thinking around AGI and superintelligence is not just optimistic, but fundamentally flawed," the Allen Institute research scientist and Carnegie Mellon University assistant professor writes in a recent blog post. Dettmers defines AGI as an intelligence that can do all things humans can do, including economically meaningful physical tasks. The problem, he explains, is that most of the discussion around AGI is philosophical. But, at the end of the day, it has to run on something.""
""We have maybe one, maybe two more years of scaling left [before] further improvements become physically infeasible," he wrote. "GPUs maxed out in performance per cost around 2018 - after that, we added one off features that exhaust quickly," he explained. Most of the performance gains we've seen over the past seven years have come from things like lower precision data types and tensor cores - BF16 in Nvidia's Ampere, FP8 in Hopper, and FP4 in Blackwell.""
Current processors lack the compute density required to reach artificial general intelligence, and GPU scaling appears to be approaching physical limits within a few years. Recent GPU performance improvements have relied mainly on lower-precision data types and specialized tensor cores (BF16, FP8, FP4), which delivered large throughput gains but are now largely exhausted. Raw generational increases in computational power are much smaller than marketed. AI infrastructure growth is not keeping pace with the exponentially larger resources needed to obtain linear model improvements. Without new hardware breakthroughs or radically different architectures, further scaling of current approaches will become infeasible.
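A hedged sketch can make the "exponential resources for linear improvements" point concrete. It assumes a power-law relation between loss and training compute in the spirit of published scaling laws; the constants a and alpha below are illustrative placeholders, not fitted values from any paper:

```python
# Hedged sketch: assumes a power law L(C) = a * C**(-alpha), where L is loss
# and C is training compute. The constants a and alpha are illustrative
# assumptions chosen to show the shape of the curve, not fitted values.

a, alpha = 10.0, 0.05

def compute_for_loss(L: float) -> float:
    """Invert L = a * C**(-alpha) to get the compute needed to reach loss L."""
    return (a / L) ** (1.0 / alpha)

# Equal-sized steps down in loss demand multiplicatively larger compute budgets.
losses = [4.0, 3.5, 3.0, 2.5]
for hi, lo in zip(losses[:-1], losses[1:]):
    ratio = compute_for_loss(lo) / compute_for_loss(hi)
    print(f"loss {hi} -> {lo}: ~{ratio:.0f}x more compute")
```

Under these assumed constants, each identical 0.5 reduction in loss costs roughly 14x, then 22x, then 38x more compute than the step before it. That escalating multiple is the mismatch Dettmers argues infrastructure build-out cannot close.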
Read the full article at The Register.