AI models use less energy on multiplication diet
Briefly

Large language models can run roughly 50 times more energy-efficiently by removing matrix multiplication from the model and pairing it with custom FPGA hardware, which could significantly reduce AI's environmental impact.
AI's energy demand is substantial: data center power consumption is projected to nearly double by 2026, a trend that has prompted renewed interest in nuclear power.
The findings, from engineers at UC Santa Cruz, demonstrate significant energy savings on custom FPGA hardware, and the researchers anticipate further gains from additional optimization.
A billion-parameter large language model running on the custom FPGA hardware consumes only 13 watts, a substantial efficiency advance over conventional models on GPUs.
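The core idea behind removing matrix multiplication is constraining weights to the ternary values {-1, 0, +1}, so each multiply-accumulate collapses into a sign-gated addition or subtraction. The sketch below is a minimal illustration of that substitution, not the researchers' actual implementation; the function name and shapes are assumptions for demonstration.

```python
import numpy as np

def ternary_matvec(W, x):
    """Matrix-vector product with ternary weights in {-1, 0, +1}.

    Each output element is accumulated with additions and
    subtractions only -- no multiplications are performed.
    (Illustrative sketch; not the paper's implementation.)
    """
    out = np.zeros(W.shape[0])
    for i in range(W.shape[0]):
        acc = 0.0
        for j, w in enumerate(W[i]):
            if w == 1:
                acc += x[j]   # add instead of multiply by +1
            elif w == -1:
                acc -= x[j]   # subtract instead of multiply by -1
            # w == 0: skip entirely
        out[i] = acc
    return out

# Produces the same result as a conventional matmul W @ x
W = np.array([[1, 0, -1],
              [-1, 1, 1]])
x = np.array([0.5, 2.0, -1.0])
print(ternary_matvec(W, x))
```

Because the accumulation needs only adders and sign flips, it maps naturally onto simple FPGA logic, which is where the reported wattage savings come from.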
Read at The Register