DeepSeek has announced a breakthrough in energy efficiency, claiming its AI model uses only one-tenth the computing power of Meta's Llama 3.1. If accurate, the claim points to a major shift in the environmental footprint of AI, arriving just as tech companies race to build expansive AI data centers. These centers can consume as much energy as small cities, generating pollution and heightening climate concerns. It remains to be seen, however, whether DeepSeek's efficiency gains will influence broader industry practices or whether existing plans for new data centers will undercut them.
"There's a choice in the matter."
"It just shows that AI doesn't have to be an energy hog," says Madalsa Singh, a postdoctoral research fellow at the University of California, Santa Barbara who studies energy systems.
DeepSeek claims to use roughly one-tenth the computing power of Meta's Llama 3.1 model, upending an entire worldview of how much energy and resources it will take to develop artificial intelligence.
Much will depend on how other major players respond to the Chinese startup's breakthroughs, especially considering plans to build new data centers.