How DeepSeek's efficient AI could stall the nuclear renaissance | TechCrunch
Briefly

The launch of DeepSeek's R1 model has raised eyebrows: it rivals major AI models from Google and OpenAI while using significantly less hardware. The revelation that DeepSeek trained the model on only 2,048 Nvidia H800 GPUs, a fraction of the resources rumored to be consumed by competitors, has prompted a reevaluation of AI hardware needs. This gain in efficiency could dampen demand for energy and reshape investments in nuclear and natural gas capacity, as companies race to secure adequate energy supply amid soaring predictions of AI data center consumption.
Nvidia's share price fell 16% as uncertainty looms over the AI hardware market, and DeepSeek's efficiency challenges long-held assumptions about how much compute frontier models require.
AI is predicted to consume as much as 12% of all electricity in the U.S., leading tech companies to invest billions in data center capacity.
Read at TechCrunch