The future could belong to small language models (SLMs) tailored for specific applications and demanding less energy. "These massive large models are not what the technology was built for... Look at using small, task-specific models," said Seth Dobrin.
AI's heavy energy consumption threatens its potential, with some predictions that generative AI could consume a quarter of US electricity by 2030. Data centers consume not only energy but also water for cooling.
Using tools like ChatGPT or Claude can carry a significant water cost: Dobrin notes that every 25 to 50 prompts may consume half a liter of water through evaporation. New cooling paradigms are necessary.